Google Geo Assistant (Assistant + Maps)

How might we create a safe and seamless digital assistant experience for drivers?

Above: Screenshot from a ride-along study inside a testing vehicle, where I evaluated the Google Assistant experience within Maps on public roads.


Driver distraction remains a leading cause of automobile accidents and fatalities in the United States (NHTSA, 2019). People double their crash risk when they perform visual-manual tasks on their phone while driving (AAA Foundation, 2018).

Yet those statistics don’t deter people from using their phones in the automobile.

Given this, could offering a voice & audio-forward experience save lives – and provide a seamless user experience on the road?

Cars are safer for occupants than ever before. There are also more distractions for drivers than ever before. As smartphone market penetration has gone from 35% in 2011 to 81% in 2019 (Pew Research Center, 2019), phone usage in the car is not going away.

So what could a voice-forward Google Assistant do to help drivers accomplish necessary tasks during active navigation (e.g., adding stops, sharing ETA) while keeping their eyes on the road?

Happy to address particular details about these studies in person. Confidential information will not be provided.

Research Approaches

(Each section below is its own separate study.)



Ride-along studies to understand natural routines and behaviors during morning commutes

How are digital assistants currently used in the vehicle? What are people's workarounds for the different kinds of tasks they perform while driving? And what are people's information needs about the morning commute before they even step inside the car?

I sat in the passenger seat and interviewed + observed drivers as they embarked on their morning commutes around the Bay Area.



Driving simulator studies to understand how voice interaction affects driving performance

How might particular designs of Assistant dialog or multi-turn conversation affect driver reaction times? Would adding voice actually remove cognitive load? Would adding voice and a conversational experience lead to any perceived improvements in experience?

I asked drivers to complete navigation tasks while performing NHTSA car-following tasks, and measured cognitive load using tools like the tactile detection response task (T-DRT), in line with NASA standard guidelines.
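For context on how a detection response task is typically scored: tactile stimuli fire at pseudo-random intervals, a driver response within roughly 0.1–2.5 seconds of stimulus onset counts as a hit, and hit rate plus mean reaction time index cognitive load. Below is a minimal illustrative sketch of that scoring convention; this is not the study's actual analysis code, and the window bounds and function names are my own assumptions.

```python
# Illustrative DRT scoring sketch (hypothetical; not the study's code).
# A response 0.1-2.5 s after stimulus onset counts as a hit, following
# common detection-response-task conventions.
from statistics import mean

HIT_WINDOW = (0.1, 2.5)  # seconds after stimulus onset

def score_drt(stimulus_times, response_times, window=HIT_WINDOW):
    """Return (hit_rate, mean_reaction_time_s) for one drive segment.

    stimulus_times: onset time of each tactile stimulus, in seconds
    response_times: times the driver pressed the response button, in seconds
    """
    lo, hi = window
    responses = sorted(response_times)
    reaction_times = []
    for t in sorted(stimulus_times):
        # The first response inside this stimulus's window counts as a hit.
        rt = next((r - t for r in responses if lo <= r - t <= hi), None)
        if rt is not None:
            reaction_times.append(rt)
    hit_rate = len(reaction_times) / len(stimulus_times)
    mean_rt = mean(reaction_times) if reaction_times else float("nan")
    return hit_rate, mean_rt
```

Higher cognitive load shows up in this scheme as a lower hit rate and a longer mean reaction time, which is what makes the DRT useful for comparing Assistant dialog designs against each other.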



Evaluative studies “in the wild” to further understand driver distraction and potential hiccups

After evaluating the experience inside a driving simulator, we brought the full context of driving back to the Assistant. How does the conversational experience perform under real-world conditions, where drivers must account for traffic, construction sites, pedestrians, and the background noise of everyday life?


Deliverables & Impact


Q1 2019: Public launch for the Geo Assistant within Maps!

 

All the research done above was in service of a safe and seamless Assistant experience for drivers – especially before that public release.

Stakeholders were present with us every step of the way: product managers, interaction & conversation designers, engineers. Some even sat in on our driving simulator and ride-along studies. Here are some of the things I did to ensure that this research made an impact on the actual product and wider organization:

  • Wrote email newsletters and shareouts to the wider internal organization (e.g., beyond Geo Assistant, to teams like Android Auto, Geo Driving, and other Assistant teams)

  • Created video highlight reels to bridge users and stakeholders. In these reels I focused on what worked well and what did not (focusing on the failures alone would not tell the whole story), along with the safety implications of each finding.

  • Created bug lists for product polish, flagging potentially distracting issues, especially before the public launch and before shipping products with car manufacturer partners

  • Conducted literature reviews to consolidate past and current research in this nebulous, emerging design space, and to show how this research fits into the larger research ecosystem

  • All of this was part of a larger effort to showcase the Google Assistant as a hero in driving use cases, as presented publicly at Google I/O 2019.

Reflections

This marks an interesting return to human factors research, an early love back in university.

Research and design for driving come with higher stakes. It's not a matter of waiting an extra five minutes for a checkout flow to load: people could crash and seriously hurt themselves or others. User frustrations and pain points here become an extra source of distraction and an instigator of crash risk. Ultimately, safety is the #1 priority for drivers, and everything else that follows from it is a bonus.

There’s a unique tension between creating something that’s safe (to account for the minority of drivers) and something that’s seamless (to please the majority of drivers).

Two evergreen challenges I often think about when it comes to driving related research:

  1. When does a prototype “graduate” from the driving simulator to testing on public roads?

  2. How do you negotiate the tension between designing for a safe experience (e.g., for the minority of drivers who are responsible for the majority of accidents) and designing for a seamless experience (e.g., interactions that would delight the majority of users)?

