/PROJECTS / AUTONOMOUS CARS

CONTENTS
Human Machine Interface
UX for Autonomous Mobility
Interaction Design
When I came to San Francisco, I remember the first time I was spooked by a driverless car. While the novelty never waned, the team took on the challenge of solving for the future of mobility¹.
We shared an affinity for solving gnarly UX problems, so we pulled back the curtain by evaluating the nature of the domain.

The context of the autonomous mobility space

Competitive analysis of industry players
We found that there are six levels of automation⁶, each with its own set of challenges. A major revelation was that the transition from autonomous to manual mode is one of the most notorious challenges⁷ in UX.

The six levels of automation
The fact is that hardware is not invincible⁸, and based on our trend mapping, there are multiple hurdles to clear before the industry achieves full autonomy. Human and AI collaboration is inevitable, since giving the passenger complete agency is key to ensuring trust.
This is an undertaking in perfectly executing situation-based scenarios. How might we help drivers have a confident and collaborative take-over moment?


Future forecasting and break-even point in collaboration
The environments⁹ that autonomous cars operate in are largely unpredictable. Since cars are not emotionally intelligent, scoping out potential mistakes put us in a position to narrow our focus.

Illustration of sensors and challenges
Following that, we created a user journey to solve for. It consists of ideal, alternate, and edge-case situations to contextualize interaction design¹⁰. Six situations were mapped based on interview insights and brainstorming on canvas.

One journey with six scenarios
We accounted for the intensity of the cognitive load¹¹ between interactions. This led us to brainstorm ideal ways in which situations might pan out. We tackled several mind-bending questions and put them on paper.

Brainstorming through sketches
Introducing haptics makes for a 25% reduction¹² in glance time. Users also find screen-only interactions undesirable as the sole means of navigation. Thus, we aimed for a good balance of physical buttons¹³ with consistent application of haptic technology.

Interaction patterns for car dashboard
Next, we put down four principles to aim for, based on our research synthesis. They are grounded in our problem statement, hierarchy of needs, and critical considerations involving the user.
The design specifications prioritized critical features to the left. We kept the adaptive layout size to match that of Apple CarPlay. To avoid decision fatigue¹⁴, we restricted the contents to the bare essentials.

Planning, mocking and mapping the dashboard layout
What emotional state is the driver undergoing? What are their primary actions when they intervene?
Provocations like these allowed us to zoom out and view the experience objectively. We created an "ASF Map" to center our process around the driver — the human behind the wheel.
"Action, Situation, Feeling" map for deep interaction design
Safety is a paramount consideration in multimodal communication. We decided on six different touch points, each with its own unique way of alerting the user. A semantic color system¹⁵ was applied, with a focus on the "Vision, Action, Feedback" model of interaction design.

Touch-points with the respective feedback
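As a thought experiment, the mapping can be sketched as data: each touch point pairs what the driver sees (vision) with what they can do (action) and how the system responds (feedback). The touch points, modalities, and semantic colors below are illustrative assumptions, not the full set of six.

```ts
// A hedged sketch of touch points mapped to feedback under the
// "Vision, Action, Feedback" model. All values are illustrative.
type Feedback = {
  vision: string;                                   // what the driver sees
  action: string;                                   // what the driver can do
  signal: "haptic" | "audio" | "voice" | "ambient"; // how the system responds
  color: "info" | "caution" | "danger";             // semantic color tier
};

const touchPoints: Record<string, Feedback> = {
  steeringWheel: { vision: "LED ring",      action: "press to switch modes",  signal: "haptic", color: "info" },
  dashboard:     { vision: "pop-up banner", action: "confirm or dismiss",     signal: "audio",  color: "caution" },
  seat:          { vision: "none",          action: "sit up, take the wheel", signal: "haptic", color: "danger" },
};
```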
We picked up the baton to unify physical and digital design, combining the journey map with the six touch points to orchestrate the complete experience. This became the "north star" reference guide for our detailed scenarios.

Scenario map with interaction between modalities
Sounds, lights, notifications, and haptics alert the driver of an upcoming autonomous zone. The driver decides to switch and presses the buttons on the wheel. This sets the car in autonomous mode.
Switching to autonomous driving
While in an autonomous driving zone, the car detects incoming traffic. There is a faster alternate route, but it requires switching to manual mode. The car notifies the driver via sounds and pop-up banners, and proposes the new route.
Traffic detection and route proposal
A potential collision is detected. The car tries to maneuver and evade it. Driving controls are locked and the driver is alerted. If no response is received, the car finds a safe space by itself and moves toward it.
Danger detection and evasion
Approaching a manual driving zone gets more intense with every passing second if the driver doesn't take control. To counter this, we introduced seat vibration to alert a driver in a relaxed state. In addition to lights and beeping sounds, stern voice instructions ensure the driver takes the wheel on time.
Switching to manual driving
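The escalation behind this scenario can be sketched as a simple function of the time remaining before the manual zone. The threshold values below are assumptions; only the ordering of modalities (lights and beeps first, then seat vibration, then voice) comes from our scenario.

```ts
// A minimal sketch of time-based alert escalation for the take-over moment.
// Thresholds are hypothetical; the modality ordering follows the scenario.
type Alert = "light" | "beep" | "seatVibration" | "voiceInstruction";

function escalateTakeOver(secondsToManualZone: number, wheelTaken: boolean): Alert[] {
  if (wheelTaken) return [];                        // driver is in control, stand down
  if (secondsToManualZone > 30) return ["light"];
  if (secondsToManualZone > 15) return ["light", "beep"];
  if (secondsToManualZone > 5)  return ["light", "beep", "seatVibration"];
  return ["light", "beep", "seatVibration", "voiceInstruction"]; // last resort
}
```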
Nothing is more frustrating than driving behind someone moving unreasonably slowly. In this case, the driver can ask the system to overtake the car in front of them. The system offers a choice to select the car and executes the maneuver by itself. Otherwise, the driver is given the option of manual control.
Overtaking a slow vehicle
When a sensor malfunctions in manual mode, the system alerts the driver, then disables the switch to autonomous mode. If the car is already in autonomous mode, things get serious: we trigger the alert system with multiple modalities, the same as in the "Danger Detection" scenario. This time, the driver has to bring the car to a safe space if the switch hasn't been made.
Sensor malfunction handling
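The branching here reduces to a small decision: block the switch in manual mode, escalate and fall back to a safe stop in autonomous mode. A sketch, with hypothetical step names:

```ts
// A sketch of the sensor-malfunction branch. Step names are hypothetical;
// the branching follows the scenario described above.
type DriveMode = "manual" | "autonomous";

function onSensorMalfunction(mode: DriveMode, driverTookOver: boolean): string[] {
  if (mode === "manual") {
    // Warn the driver and prevent handing control to a degraded system.
    return ["alertDriver", "disableAutonomousSwitch"];
  }
  // Already autonomous: escalate with every modality, as in Danger Detection.
  return driverTookOver
    ? ["triggerMultimodalAlerts", "handOverToDriver"]
    : ["triggerMultimodalAlerts", "pullOverToSafeSpace"];
}
```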
All good things end.