Uber Augmented Reality Lens

Autumn Jacob
Jan 31, 2019

Note: This redesign was done for a University of Michigan Design Challenge

“Yo, guys, let’s go — our Uber is here!” My friends and I shuffle out of the bar, huddled together to brace against the arctic temperatures of a Michigan winter. Stepping outside, we frantically search for the same black Prius, forced to endure the hassle of running into the street to read the license plate of every approaching car…

“I found it, license number: J…W..6.”
“Hi, Uber for Autumn?”
When the driver says no, my thoughts spin in circles: “Should I call him? Text him? Shoot, I hope he doesn’t leave!”
I’m freezing… and overwhelmed.

Time and time again I face the same struggle of locating my Uber, imagining the day when I can find my ride within seconds… How can I design a solution to a problem that exists within our physical world? A solution that goes beyond the interface of Uber’s digital application to eliminate this inconvenience once and for all.

By layering digital elements onto our view of the world, we can deliver a safer, more effective, and more personal experience for Uber drivers and riders. I harnessed the power of augmented reality to create a feature that eliminates this integral pain point in the rider/driver ridesharing experience.

01. Understanding the Problem

The Challenge

How can I efficiently and safely locate my Uber in a sea of moving vehicles?

Analyzing Current Solutions

Icons: © 2019 Uber Technologies, Inc.

The bottom line: the current solutions for locating an Uber driver are not tailored to the problem at hand. They are general search methods that are inefficient and inconvenient for both the rider and the driver.

This Augmented Reality (AR) plug-in weaves the Uber map location directly into the real-world view of a rider such as Autumn, standing outside the bar, searching for her driver. The prototype below exemplifies how my solution uses AR to enable riders to scan their surroundings and have the Uber application locate and direct them to their driver more efficiently than any current methods available.

Final Prototype

The Challenge

My solution transforms the way individuals interact with and visualize data, creating an entirely new paradigm for how we define an interface. The exploration of data is now a part of our world, making our lives easier as human beings. Designing in three dimensions, I was challenged to reimagine the structure of the UI to adapt to new requirements and constraints. From brainstorming, researching, sketching, and wireframing, I was able to create a final AR solution that aligns with my goals. The integration detailed below exemplifies augmented reality in action and demonstrates how visual technology can tangibly simplify the Uber experience for both the driver and the rider.

02. Approach

User Analysis

Who will be using this feature?

  • Individuals with busy lifestyles
  • Individuals that live in crowded cities
  • Individuals with disabilities

Why will individuals be using this feature as opposed to Uber’s existing solutions?

  • Efficient, Safe, Engaging, Easy, and Personalized
    *Unlike Uber Beacon, this AR solution requires no hardware or installation process. The software is integrated directly into the application, enabling immediate adoption and widespread use.

When will individuals be using this feature?

  • Leaving from overpopulated pickup locations (e.g., airport, bar, shopping center, rideshare parking lot)
  • When users are in a hurry/running late
  • When users enter a foreign country
  • When users face disabilities (e.g., impaired vision, mobility, or speech)

Personas

Madi: A 25-year-old traveler visiting India for the first time, with no understanding of local language or customs. She cannot read the airport signs and feels overwhelmed in a new environment searching for her Uber driver.

John: A 35-year-old walking out of a crowded shopping center, recovering from a recent surgery that left him on crutches. He cannot venture into the street to verify whether it is his driver every time a black sedan pulls up at the busy rideshare pickup lot.

Goals

  • Efficiency — Streamlining the process of locating the vehicle both for the driver and the rider.
  • Saving time and money — A driver can now get more rides in during their working hours, and the rider can get to their destination more quickly and with higher satisfaction rates by reducing the friction of the vehicle location process for both parties.
  • Safety — Decreasing the number of people stepping into oncoming traffic to read license plates while searching for their driver.

Solution Explained

Using Uber’s system icons, I was able to create a custom icon for my Uber AR Lens feature. All methods connecting the rider to the driver are at the top of the application screen. I kept the location of this feature congruent with the current visual hierarchy. This highlights key call-to-action points, and avoids disrupting Uber’s current user flow.

Once the icon is tapped, the user enters a state of augmented reality. A screen prompts the user with the phrase “Scan to find your Uber.” While developing this concept, I turned to my natural process of searching and scanning my surroundings. I replicated this same eye motion and movement within AR to drive a fluid and instinctual flow, enabling the mobile device to do the locating for the user.

The user can safely stand on the sidewalk and scan the surrounding area with their mobile device to locate their Uber. The user initially sees an augmented reality pop-up that matches the application’s aesthetic and functionality. The AR lens detects free space within the user’s environment and leverages artificial intelligence (AI) to place the pop-up precisely in the user’s view, clearly directing the user to the vehicle.
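To give a sense of the underlying math, directing the user toward the vehicle can be reduced to two small calculations: the compass bearing from the rider’s GPS position to the driver’s, and the angle the AR arrow should point relative to where the camera is facing. The sketch below is a minimal illustration of that idea, not Uber’s implementation; the function names and inputs are assumptions.

```python
import math

def bearing_to_driver(rider_lat, rider_lon, driver_lat, driver_lon):
    """Initial great-circle bearing from rider to driver, in degrees
    clockwise from true north (standard forward-azimuth formula)."""
    phi1, phi2 = math.radians(rider_lat), math.radians(driver_lat)
    dlon = math.radians(driver_lon - rider_lon)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def arrow_angle(rider_heading_deg, bearing_deg):
    """Angle the on-screen AR arrow should point, relative to the direction
    the camera faces: 0 means straight ahead, positive means turn right."""
    return (bearing_deg - rider_heading_deg + 180) % 360 - 180
```

If the driver is due east of the rider and the rider is already facing east, the arrow points straight ahead; if the rider is facing north, the arrow points 90° to the right.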

Within AR, content must be kept to a minimum to avoid information overload. I focused on what the user needs to know: data that builds trust, such as the driver’s name, photograph, vehicle model, and license plate. This pop-up acts as a second factor of authentication, eliminating any question of “Is this really my Uber?”
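The verification step itself is simple to sketch: whatever the camera reads off the approaching car’s plate only needs to match the plate on the booked ride, ignoring case and spacing. The helper names below are hypothetical, and the sketch assumes an upstream OCR step supplies the detected plate string.

```python
def normalize_plate(plate):
    """Strip spaces, hyphens, and case so 'jw6 142' matches 'JW6-142'."""
    return "".join(ch for ch in plate.upper() if ch.isalnum())

def is_my_uber(detected_plate, booked_plate):
    """True when the plate the camera read matches the booked ride's plate."""
    return normalize_plate(detected_plate) == normalize_plate(booked_plate)
```

A match confirms the pop-up’s driver details belong to the car in front of the rider; a mismatch means keep scanning.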

As the camera scans the surroundings, a clean and minimal grid captures the entirety of the vehicle. I scaled up Uber’s grid guidelines to create this effect and focused on maintaining consistency between the application and the AR environment. Within AR, it is important to give users a sense of where they came from — though users are put in a state of hyperreality, they should still feel the presence of Uber’s visual identity. At the same time, the user receives explicit textual feedback as the message changes to “Uber found” at the top of the screen.

Next Steps

  1. Market Analysis — Conduct in-depth research and prioritize locations to pilot test. Brainstorm future applications of the software.
  2. Usability Testing — Launch a pilot program with a select group of super-users in one location, such as an airport rideshare parking lot. These users should be on the high end of Uber trips taken, so they have the experience to give feedback on ease of integration from the user perspective. Then we can iterate and redesign based on their suggestions.
  3. User Research — Widen the test launch to select individuals from each of the user groups identified above (e.g., residents of urban areas, individuals with disabilities, students). Beyond quantitative data from surveys, conduct in-person or phone interviews to identify and address underlying pain points.

Predicting the Future: The Disruptive Power of Augmented Reality

With AR we are painting the world with information and bettering humanity by simplifying people’s daily lives. This easy integration of Augmented Reality will open up new doors to future applications of the technology. It will demonstrate how applying the intangible world of data to our tangible reality will make mundane tasks more efficient and enjoyable. You’ll never have to wait for your Uber again, you’ll walk into a store and be able to try on clothes digitally, and you’ll know what’s in your fridge without opening the door. There will be no more fear of this new, budding technology. Consumers will be asking for more.
