A Glimpse into Future Mobility Innovations at BMW
Role
XR Interaction Design Intern
Timeline
February - December 2023
Tools
Blender, Unity, Stable Diffusion, After Effects, Mapbox, Rokoko, XREAL
Context
I joined the BMW Innovation Lab in Munich, Germany, where I worked on shaping the future vision of the vehicle experience.

As part of a design-driven research team, we explored how XR as a medium can enhance communication between the vehicle and its passengers, creating a more comfortable and immersive in-car experience.





I contributed my navigation design to the BMW team,
presented at CES 2024 in Las Vegas.
Legacy
Information matters: beyond the HUD

BMW has consistently led advancements in augmenting driving experiences with digital information. In 2003, BMW introduced the groundbreaking Head-Up Display (HUD) in the 5 Series, projecting critical driving data directly onto the windshield. By 2015, BMW took another innovative leap with the MINI Augmented Vision concept—AR eyewear designed to enrich the passenger experience through dynamic, real-time visuals.

As technology rapidly evolved, our Innovation Lab in 2023 committed to furthering BMW's pioneering legacy. Reflecting on these historical milestones and analyzing current industry developments, we posed the question:

How can augmented reality further enhance and redefine the in-car experience?
Research
Spatial computing for the in-car experience
It all began with a single question:

How might we use AR to rethink what's possible inside the car?

What if we could place virtual content in the real-world environment?

With that capability, AR glasses can do more:

  • Large movable field of view
  • Great extension to a regular head-up display
  • Entertainment for the passenger (immersive experiences, enhanced gaming, 3D videos)
  • Improved safety for the driver (driving-relevant content displayed where it's needed, keeping eyes on the road)

Advantages of BMW AR Glasses over in-car displays



Display Content where you want

  • AR glasses offer a far larger, movable field of view than fixed in-car displays, so content can be placed anywhere in the environment.


Personalized content for everyone

  • Customized content for everyone in the car, transforming every ride into a unique experience.


Immersive riding experience

  • Virtual elements blend seamlessly with the real world to create fully immersive experiences.

Competitive Analysis (2023)



In 2023, automotive AR integration was prominently explored by pairings such as Li Auto + Rokid, Geely + Meizu, and Audi + Magic Leap. While these competitors demonstrated engaging multimedia experiences and effective data synchronization, their approaches lacked deeply immersive interactions and contextually relevant experiences tied closely to specific locations.

Recognizing these shortcomings as critical for meaningful AR passenger interactions, our efforts focus on overcoming these limitations to unlock the full potential of AR-enhanced in-car experiences.

Persona Context: CES in Las Vegas



This project was designed to be showcased at CES 2024 in Las Vegas, prompting us to tailor an experience flow for future-focused travelers: professionals who expect comfort, connectivity, and innovation on the move. This context shaped our priorities toward immersive, responsive in-car AR interactions.

Functions to bring to life



Our technology enables the integration of world-space elements into the in-car experience, allowing us to design beyond traditional screens and interfaces.

To explore these possibilities, we generated a wide range of ideas through brainstorming and mapped them on an impact-effort matrix. Each concept was positioned by user impact and ease of implementation, then categorized into three themes: Vehicle Control Integration, Environmental Awareness, and Entertainment. This method helped us identify high-value, feasible directions to prioritize in the design process.


AR Ride Use Cases Exploration



To better understand how AR can reshape the in-car experience, we explored a range of advanced use cases—from spatial navigation and hazard detection to immersive entertainment and parking assistance. These scenarios demonstrate the diverse ways AR can serve different driver and passenger needs.


Spatial Navigation

  1. AR glasses provide real-time, on-road navigation overlays, ensuring drivers keep their eyes on the road.
  2. Key route guidance, lane change prompts, and turn-by-turn directions are seamlessly integrated into the driver's field of view.



Signs & Hazard Warnings

  1. AR-enhanced hazard detection alerts drivers about pedestrians, cyclists, and unexpected obstacles.
  2. Real-time traffic sign recognition displays speed limits, stop signs, and warnings directly in the driver’s line of sight.
  3. Adaptive alerts for potential collisions, blind-spot monitoring, and lane departure warnings.


Parking

  1. Visualized parking distance alerts help drivers park precisely with augmented parking guidelines.
  2. 360-degree AR overlays enhance spatial awareness by integrating vehicle cameras and sensors.


Entertainment

  1. AR glasses offer immersive media experiences, allowing passengers to watch movies, play AR-enhanced games, or browse interactive content.
  2. Multi-screen capability enables personalized entertainment without interfering with the driver’s focus.

Image source: https://wayray.com/deep-reality-displa

BMW + XREAL: AR Platform



To support our concept, we developed a real-time AR pipeline integrating vehicle data, sensor fusion, and rendering. Content is processed through a Unity-based interface and streamed to the XREAL glasses, enabling precise spatial overlays with low latency.
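
A minimal sketch of the anchoring step in Unity C#, assuming a simplified telemetry struct; the actual data interface and smoothing values were project-specific:

using UnityEngine;

// Sketch only: "VehicleTelemetry" and its fields are assumptions for
// illustration; the real pipeline's data interface differed.
public struct VehicleTelemetry
{
    public Vector3 position;    // fused vehicle position in world space
    public Quaternion heading;  // fused vehicle orientation
}

public class VehicleAnchor : MonoBehaviour
{
    [SerializeField] private Transform mapRoot;    // root of the AR map overlay
    [SerializeField] private float smoothing = 8f; // damping hides sensor jitter

    // Called with the latest fused telemetry sample each frame.
    public void ApplyTelemetry(VehicleTelemetry sample)
    {
        // Exponential smoothing keeps overlays stable without adding
        // noticeable latency.
        float t = 1f - Mathf.Exp(-smoothing * Time.deltaTime);
        mapRoot.position = Vector3.Lerp(mapRoot.position, sample.position, t);
        mapRoot.rotation = Quaternion.Slerp(mapRoot.rotation, sample.heading, t);
    }
}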

Input Methods Exploration



From phone pointers to spatial buttons, we tested a variety of input methods. Some were ruled out early, while others—like gaze and gesture—proved effective. 
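
Gaze can be sketched as a dwell-to-select loop: cast a ray from the head camera and confirm after a short fixation. The component name, dwell threshold, and "OnGazeSelect" receiver below are illustrative, not production values:

using UnityEngine;

public class GazeSelector : MonoBehaviour
{
    [SerializeField] private Camera headCamera;        // head-pose camera of the glasses
    [SerializeField] private float dwellSeconds = 1.2f; // fixation time before selection

    private GameObject currentTarget;
    private float dwellTimer;

    private void Update()
    {
        Ray gaze = new Ray(headCamera.transform.position, headCamera.transform.forward);
        if (Physics.Raycast(gaze, out RaycastHit hit, 10f))
        {
            if (hit.collider.gameObject == currentTarget)
            {
                dwellTimer += Time.deltaTime;
                if (dwellTimer >= dwellSeconds)
                {
                    // Hypothetical receiver method on the gazed object.
                    hit.collider.SendMessage("OnGazeSelect", SendMessageOptions.DontRequireReceiver);
                    dwellTimer = 0f;
                }
            }
            else
            {
                currentTarget = hit.collider.gameObject; // new target resets the dwell
                dwellTimer = 0f;
            }
        }
        else
        {
            currentTarget = null;
            dwellTimer = 0f;
        }
    }
}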

Looking ahead, we see strong potential in more ambient, low-effort input methods—such as voice control and touch-sensitive fabric panels—tailored to the dynamic in-car context.
Design
Giving Form to the Invisible
Designing for AR inside the car means shaping information that doesn't live on a screen, but in space. We focused on creating visual systems that remain clear, responsive, and elegant in a moving environment, adapting to lighting, perspective, and context.

I designed a comprehensive set of simplified, modern visual elements tailored to the see-through nature of AR glasses. The goal was to maintain legibility and spatial clarity in various lighting conditions and to improve navigational awareness and safety.

Map System: Day & Night Visualization



To evaluate the legibility and spatial clarity of our AR navigation elements, I built a procedural visualization tool using OpenStreetMap data, enabling real-time previews within Blender. This allowed us to test design decisions against realistic environments, assessing how our UI would perform under motion and through the lens of transparent AR displays.

While a day mode was initially explored, it posed significant readability issues under bright lighting due to the see-through nature of AR glasses. Additionally, a darker visual system better adapts to the diverse range of interior materials and finishes across vehicles. We therefore shifted our focus toward optimizing for low-light conditions.

Map Periphery



Given the limited field of view in current AR hardware, I explored four distinct edge treatments to guide peripheral awareness and improve spatial continuity. These designs were tested to evaluate which approach best balanced subtlety with navigational clarity.
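
One of these treatments can be sketched as an angular falloff: elements fade as they leave the comfortable field of view instead of clipping at the hard edge. The thresholds below are assumptions for illustration:

using UnityEngine;

public static class PeripheryFade
{
    // Returns 1 inside the comfortable FOV, easing to 0 at the hard edge.
    public static float Opacity(Camera head, Vector3 worldPos,
                                float innerDeg = 18f, float outerDeg = 25f)
    {
        Vector3 toTarget = (worldPos - head.transform.position).normalized;
        float angle = Vector3.Angle(head.transform.forward, toTarget);
        // SmoothStep gives a gentle falloff between the two angular thresholds.
        return 1f - Mathf.SmoothStep(0f, 1f, Mathf.InverseLerp(innerDeg, outerDeg, angle));
    }
}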

Procedural LOD for Spatial Hierarchy



To enrich the 3D visual experience and support spatial understanding, I designed three procedural LOD models with varying levels of geometric detail. Each version was tested to determine how information density affects clarity, immersion, and performance in AR navigation.
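
A minimal sketch of how the three models could be swapped by camera distance (Unity's built-in LODGroup is an alternative); the distance thresholds are illustrative:

using UnityEngine;

public class MapTileLOD : MonoBehaviour
{
    [SerializeField] private GameObject[] lodModels; // index 0 = most detailed
    [SerializeField] private float[] maxDistances = { 50f, 150f, 400f };

    private void LateUpdate()
    {
        float d = Vector3.Distance(Camera.main.transform.position, transform.position);

        // Pick the first level whose distance band contains the camera.
        int level = lodModels.Length - 1;
        for (int i = 0; i < maxDistances.Length; i++)
        {
            if (d <= maxDistances[i]) { level = i; break; }
        }

        // Enable only the active level of detail.
        for (int i = 0; i < lodModels.Length; i++)
        {
            lodModels[i].SetActive(i == level);
        }
    }
}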

Prototype
From Design to Runtime
Beyond visual design, I developed a range of Unity-based prototypes to support spatial interaction, real-time rendering, and system integration. These included over 20 iterations of an AR navigation experience.

System Interface Diagram



To manage system complexity and ensure modularity, I designed and implemented a component-based interface architecture in Unity. This diagram outlines the core controllers I structured—including input handling, map logic, material management, and spatial visual effects—each separated by responsibility for clarity and scalability.
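
The responsibility split can be sketched as a set of interfaces plus a thin composition root; all identifiers below are illustrative, not the project's actual names:

// Each controller owns one concern from the diagram above.
public interface IInputController { void Tick(); }                         // gaze / gesture / phone input
public interface IMapController { void SetRoute(string routeId); }         // navigation and map state
public interface IMaterialController { void ApplyStyle(string styleKey); } // JSON-driven visual styles
public interface IFxController { void PlayTransition(string name); }       // spatial edge and reveal effects

// A thin composition root wires the controllers together, so each module
// can be developed and tested in isolation.
public sealed class ArRideSystem
{
    private readonly IInputController input;
    private readonly IMapController map;

    public ArRideSystem(IInputController input, IMapController map)
    {
        this.input = input;
        this.map = map;
    }

    public void StartNavigation(string routeId) => map.SetRoute(routeId);
    public void Update() => input.Tick(); // per-frame input pump
}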

Visual Styling

To enhance visual appeal and achieve cohesive edge styling, I developed a set of procedural Shader Graphs in Unity. 

These shaders allowed us to control the look and feel of map boundaries and spatial transitions through dynamic animation and layered effects—informing many of our later aesthetic and interaction decisions.
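
As a sketch of how such Shader Graph parameters can be animated from C# without duplicating materials, the snippet below uses a MaterialPropertyBlock; the exposed property names ("_EdgeGlow", "_RevealProgress") are assumptions:

using UnityEngine;

public class EdgeStyleDriver : MonoBehaviour
{
    private static readonly int EdgeGlow = Shader.PropertyToID("_EdgeGlow");
    private static readonly int Reveal = Shader.PropertyToID("_RevealProgress");

    [SerializeField] private Renderer edgeRenderer;
    [SerializeField] private float revealDuration = 0.8f;

    private MaterialPropertyBlock block;
    private float t;

    private void Awake() => block = new MaterialPropertyBlock();

    private void Update()
    {
        // Drive a one-shot reveal plus a subtle continuous pulse.
        t = Mathf.Min(t + Time.deltaTime / revealDuration, 1f);
        edgeRenderer.GetPropertyBlock(block);
        block.SetFloat(Reveal, t);
        block.SetFloat(EdgeGlow, Mathf.PingPong(Time.time, 1f));
        edgeRenderer.SetPropertyBlock(block); // no per-object material copy
    }
}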

I created a JSON-based visual stylesheet to systematically manage color, path weight, and edge behavior across map elements. 

This approach not only ensured a consistent visual language throughout the interface, but also made the system significantly easier to maintain, customize, and extend as the project evolved.
{
  "BMW_Landmark_Building": {
    "baseColor": "#58B4D4",
    "roofColor": "#2C8AAE",
    "emissiveHighlight": "#8ADFFF",
    "cornerRadius": 0.35,
    "material": {
      "useFresnel": true,
      "fresnelColor": "#A3ECFF",
      "fresnelPower": 3.0,
      "fresnelOpacity": 0.4,
      "useAO": true,
      "aoIntensity": 0.3,
      "aoMap": "AO_GenericSoft"
    },
    "outline": {
      "enabled": false,
      "color": "#B0F5FF",
      "thickness": 0.15
    },
    "animation": {
      "entranceFadeIn": true,
      "hoverPulse": true
    }
  }
}
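
A minimal loader sketch for this stylesheet, assuming Unity's Newtonsoft Json.NET package; "_BaseColor" is URP's Lit base-color property and may differ per render pipeline:

using Newtonsoft.Json.Linq;
using UnityEngine;

public static class StyleSheetLoader
{
    // Reads one element's base color from the JSON stylesheet and applies it.
    public static void ApplyBaseColor(string json, string elementKey, Material target)
    {
        JObject sheet = JObject.Parse(json);
        string hex = (string)sheet[elementKey]?["baseColor"];
        if (hex != null && ColorUtility.TryParseHtmlString(hex, out Color color))
        {
            target.SetColor("_BaseColor", color);
        }
    }
}

// Example usage with the stylesheet above:
// StyleSheetLoader.ApplyBaseColor(stylesheetText, "BMW_Landmark_Building", landmarkMaterial);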