
The Road Ahead for Augmented Reality

A heads-up look at augmented reality head-up displays.

Automotive head-up displays (HUDs), systems that transparently project critical vehicle information into the driver’s field of vision, were originally developed for military aviation. The name stems from the pilot’s ability to view information with the head positioned “up” and looking forward, rather than positioned “down” to look at the cockpit gauges and instruments. The HUD projects and superimposes data in the pilot’s natural field of view (FOV), with the added benefit of eliminating the pilot’s need to refocus when switching between the outside view and the instruments, which can affect reaction time, efficiency, and safety, particularly in combat situations.

In cars, the main concern is distracted driving, or the act of taking the driver’s attention away from the road. According to the U.S. National Highway Traffic Safety Administration, distracted driving claimed 3,142 lives in 2019, the most recent year for which statistics have been published. Looking away from the road for even five seconds at a speed of 55 mph is the equivalent of driving the length of a football field with one’s eyes closed.
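The football-field comparison follows from simple unit conversion; the sketch below checks it (the 360-ft field length, end zones included, is an assumption used only for the comparison):

```python
# Verify the NHTSA comparison: distance covered in 5 seconds at 55 mph.
MPH_TO_FT_PER_S = 5280 / 3600  # 1 mile = 5,280 ft; 1 hour = 3,600 s

def distance_traveled_ft(speed_mph: float, seconds: float) -> float:
    """Distance covered, in feet, at a constant speed."""
    return speed_mph * MPH_TO_FT_PER_S * seconds

dist = distance_traveled_ft(55, 5)
print(round(dist))  # 403 ft -- longer than a 360-ft football field
```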

As such, the desire to ensure that drivers keep their eyes focused on the road, instead of looking down at the gauges on the dashboard, was the impetus for the development of HUDs suitable for use in production automobiles. The first automotive HUD that was included as original equipment was found on the 1988 Oldsmobile Cutlass Supreme and Pontiac Grand Prix; both were monochromatic, and displayed only a digital readout of the speedometer.

Thanks to the increasing inclusion of a variety of automotive sensors and cameras, advanced driver assistance system (ADAS) features and functions (such as automatic braking, forward collision avoidance, lane-keeping assist, and blind-spot monitoring, among others), and more powerful on-vehicle processors, automakers have been installing HUD units in commercial vehicles that provide more essential driving data, such as speed, engine RPMs, compass heading, directional signal indicators, fuel economy, and other basic information, allowing the driver to concentrate on the road instead of looking down to check the dash or an auxiliary screen.

The technology enabling most types of HUD is based on the use of a processor to generate a digital image of data coming from sensors. These images then are digitally projected from a unit located in the dash of the car onto a mirror or mirrors, which then reflect that image onto either a separate screen located behind the steering wheel, or onto the vehicle’s windshield, directly in the driver’s forward view. Common projection and display technologies used include liquid crystal display (LCD), liquid crystal on silicon (LCoS), digital micromirror devices (DMDs), and organic light-emitting diodes (OLEDs), which have replaced the cathode ray tube (CRT) systems used in the earliest HUDs, as they suffered from brightness degradation over time.
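As a rough illustration of the image-generation step described above, the processor can be thought of as mapping sensor readings to a draw list that the projector then renders. The glyph names, positions, and sensor keys below are invented for this sketch and are not any vendor’s API:

```python
from dataclasses import dataclass

@dataclass
class Glyph:
    name: str  # e.g., "speed", "compass"
    text: str  # rendered label
    x: int     # position in the projector's frame buffer (illustrative units)
    y: int

def compose_frame(sensors: dict) -> list:
    """Map raw sensor readings to the draw list the projector renders."""
    frame = []
    if "speed_mph" in sensors:
        frame.append(Glyph("speed", f"{sensors['speed_mph']:.0f} mph", 10, 10))
    if "heading" in sensors:
        frame.append(Glyph("compass", sensors["heading"], 10, 30))
    return frame

frame = compose_frame({"speed_mph": 55.4, "heading": "NW"})
print([g.text for g in frame])  # ['55 mph', 'NW']
```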

The HUDs that project the information onto a separate transparent screen are called combiner HUDs; these were popular because they required only modest physical space to install and, because the system was fully integrated, OEMs did not need to design around each vehicle’s unique windshield angle or position. However, this type of HUD is limited by several factors; namely, the optical viewing path of a combiner HUD is shorter than looking through a windshield, so the driver’s eyes must refocus slightly when switching between looking out the windshield and checking the display. Furthermore, there is a practical limit to the size and field of view offered by combiner units; adding mirrors and a larger combiner screen would appear obtrusive and less elegant in a modern vehicle than simply using the windshield as a display surface.

Because HUDs were far from being standard automotive equipment in most vehicles, companies such as HUDWAY and Navdy had produced phone mounts and screens designed to allow a smartphone to operate as a head-up display. Essentially, these designs functioned as combiner systems, in that they required a separate screen on which to view the display and suffered from many of the same limitations as OEM-equipped combiner HUD systems. While Navdy went out of business in 2018, HUDWAY is still accepting orders for its HUDWAY Drive system at a cost of $229 per unit.

The technical limitations of combiner systems have driven most automotive OEMs to offer HUDs, known as W-type HUDs, that project information directly onto the windshield and display a far greater amount of data. These more-advanced systems incorporate ADAS status information, such as the state of adaptive cruise control, automatic braking, collision-avoidance, infrared night-vision, and lane-keeping assist technologies, and, eventually, data from semi-autonomous self-driving systems.

The most advanced systems include augmented reality technology, which involves superimposing specific enhanced symbols or images into the HUD onto real-world objects or roadways to provide more information, detail, and clarity to the driver. Some systems will also incorporate data from GPS navigation systems, such as clear directional graphics, street names, augmented lane markings, signposts and route numbers, and even representations of other vehicles/objects on the road. Examples of vehicles that include this technology today include the Audi Q4 E-tron, Mercedes Benz S Class, and the Hyundai IONIQ 5.

The major challenges faced by HUD designers include collimation, luminance, and clarity. Collimation refers to aligning the light rays from the projector so they are parallel to one another; the projected image then appears to float ahead of the vehicle and blends seamlessly into the outside world, ensuring the driver can watch the road and the display without refocusing. OEMs must integrate each system into each vehicle so drivers of different heights can still have their eyelines in the correct position to view the HUD image properly.

There are a few different ways to accomplish this: adjusting the eyebox height by using a small motor to tilt one of the HUD mirrors up or down; applying more graphics processing, combined with a driver-monitoring system, to compensate for the change in alignment; or designing a larger eyebox so the HUD’s optical axis does not need to be adjusted and graphical alignment is maintained. Often, a combination of the latter two approaches is used to provide optimal viewing angles for drivers of differing heights and seating positions.
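The first approach, motorized mirror tilt, can be sketched with small-angle geometry: a mirror tilted by an angle theta deflects the reflected beam by twice theta. The optical path length and tilt limit below are placeholder assumptions, not figures from any production HUD:

```python
import math

MIRROR_TO_EYEBOX_M = 0.8  # assumed optical path from mirror to eyebox center
MAX_TILT_DEG = 3.0        # assumed mechanical limit of the tilt motor

def mirror_tilt_deg(eye_offset_m: float) -> float:
    """Mirror tilt needed to recenter the eyebox on the driver's eyes.

    A mirror tilted by theta deflects the reflected beam by 2*theta,
    so the required tilt is half the angular offset of the eyes,
    clamped to the motor's mechanical range.
    """
    beam_angle_deg = math.degrees(math.atan2(eye_offset_m, MIRROR_TO_EYEBOX_M))
    tilt = beam_angle_deg / 2
    return max(-MAX_TILT_DEG, min(MAX_TILT_DEG, tilt))

# A taller driver whose eyes sit 5 cm above the nominal eyebox center:
print(round(mirror_tilt_deg(0.05), 2))  # 1.79 (degrees)
```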

Luminance, meanwhile, refers to the brightness of the display, which must be visible in all lighting conditions, from bright, sunny days, to overcast conditions and, of course, at night.


Another challenge faced by HUD designers today is clarity: deciding what information should or should not be displayed on a HUD, and how it should be displayed. Moreover, there is a concern about the potential for information overload. “The risk of cognitive overload due to screen clutter caused by displaying location-based advertising messages will need to be addressed,” says Anuhab Grover, a mobility research analyst with market research firm Frost & Sullivan, referring to the likelihood that external partners will want to display to drivers additional messages or graphics that are not critical to driving. “It will be critical that AR is used in a minimalistic way and displays only relevant information needed to improve the driver’s perception. For this, user experience and user interface design will be highly important.”

Augmented reality technology can assist in this task by using graphical images that convey information in a form the driver can easily understand, while also increasing the field of view and reducing the physical volume of the HUD system.

Some of the notable technologies that have been prototyped, and are expected to make their way into production vehicles within the next five years, include AR HUDs that utilize laser holographic and waveguide projection technologies. In laser holographic AR HUDs, a laser beams AR content from the dashboard onto a windshield embedded with holographic film embossed with images that appear with a three-dimensional effect, so projected AR data appears to be overlaid onto real-world objects. Similarly, holographic displays that use waveguides, optical pathways through which light can travel, leverage the optical principles of diffraction, wavelength diversity, and reflection to project the image onto a hologram incorporated into the windshield, instead of using a traditional DLP projector. The key benefits of these approaches include increased image sharpness and a larger eyebox, which allows drivers to adjust their seats or posture without losing the ability to view the AR HUD.

Grover says AR HUDs in development that utilize these types of advanced technology can provide sharper images across a wider field of view, while requiring far less physical space to install in a vehicle, a key consideration for OEMs. Today’s AR HUDs typically take up about 15 to 20 liters in physical volume, or just a bit more than a typical shoe box, while newer designs using alternatives to DLP projectors and mirrors may reduce them to around one-sixth of that size.

For example, German multinational automotive parts manufacturer Continental Automotive and Sunnyvale, CA-based DigiLens, which makes holographic waveguides for augmented extended reality (XR) displays, have jointly developed a next-gen AR HUD prototype which uses a holographic waveguide projector to display visual information directly on the windshield.

Similarly, Swiss-based HUD maker WayRay, which has partnered with a wide range of automotive OEMs, uses holographic laser projection incorporating a holographic optical element (HOE) that displays the holograms and can be molded onto a flexible or curved surface, such as a windshield. The system also uses a picture-generating unit (PGU) installed in the car’s dashboard, comprising a laser module, a digital light processing unit, and correction optics modules; the HOE reflects the image projected by the PGU.

Meanwhile, Panasonic’s AR HUD also incorporates laser holographic technology, which was developed by U.K.-based holographic technology startup Envisics, and integrates spatial AI technology developed by California’s Phiar Technologies. In addition to displaying the road ahead, this system can identify and warn the driver about foreign objects, other vehicles, pedestrians, and cyclists, while also providing information such as the height of an upcoming overpass.

Tier-1 automotive parts supplier Denso is also working on an AR HUD, though the company declined to provide specifics, citing confidentiality agreements with OEMs.

Another current prototype comes from Scotland’s Ceres Holographics, which is working on the development of holographic transparent displays and AR systems that utilize thin-film holographic optical elements (HOEs). The system uses a small projector embedded in the car’s dashboard and a layer of holographic film laminated within the windshield glass to project an in-plane hologram of the car’s instrument cluster and navigation directly into the driver’s line of sight.

According to Andy Travers, CEO of Ceres Holographics, the key challenge of incorporating the film into the windshield is rooted in the laws of physics. “The issue with holographic AR HUD, and in particular when the film is in the windshield, is with incoming light at particular directions,” Travers says. “In certain circumstances, it can produce a rainbow artifact on the windshield, which is deemed distracting. So, while we and other companies try to address this issue, there is a debate in the automotive companies as to what level of artifact is acceptable or not acceptable.”

The information presented on an AR HUD is the result of sensor fusion, which blends internal vehicle sensory information, including conventional HUD data with input from radar sensors and other vehicle sensor data, with digital map and GPS data to provide a virtual view of the world outside the vehicle, according to Grover. “In the future, advanced AR HUDs will be capable to project complex graphics fully integrating and adapting with the driver’s environment,” Grover says. “For instance, even on a foggy night using a car’s thermal sensors, AR HUDs will be able to detect and display animal or human objects,” improving safety in such low-light or otherwise adverse weather conditions.
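The fusion step Grover describes can be sketched, in simplified form, as merging per-sensor detections into a single prioritized overlay list. The sensor names, confidence threshold, and data layout here are illustrative assumptions, not a real ADAS stack:

```python
def fuse(radar_objects, thermal_objects, nav_hint):
    """Merge sensor detections and navigation data into HUD overlays."""
    overlays = []
    for obj in radar_objects:
        overlays.append({"type": "vehicle", "range_m": obj["range_m"]})
    for obj in thermal_objects:
        # Thermal detections matter most when cameras see little (fog, night);
        # discard low-confidence returns to avoid screen clutter.
        if obj["confidence"] >= 0.7:
            overlays.append({"type": obj["kind"], "range_m": obj["range_m"]})
    if nav_hint:
        overlays.append({"type": "turn", "label": nav_hint})
    # Closest hazards first; navigation hints (no range) sort last.
    overlays.sort(key=lambda o: o.get("range_m", float("inf")))
    return overlays

result = fuse(
    [{"range_m": 42.0}],
    [{"kind": "pedestrian", "range_m": 18.5, "confidence": 0.9},
     {"kind": "animal", "range_m": 60.0, "confidence": 0.4}],
    "Left turn in 200 m",
)
print([o["type"] for o in result])  # ['pedestrian', 'vehicle', 'turn']
```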

According to market research firm Gartner’s 2020 Priority Mix for Automotive Technologies, non-AR HUDs (those that simply project information such as speed, RPMs, or compass heading in front of the driver, but do not superimpose data onto real-world objects) are expected to become mainstream technology within two years, while AR technology is projected to become mainstream in consumer vehicles within five to 10 years. Major automotive OEMs and Tier-1 suppliers focused on developing and commercializing AR HUDs, including Daimler, BMW, Jaguar, and Hyundai, have forged partnerships and strategic alliances with, or invested in, technology companies, such as Porsche (WayRay), Volkswagen (SeeReal Technologies), General Motors and Hyundai Mobis (Envisics), Continental (DigiLens), and Ford (Mishor 3D).

AR HUD technology likely will help drivers stay more informed of their vehicle’s operational systems, navigation directions, and infotainment systems (such as radio stations), while letting them stay focused on the road ahead. However, “prioritizing situational and contextual content will be crucial for AR HUDs,” Grover says, noting that “an AI and deep learning application engine could play that role by prioritizing information [that] the driver needs to see.”

*  Further Reading

Ogbac, S., What are Head-Up Displays? And Are They Worth It?, Motor Trend, July 14, 2020, https://www.motortrend.com/news/head-up-display/

Augmented Reality Head-Up-Display on Audi Q4, AirCar, March 9, 2021, https://www.youtube.com/watch?v=4HW0WsqDP-E

Hudway: https://hudway.co/glass

Holographic Film – What is it, and where is it used?, https://nobelusuniversity.com/2017/05/05/holographic-film-what-is-it-and-where-is-it-used/

Top Head-Up Display Companies, https://www.ventureradar.com/keyword/Head-Up%20Display
