News
Turning Automotive Windows into the Ultimate HMIs 2020-08-24

Connected vehicles equipped with onboard processing power and advanced sensors that collect gigabytes of data are becoming the norm in the automotive industry. As a result, the often‐overlooked world of human‐machine interface (HMI) design is now in the limelight. Indeed, automotive original equipment manufacturers (OEMs) increasingly focus their resources on creating effective, intuitive HMIs to better leverage technological advancements in today's vehicles.


One area of particular interest is the use of windows as an HMI display, which enables communication with drivers, passengers, and the outside world. This article explores potential applications and technologies for automotive window displays.


Augmented Reality Head‐Up Displays


Head‐up displays (HUDs) are a great example of how to use a vehicle's windscreen as a display. General Motors (GM) was the first to embrace the technology: In 1988, it built 50 Indy Pace Car edition Oldsmobile convertibles equipped with HUDs that projected a digital speedometer and turn‐signal indicators. Much like today, GM's original HUD displayed basic information via a relatively small two‐dimensional (2D) image that floated out near the car's front bumper. With technological advancements in today's vehicles, such as advanced driver‐assistance systems (ADAS) and onboard navigation systems, there is a need for a more effective HMI. To support this requirement, OEMs are working on next‐generation augmented reality (AR) HUDs.



Unlike traditional HUDs, AR HUDs have a wider field of view (FOV) and can interact with more of the real‐world road scene. They also project graphics farther out, enabling them to fuse with and mark real‐world objects. To create the illusion of fusion with the real world, the graphics must be projected out a minimum of 7 meters (m) from the driver. The overall visual effect and driver experience improve when graphics are projected out even farther, with the majority of AR HUDs supporting 10‐ to 15‐m projection distances. The distance at which the graphics are projected is called the virtual image distance (VID).
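The geometry behind these numbers is straightforward: for a given angular FOV, a longer VID means the virtual image plane must be physically wider. The sketch below works this out for the 7‐, 10‐, and 15‐m projection distances mentioned above; the 10‐degree horizontal FOV is an illustrative assumption, not a figure from the article.

```python
import math

# Sketch: width of the flat virtual image needed to fill a given
# horizontal FOV at a given virtual image distance (VID).
# The 10-degree FOV is an assumed, illustrative value.

def virtual_image_width(vid_m, fov_deg):
    """Width (m) of a virtual image subtending fov_deg horizontally
    when projected vid_m from the driver's eye point."""
    return 2.0 * vid_m * math.tan(math.radians(fov_deg) / 2.0)

for vid in (7.0, 10.0, 15.0):  # projection distances cited in the text
    width = virtual_image_width(vid, 10.0)
    print(f"VID {vid:4.1f} m -> virtual image width {width:.2f} m")
```

The linear growth of the image with VID is one reason long projection distances drive up the optical and packaging demands discussed next.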


Among the key challenges in designing an AR HUD are meeting luminance, solar irradiance, and size requirements. If you double the display's area, you must increase the HUD imaging source's light output by an equal factor. The same relationship holds true for the eye box: Doubling the eye‐box area doubles the required light. Double the eye box and display area, and you'll need to increase the light output by a factor of four. Choosing an efficient imaging technology that can meet luminance, power, and thermal requirements is an important step in the AR HUD design process.
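The scaling rule above, where required light output grows linearly with both display area and eye‐box area, can be captured in a few lines. The baseline output value below is an arbitrary placeholder, not a real HUD specification.

```python
# Sketch of the luminance scaling rule: required imager light output
# scales linearly with display area and with eye-box area.
# The baseline value is an assumed placeholder, not a real spec.

def required_light_output(base_lumens, display_scale, eyebox_scale):
    """Scale a baseline light output by the relative growth of the
    display area and the eye-box area (both linear multipliers)."""
    return base_lumens * display_scale * eyebox_scale

base = 100.0  # assumed baseline output, arbitrary units

# Doubling the display area alone doubles the requirement.
print(required_light_output(base, 2.0, 1.0))  # 200.0

# Doubling both display and eye-box area quadruples it.
print(required_light_output(base, 2.0, 2.0))  # 400.0
```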


Managing solar irradiance in today's traditional HUDs already poses a significant design challenge. Managing solar irradiance in an AR HUD (with a VID of 10 m or more and a large FOV that lets in more of the sun's energy) is even more difficult. The higher optical magnification of an AR HUD concentrates the solar irradiance to levels that easily can damage the HUD's imager panel. Solar irradiance must be determined carefully, with the AR HUD designed to handle a worst‐case temperature rise without derating or turning off.
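A back‐of‐the‐envelope estimate shows why this concentration is dangerous: run in reverse, a magnifying optical path focuses incoming sunlight onto the imager by roughly the square of its linear magnification. Both the magnification factor and the solar irradiance value below are illustrative assumptions.

```python
# Back-of-the-envelope sketch of solar loading on a HUD imager:
# the magnifying optics, traversed in reverse by sunlight, concentrate
# irradiance by roughly the square of the linear magnification.
# Magnification and solar constant are illustrative assumptions.

SOLAR_IRRADIANCE = 1000.0  # W/m^2, a typical clear-sky ground-level value

def irradiance_at_imager(linear_magnification, ambient=SOLAR_IRRADIANCE):
    """Approximate worst-case irradiance (W/m^2) on the imager panel,
    ignoring optical losses, for a given linear magnification."""
    return ambient * linear_magnification ** 2

print(irradiance_at_imager(5.0))  # 25000.0 W/m^2 for an assumed 5x path
```

Even this lossless approximation shows how quickly a larger FOV and longer VID, which imply higher magnification, push the thermal load on the imager.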


With its use of an intermediate diffuser screen as the image source for the HUD, DLP technology has excellent solar irradiance performance. The diffuser screen passes and disperses the concentrated solar irradiance, limiting the temperature rise (on the diffuser's surface and the AR HUD's interior) to manageable levels.


By far, the most significant challenge in AR HUD design today is size. With a traditional optics approach that uses a fold mirror and a large aspherical mirror (Fig. 3), the HUD's size easily can approach 20 liters (20,000 cm³). Most OEMs simply don't have this much free space in the dash. To solve this problem, the industry is exploring waveguide and holographic film technologies.


In a waveguide, light is injected into a small port at one end and travels along the waveguide via multiple total internal reflections. Holographic or diffractive optical elements are then used to emit portions of the light along the length of the waveguide. This expands the beam and preserves the incoming light's ray angle, also known as pupil expansion.
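The total internal reflections that trap light inside the waveguide occur only for rays striking the glass/air boundary above the critical angle given by Snell's law. The sketch below computes that angle; the refractive index is a typical value for optical glass, assumed here for illustration.

```python
import math

# Minimal sketch of the total-internal-reflection (TIR) condition that
# keeps injected light trapped inside a waveguide: rays hitting the
# core/air boundary above the critical angle reflect instead of escaping.
# The refractive index is a typical assumed value for optical glass.

def critical_angle_deg(n_core, n_outside=1.0):
    """Critical angle (degrees, from the surface normal) for TIR at a
    core/outside interface, via Snell's law: sin(theta_c) = n_out/n_core."""
    return math.degrees(math.asin(n_outside / n_core))

n_glass = 1.5  # assumed refractive index of the waveguide core
print(f"critical angle ~ {critical_angle_deg(n_glass):.1f} deg")
```

Rays guided above this angle bounce along the slab until a holographic or diffractive element deliberately couples a portion of them out, which is what produces the pupil expansion described above.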


A holographic film contains microscopic structures printed into the film and designed to act as a holographic optical element (HOE). An HOE can replace traditional optical elements such as lenses and mirrors; in a HUD, it typically replaces the large aspherical mirror. Both waveguides and holographic films significantly shrink package volumes, making it easier to fit an AR HUD into the vehicle's dash. Waveguides are installed in the dash much like traditional HUDs, but their height and overall package volume are significantly smaller. With a holographic AR HUD, a small projector with magnification optics is installed in the dash and a holographic film is laminated into the windscreen. These new technologies not only shrink HUD package size but also enable much larger FOVs, supporting HUDs with 15 × 5‐degree FOVs or larger. Both waveguide and holographic AR HUDs require laser‐based light sources.


DLP technology is light source‐agnostic and supports both laser and LED illumination sources, making it an excellent imager choice for these new technologies. OEMs are working to bring both waveguide and holographic film AR HUDs to market by 2025.


Source: onlinelibrary.wiley.com
