How smart windscreens will revolutionise safety
In this blog piece, PG Glass explains why today’s simple sheet of glass shielding a vehicle’s occupants will soon become so much more…
Experts predict that consumer demand for ever-safer vehicles and up-to-the-minute technologies will drive the adoption of smart windscreens despite cost barriers. Lisa Dorn, reader in driver behaviour and director of the driving research group at Cranfield University, says the automotive industry can learn a lot from fighter pilots, who have been using this kind of technology for decades.
“There’s a lot of transferrable knowledge from the aviation industry that can assist here: for pilots there are many complex instructions and information, so it’s a question of how these can be presented without distracting the driver.”
“Safety will be paramount in the development of the connected car or ‘networked vehicle’ (the information that is relayed between vehicles, with vehicles talking to each other) but also in the way automakers will embed intelligent sensing systems within them, which will probably be camera-based,” says Chris Davies of Belron Technical.
Heads-up display (HUD)
Augmented reality displays will lead the way into this brave new world of safer motoring. A well-established fixture in fighter pilots’ cockpits, HUD has trickled down from military aviators into the hands of drivers as part of a new wave of affordable smart-windscreen technology.
The first signs of the next generation of heads-up augmented reality displays could be seen at the 2014 New York International Auto Show, where Land Rover’s Discovery Vision concept incorporated a smart glass roof and windows capable of displaying images and deploying eye-tracking technology. In the future, smart windscreens will be on the lookout for pedestrians, too.
Ludger Kersting, director of business-to-business marketing and sales at ADAC, predicts car windscreens will be able to integrate “warnings for pedestrians who the driver doesn’t see: a sensor that can realise pedestrians are coming from the side, or crossing the street from behind the car. And specific signs for different types of danger – snow, ice on the road, wet conditions”.
“The most important thing for future HUDs is safety-relevant information,” says Philip Puls, head of technical service at TÜV SÜD Auto Service. “That means braking distance to the vehicle in front, or the emergency brake distance.”
Augmented reality GPS will be looking at the bigger picture rather than just immediate pitfalls and journey directions. “There will be windscreen-to-cloud communications,” says Hans Roth, director of business development at Harman. “The windscreen will be able to gather traffic information uploaded to the cloud by other vehicles and use it to suggest route changes and predict possible problems ahead.”
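To make Roth’s idea a little more concrete, here is a minimal Python sketch of how crowd-sourced speed reports might feed a route suggestion. The `TrafficCloud` class, the segment names and the congestion threshold are purely illustrative assumptions of ours, not part of any real Harman or in-car system.

```python
# Hypothetical illustration only: a toy, in-memory stand-in for the kind of
# cloud service described above. Segment names and thresholds are invented.
from collections import defaultdict
from statistics import mean

class TrafficCloud:
    """Aggregates speed reports uploaded by vehicles, keyed by road segment."""
    def __init__(self):
        self.reports = defaultdict(list)   # segment id -> reported speeds (km/h)

    def upload(self, segment, speed_kmh):
        self.reports[segment].append(speed_kmh)

    def congested(self, segment, threshold_kmh=30):
        speeds = self.reports.get(segment)
        return bool(speeds) and mean(speeds) < threshold_kmh

def suggest_route(cloud, planned_route, alternative_route):
    """Swap to the alternative route if any planned segment looks congested."""
    if any(cloud.congested(seg) for seg in planned_route):
        return alternative_route
    return planned_route

# Other vehicles report slow traffic on the motorway segment "M1-J14";
# the windscreen's navigation layer then proposes the A-road alternative.
cloud = TrafficCloud()
for speed in (12, 18, 9):
    cloud.upload("M1-J14", speed)
print(suggest_route(cloud, ["M1-J13", "M1-J14"], ["A509", "A422"]))
```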
“And we can’t ignore the integration of the motor vehicle into personal technology ecosystems,” adds Chris Davies of Belron. “We see windscreens becoming the key interface. In that sense, we expect heads-up augmented reality displays will be only a stepping stone to a safe ‘full-screen’ technology.”
Eye tracking
Eye-tracking sensors will be embedded in every smart windscreen. They will monitor a driver’s alertness levels and nudge their vehicle to react automatically to hazards the system knows the driver has failed to spot. Eye tracking will enhance the effectiveness of HUD systems, ensuring that information projected on the windscreen is always in the driver’s line of sight. Crash recorder systems will be part of this, too.
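As a rough illustration of how gaze data could drive this kind of behaviour, the short Python sketch below checks whether any recent gaze sample falls near a detected hazard and escalates from a HUD warning to an automatic intervention if not. The function names, coordinates and thresholds are hypothetical assumptions for the sketch, not taken from any manufacturer’s implementation.

```python
# Hypothetical illustration only: a simplified decision rule for an
# eye-tracking smart windscreen. Names and thresholds are invented.
from dataclasses import dataclass
import math

@dataclass
class Hazard:
    x: float                 # horizontal position on the windscreen plane (degrees from centre)
    y: float                 # vertical position (degrees from centre)
    seconds_to_impact: float

def driver_has_seen(gaze_points, hazard, fov_degrees=5.0):
    """True if any recent gaze sample falls within a small cone around the hazard."""
    return any(
        math.hypot(gx - hazard.x, gy - hazard.y) <= fov_degrees
        for gx, gy in gaze_points
    )

def respond_to_hazard(gaze_points, hazard):
    """Escalate from a HUD warning to automatic braking if the driver has not looked."""
    if driver_has_seen(gaze_points, hazard):
        return "no action"                                    # driver appears aware
    if hazard.seconds_to_impact < 1.5:
        return "automatic emergency braking"                  # system reacts for the driver
    return "project warning icon at hazard position on HUD"

# Example: recent gaze samples are all to the left of a pedestrian
# approaching from the right, 2.4 seconds away.
print(respond_to_hazard([(-10.0, 0.0), (-8.0, 1.0)],
                        Hazard(x=12.0, y=-2.0, seconds_to_impact=2.4)))
```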
“Eye tracking has huge potential to cut the number of ‘fail-to-look’ crashes – the most common type of car accident, especially among young drivers who, research shows, don’t look into the distance properly,” says Neil Greig, director of policy and research at the Institute of Advanced Motorists.
Jim Motavalli, contributor to The New York Times, agrees: “Considering the number of people who fall asleep at the wheel, applications that can be used to ensure that the driver is alert are a really good use of technology.”
Touch or talk?
Touch-screen, voice control or gesture – which technology will win? It’s tempting to believe that smartphone-style touch-screen technology will migrate into every cabin over the next decade, but there are huge practical difficulties.
“The distance to the windscreen in a normal car has become bigger and bigger in the last couple of years. The driver has to move his or her whole body forward to touch the windscreen. It’s just too far away to use as a touchscreen,” says Kersting.
Reaching out to touch the windscreen while driving is a manoeuvre fraught with potential dangers. “Anything that takes a driver’s hands away from the main controls of the vehicle is a potential distraction,” says Greig. Instead, voice will become the primary tool for controlling the smart windscreen by 2025.
“I would see voice control or voice-activated technology as being less intrusive and potentially less of a health and safety consideration,” says Julie Jenner, director of ACFO.
Dorn agrees: “It’s a no-brainer – the most obvious way that humans interact is via language, which is probably the most effective way to make the user experience as safe and comfortable as possible. But research needs to understand how best to use voice-activated technology without impacting on safety.”
Heads-up augmented-reality display systems in smart windscreens will make voice-activated technology far safer than current iterations. “Today, if we use a voice command to make a smartphone call we often have to look down at the instrument fascia or the infotainment screen to read screen prompts,” says Roth.
“By projecting prompts and information onto the windscreen, we ensure that the driver keeps his or her eyes on the road ahead at all times.”
Scott Sinclair, automotive industry manager at Google, describes it as “empowering drivers not to have to look for buttons. If you can do this, voice control is a great thing to have in vehicles”.
A further evolution could see the introduction of windscreen sensor systems that recognise gesture commands. Toyota is collaborating with Microsoft on a concept Sienna minivan that uses a version of the company’s Xbox Kinect system to interpret gestures in 3D.