Is the automotive HMI of the future a blank expanse of wood, cloth or plastic?
In a concept BMW dubs “shy tech”, showcased in its Vision iNEXT, controls lurk beneath the surface, ready to be engaged by a touch, gesture or voice. “Mechanical controls are on their way out,” says Ali Foughi, CEO of NextInput, a supplier of sensing technology for industries including automotive. NextInput’s solutions use MEMS-based force sensors.
Foughi may be a bit biased, but Ulrich Lueders, director of strategy and portfolio in Continental’s human-machine interface business unit, agrees, saying shy tech is “one of the most interesting technologies” for the HMI. Foughi notes there are advantages beyond styling: mechanical gadgets inevitably break with use, while NextInput’s controls, he claims, will survive up to 50 million actuations.
He adds: “These controls are very intuitive and also safe. While you’re driving you don’t have to look down and make sure the control is actuated.” Of course, it’s a problem if the driver has to feel around the dashboard to find the right spot. The key to safety, Foughi says, is confirming the touch, which can come from LED lights below the surface, imprints on the smart surface or haptic feedback when the right spot is touched.
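At its core, a hidden control like this reduces to a force threshold plus one-shot confirmation feedback. Here is a minimal C++ sketch of that pattern, assuming a generic force-sensor reading and actuator hooks; the struct, thresholds and zone names are hypothetical, not NextInput’s actual API.

```cpp
#include <cstdint>
#include <iostream>

// Hypothetical reading from a force sensor embedded beneath the trim.
struct ForceSample {
    std::uint32_t zone_id;   // which hidden control was touched
    float         newtons;   // measured press force
};

// Debounced press detection with confirmation feedback. Thresholds are
// illustrative; a production design would calibrate per surface material.
class HiddenControl {
public:
    explicit HiddenControl(float threshold_n) : threshold_n_(threshold_n) {}

    void onSample(const ForceSample& s) {
        const bool pressed = s.newtons >= threshold_n_;
        if (pressed && !was_pressed_) {
            confirm(s.zone_id);   // fire feedback exactly once per press
        }
        was_pressed_ = pressed;
    }

private:
    void confirm(std::uint32_t zone) {
        // Stand-ins for real actuators: an LED under the surface and a
        // haptic pulse so the driver never has to glance down.
        std::cout << "zone " << zone << ": LED on, haptic pulse\n";
    }

    float threshold_n_;
    bool  was_pressed_ = false;
};

int main() {
    HiddenControl volume{2.5f};   // ~2.5 N press threshold (illustrative)
    volume.onSample({7, 0.4f});   // light graze: ignored
    volume.onSample({7, 3.1f});   // firm press: confirmed
    volume.onSample({7, 3.0f});   // still held: no repeat
}
```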
Constructed reality
Cameras and sensors studded around the car let ADAS systems warn the driver of unseen hazards. Augmented reality (AR) head-up displays are one way to provide this. Continental’s Transparent Hood takes another approach, giving drivers actual imagery of surroundings that usually can’t be seen.
You could think of this as the opposite of augmented reality: instead of adding information to the existing visual field, this system extends the visual field itself. It uses cameras installed at the front of the radiator grille and at the bottom of the side mirrors. An image-processing algorithm combines that camera imagery with other vehicle sensor data to construct an image of what’s under the vehicle. The result is a view of what’s hidden by the car’s hood.
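Continental hasn’t published its algorithm, but the core trick can be pictured simply: the patch of ground now under the car was visible to the front camera moments ago, so the system can recall the frame captured when that patch was still ahead of the bumper. The C++ sketch below, with invented names, reduces this to straight-line odometry; a real implementation would also warp and stitch perspective imagery using steering, speed and suspension data.

```cpp
#include <deque>
#include <iostream>
#include <string>

// Toy stand-in for a camera frame of the road surface ahead.
struct Frame {
    double odometer_m;   // distance reading when the frame was captured
    std::string pixels;  // placeholder for image data
};

// Keeps recent front-camera frames so the patch of road now hidden under
// the car can be recalled from a frame taken before the car drove over it.
class UnderbodyView {
public:
    explicit UnderbodyView(double camera_lead_m) : lead_m_(camera_lead_m) {}

    void addFrame(Frame f) { history_.push_back(std::move(f)); }

    // Ground currently under the hood was 'lead_m_' ahead of the camera
    // when captured, i.e. it appears in the frame taken at odometer - lead.
    const Frame* viewUnderCar(double odometer_m) const {
        const double wanted = odometer_m - lead_m_;
        const Frame* best = nullptr;
        for (const auto& f : history_)
            if (f.odometer_m <= wanted) best = &f;
        return best;  // nullptr until enough history exists
    }

private:
    double lead_m_;
    std::deque<Frame> history_;
};

int main() {
    UnderbodyView view{3.0};  // camera sees ~3 m ahead of the front axle
    view.addFrame({100.0, "gravel, small rock left of track"});
    view.addFrame({101.5, "clear ground"});
    if (const Frame* f = view.viewUnderCar(103.0))  // car has moved 3 m on
        std::cout << "under car now: " << f->pixels << "\n";
}
```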
Lueders sees this technology as especially important for off-road driving, where a deep pothole or large stone can wreck an axle. To provide this kind of experience, he says: “It is required to have a seamless link between the ADAS and cockpit domains, plus the software competence to process the ADAS data in real-time.”
Behind the HMI
To enable Transparent Hood and other HMI innovations, Continental integrated all cockpit domains into one Cockpit High Performance Computer (Cockpit HPC) in partnership with Pioneer. “One part of the complexity with the classical separated devices like the infotainment head unit, cluster unit and others lies in the data sharing, networking and synchronizing among separated devices, which may be even sourced to different Tier 1s. When integrating several domains into one device like the Cockpit HPC, you create a common base for all the functionality and data,” Lueders says.
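Lueders’ point about data sharing can be pictured as a publish/subscribe bus inside a single computer: once every cockpit domain runs on the same device, a signal is published once and consumed everywhere, with no cross-device serialization or re-synchronization. A minimal, generic C++ sketch of the idea follows; it is not Continental’s actual architecture.

```cpp
#include <functional>
#include <iostream>
#include <map>
#include <string>
#include <vector>

// Minimal in-process publish/subscribe bus. With separate head units, the
// same data would have to be serialized and synchronized over a network;
// inside one cockpit computer every domain can consume it directly.
class CockpitBus {
public:
    using Handler = std::function<void(const std::string&)>;

    void subscribe(const std::string& topic, Handler h) {
        subs_[topic].push_back(std::move(h));
    }
    void publish(const std::string& topic, const std::string& value) {
        for (auto& h : subs_[topic]) h(value);
    }

private:
    std::map<std::string, std::vector<Handler>> subs_;
};

int main() {
    CockpitBus bus;
    // Cluster and infotainment consume the same signal, no resync needed.
    bus.subscribe("nav.next_turn", [](const std::string& v) {
        std::cout << "cluster shows: " << v << "\n";
    });
    bus.subscribe("nav.next_turn", [](const std::string& v) {
        std::cout << "head unit announces: " << v << "\n";
    });
    bus.publish("nav.next_turn", "left in 200 m");
}
```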
This allows a common design language across the different domains while making it easier for them to share visual, audible or haptic content. At the same time, according to Lueders, the HMI experiences remain safely separated from the vehicle’s core functions.
Swifter HMI design
Some HMI designers are taking their cue from game developers. Epic Games launched a human-machine interface initiative to make it easier to create automotive HMI, infotainment and digital cockpit experiences using its Unreal Engine development environment. General Motors said it would use Unreal Engine in the digital cockpit of its GMC Hummer EV. Unreal Engine lets UX designers not only create the look of the interface but also implement its functionality, so they can iterate before handing their designs off to the engineers.
Vectorform, a design consultancy, is one of Epic’s partners in the initiative. According to its executive experience director, Clemens Conrad, Unreal’s real-time engine lets designers build entire projects without writing code, skipping several steps in the traditional HMI development lifecycle.
Vectorform’s Dan Dobbs, executive director of engineering, says the platform also addresses the challenge of working with multiple operating systems within a single vehicle. With the instrumentation running one OS, the head unit using another, plus a proliferation of screens, he says: “Getting those to work together is a challenge.”
Switching to Unreal Engine, on the other hand, “lets you come up with 3D experiences and put them into different environments, then experience what it’s going to feel like ahead of time,” he adds. For example, designers can test the UX using a virtual-reality headset. “The rapid prototyping can get ideas out there faster, let you make decisions better and share them much easier than in the past.”
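Whatever Unreal Engine does internally, the portability argument rests on a familiar pattern: author the experience once against an abstract rendering target, then instantiate it per operating system or screen. The C++ sketch below illustrates that idea with hypothetical interfaces; none of this is Unreal’s API.

```cpp
#include <iostream>
#include <string>

// Hypothetical backend interface: one per OS/display in the vehicle.
struct Renderer {
    virtual ~Renderer() = default;
    virtual void draw(const std::string& widget) = 0;
};

struct ClusterRenderer : Renderer {   // e.g. an instrument-panel target
    void draw(const std::string& w) override {
        std::cout << "[cluster] " << w << "\n";
    }
};
struct HeadUnitRenderer : Renderer {  // e.g. a center-screen target
    void draw(const std::string& w) override {
        std::cout << "[head unit] " << w << "\n";
    }
};

// The experience is authored once against the abstract interface, so the
// same design can be previewed on a desktop, a headset, or real hardware.
void speedWarningScene(Renderer& r) {
    r.draw("speed: 87 km/h");
    r.draw("warning: school zone ahead");
}

int main() {
    ClusterRenderer cluster;
    HeadUnitRenderer headUnit;
    speedWarningScene(cluster);
    speedWarningScene(headUnit);
}
```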
Voice plus
Voice controls will continue to expand in the cockpit, our experts agree, while gestures could gain traction. NextInput has a gesture-control solution based on a technology it has not yet announced. Says Lueders: “Control via voice, like digital assistants, is already a clear trend and will grow in the coming years. Operating the car will be simplified through speech over time. Because not everyone likes to speak to a machine, there are other ways to interact, for example gestures or haptics. The interaction of the future is a multi-modal approach where functions can be reached in multiple ways.”
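A multi-modal HMI in this sense is essentially a mapping from several input channels onto one set of vehicle functions. The C++ sketch below shows that shape; the modality names and bindings are invented for the example.

```cpp
#include <functional>
#include <iostream>
#include <map>
#include <string>

// One logical command, many ways to trigger it: voice, gesture, or touch.
// A generic illustration of the multi-modal pattern Lueders describes.
class MultiModalMapper {
public:
    void bind(const std::string& modality, const std::string& input,
              std::function<void()> command) {
        bindings_[modality + ":" + input] = std::move(command);
    }
    void onInput(const std::string& modality, const std::string& input) {
        auto it = bindings_.find(modality + ":" + input);
        if (it != bindings_.end()) it->second();
    }

private:
    std::map<std::string, std::function<void()>> bindings_;
};

int main() {
    MultiModalMapper hmi;
    auto openWindow = [] { std::cout << "window opening\n"; };

    // The same function is reachable through every modality.
    hmi.bind("voice",   "open the window", openWindow);
    hmi.bind("gesture", "swipe_up",        openWindow);
    hmi.bind("touch",   "window_zone",     openWindow);

    hmi.onInput("gesture", "swipe_up");          // driver prefers gestures
    hmi.onInput("voice",   "open the window");   // passenger uses voice
}
```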
by Susan Kuchinskas
Source: https://www.tu-auto.com