Automotive night vision systems can help drivers – and cars – to see clearly when they need it most. Chris Pickering reports.
Night time is statistically the most dangerous period on the road. A study in the US found that an alarming 75 per cent of pedestrian fatalities occurred after dark, while collisions with large animals such as deer also go up substantially at night.
In an effort to reduce these risks, automotive manufacturers started experimenting with night vision systems back in the early 2000s. Since then the technology has grown in sophistication. And with the growing importance of ADAS functions and the move towards self-driving cars, it’s no longer just drivers that need to see in the dark.
Infrared sensing is the obvious solution; it’s vastly cheaper than LiDAR, more accurate than radar and better in poor visibility than a conventional camera. Historically, the only real debate has been between active sensors, which use a dedicated emitter to bounce short-wave infrared off the target, and passive systems, which detect long-wave ‘thermal infrared’ without the need for illumination.
“Active systems are generally cheaper, but the main advantage of using a passive sensor is that it relies completely on infrared emitted by other objects,” comments Chris Posch, director of engineering for automotive applications at FLIR Systems. “If every car on the road had an active night vision system they’d all be blinding each other.”
FLIR supplies sensors to Swedish firm Veoneer, which works with the likes of Audi, Volkswagen and Porsche. In recent years, these passive infrared systems appear to have cornered the market when it comes to driver-focused night vision and they are also popular with those working on self-driving cars.
The situation is a little different when you look at ADAS systems, which have to be affordable enough for even the smallest city car. Here, radar is generally used at present – sometimes fused with visible light to improve accuracy. Posch, however, predicts that falling costs and more demanding performance requirements will push companies towards infrared: “There’s been a lot of recent press on how poorly some automatic emergency braking (AEB) systems perform. The regulatory bodies are likely to start making their qualification tests a little harder, which will nudge people towards more effective systems.”
A recent test of four different AEB systems by the American Automobile Association found that not one of them recognised a pedestrian at night, while collisions were only avoided 40 per cent of the time during daylight. “Long wave thermal cameras can solve this problem and make a real difference to AEB systems,” notes Posch.
There are two main trends that he believes will make these systems more attractive to OEMs. The first is the shift towards VGA sensors, which offer four times the resolution of traditional QVGA units; the other is the use of a second sensor, such as radar or visible light, to work in parallel with the infrared camera. “That’s when the real value comes in – with a fusion of multiple sensors, much as you have with the human brain,” he concludes.
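To make the fusion idea concrete, here is a minimal, hypothetical sketch – not any supplier’s actual pipeline – of how detections from a thermal camera and a radar might be paired by simple bearing and range gating, with unmatched detections kept but flagged as single-sensor. The `Detection` class, field names and gate thresholds are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    bearing_deg: float   # angle to the target relative to vehicle heading
    range_m: float       # estimated distance to the target
    source: str          # "thermal" or "radar"

def fuse(thermal, radar, max_bearing_err=2.0, max_range_err=3.0):
    """Pair thermal and radar detections that plausibly refer to the same
    object; unmatched detections are kept but flagged as single-sensor,
    so downstream logic can treat them with less confidence."""
    fused, unmatched_radar = [], list(radar)
    for t in thermal:
        match = next((r for r in unmatched_radar
                      if abs(r.bearing_deg - t.bearing_deg) <= max_bearing_err
                      and abs(r.range_m - t.range_m) <= max_range_err), None)
        if match:
            unmatched_radar.remove(match)
            # Average the two range estimates; a real system would weight them,
            # since radar tends to be better on range and thermal on classification.
            fused.append(("confirmed", (t.range_m + match.range_m) / 2))
        else:
            fused.append(("thermal_only", t.range_m))
    fused.extend(("radar_only", r.range_m) for r in unmatched_radar)
    return fused
```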
On test: Bentley Continental GT
To put night vision to the test, we turned to the Bentley Continental GT. Admittedly, you can find similar systems on some far more affordable cars these days; if you’re going to spend an evening driving up and down darkened roads in search of wildlife hiding in the hedges, however, there are few finer machines in which to do it. More to the point, the Bentley’s LED matrix headlamps are some of the best in the business, so the night vision system is going to have its work cut out to spot any hidden hazards.
The Continental GT uses a passive system, with a thermal camera mounted on the front of the grille (complete with its own washer unit to clear dust, grime and ice). Inside the car, a control unit analyses the data from the sensor, which is then sent to the virtual instrument cluster. It’s displayed as a greyscale image, with brighter shades representing higher temperatures. Body heat typically appears as a vivid white, but the filtering works on the contrast between different areas of the scene, so it will still function at ambient temperatures of up to 80 deg C (and down as low as -40 deg C). What’s more, it takes account of the emissivity of each surface, so even two objects at identical temperatures should theoretically stand out from one another.
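As a rough illustration of that contrast-based rendering – a sketch under assumed parameters, not Bentley’s or FLIR’s actual processing – a raw thermal frame can be stretched to greyscale using percentiles within the frame, so that warmer objects appear brighter regardless of the absolute ambient temperature:

```python
import numpy as np

def thermal_to_greyscale(frame, low_pct=1.0, high_pct=99.0):
    """Map a raw thermal frame (values proportional to scene radiance) to an
    8-bit greyscale image. Scaling is based on the contrast within the frame
    rather than absolute temperature, so a warm pedestrian still stands out
    whether the ambient is -40 deg C or +40 deg C."""
    lo, hi = np.percentile(frame, [low_pct, high_pct])
    stretched = np.clip((frame - lo) / max(hi - lo, 1e-6), 0.0, 1.0)
    return (stretched * 255).astype(np.uint8)  # brighter = hotter

# Example: a synthetic 240x320 frame with one warm region
frame = np.full((240, 320), 290.0)      # ~17 deg C background
frame[100:140, 150:180] = 310.0         # ~37 deg C "body heat"
img = thermal_to_greyscale(frame)
```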
The clever bit is the recognition software. This highlights potential hazards, with the warning strategy adjusted to suit different environments, such as town centres and rural roads, using data from the navigation system and the vehicle’s cameras. A yellow box means that the car has spotted a moderate hazard, whereas a red box – accompanied by an audible warning – means that the car is reaching the critical stopping distance to avoid a collision.
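The warning logic can be illustrated with a back-of-the-envelope sketch: compute a nominal stopping distance from the vehicle speed (reaction distance plus braking distance) and compare it with the detected range to the hazard. The reaction time, deceleration and margin below are illustrative assumptions, not the production calibration:

```python
def stopping_distance_m(speed_mps, reaction_s=1.2, decel_mps2=7.0):
    """Rule-of-thumb stopping distance: distance covered during the driver's
    reaction time plus the braking distance at a given deceleration."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

def warning_level(hazard_range_m, speed_mps, margin=1.5):
    """Red (with audible alert) once the hazard is within the critical
    stopping distance; yellow while it is merely within a cautionary
    margin of it; otherwise no warning."""
    critical = stopping_distance_m(speed_mps)
    if hazard_range_m <= critical:
        return "red"      # critical: emergency stop may be needed
    if hazard_range_m <= critical * margin:
        return "yellow"   # moderate hazard: driver attention requested
    return "none"

# At 50 km/h (~13.9 m/s), a pedestrian detected at 40 m:
print(warning_level(40.0, 50 / 3.6))   # -> "yellow"
```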
“The detection range and classification varies largely on the size and shape of the hazard,” explains Martyn Brookes, advanced driver assistance engineer at Bentley. “As a rough guide, it can detect a 1.7 metre tall pedestrian at distances of between 8 metres and 90 metres. Small animals [up to 1 metre at the withers] can be detected at a distance of 7 metres to 70 metres, and larger animals [up to 2 metres] at a distance of between 13 metres and 140 metres.”
At present, it’s a ‘closed’ device, which relies solely on infrared sensing and has no direct control over the vehicle, although it will prime the master cylinder if it believes the driver may need to carry out an emergency stop. “We are developing future technologies for next generation systems that will be closely linked to other ADAS functionality,” notes Brookes.
On the road
With the system switched on, we head off in search of warm bodies. The first thing you notice is that the shades in the display are all fairly subdued, so it doesn’t divert too much attention in a dark cabin. The sensitivity of the camera and the sophistication of the filtering are such that it actually produces a fairly passable image of the surroundings – you can pick out bumps in the road and branches on the trees. Generally, though, the display fades into your peripheral vision and you almost forget that it’s there.
Sure enough, people stand out well from the background and are almost always correctly highlighted by the software. Having failed to spot Bambi en route, we decide to cheat and enlist the help of Ferne, a five-year-old flat-coated retriever, who doubles as a deer leaping into the road. The system doesn’t seem to be quite as reliable with animals – possibly because we’re a little too close – but it still registers our canine stunt double more often than not.
So far, I’m impressed rather than bowled over by the capabilities of the night vision system. On the drive back, however, we enter a dimly lit village on dipped headlights and a yellow box suddenly illuminates on the screen. It’s at that point that I notice a pedestrian lurking unseen at the side of the road. For the first time, it feels like the system has offered something beyond my own hazard perception. And that’s while driving with fresh eyes and deliberate care.
Of course, the human brain applies its own degree of filtering. Unlike the car, I knew we were going to take a turning long before we reached the rogue pedestrian. As such, I wouldn’t say it’s a conclusive victory for technology over human intuition, but there’s no question that the system works. Bundled in an options pack with a suite of other features – as it is on the Bentley – I can see the value. It’s relatively unobtrusive when in use and it’s easy to turn off. And there’s always the possibility that one day it might spot something that you don’t.
Source: https://www.theengineer.co.uk