Driver assistance systems: What sensors can do
Advanced Driver Assistance Systems (ADAS) provide vehicles with additional comfort and safety on the roads. Numerous driver assistance systems are now used in cars, often bundled into safety packages. This is made possible by increasingly intelligent recognition of the vehicle's surroundings, driven by ever better sensor technology: ultrasonic, radar, lidar and camera sensors have all improved markedly in performance. At the heart of the increasingly powerful control units sits highly complex software that optimises the processing algorithms so that the system responds quickly and triggers the right (re)action even in critical driving situations. In this way, critical situations can be defused and accidents avoided.
Radar and ultrasonic sensors
Radar systems (mostly operating at 77 GHz) enable accurate speed and distance measurements – even at high vehicle speeds – but have only limited angular resolution. They are used, for example, to avoid collisions. One of their strengths is that they are largely independent of the weather. Short-range radar detects objects up to around 30 m away, while mid-range and long-range radar systems cover distances of up to 250 m.
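As a rough illustration of the two measuring principles, here is a minimal Python sketch: range follows from the round-trip time of the echo, and relative speed from the Doppler shift of the 77 GHz carrier. The numbers are illustrative, not values from any specific sensor.

```python
C = 299_792_458.0       # speed of light in m/s
F_CARRIER = 77e9        # typical automotive radar carrier frequency in Hz

def range_from_round_trip(t_round_trip_s: float) -> float:
    """Distance to the target: the echo travels there and back."""
    return C * t_round_trip_s / 2.0

def speed_from_doppler(doppler_shift_hz: float) -> float:
    """Relative (radial) speed from the Doppler shift of the echo."""
    return doppler_shift_hz * C / (2.0 * F_CARRIER)

# An echo arriving after ~1 microsecond corresponds to a target ~150 m ahead.
print(f"range: {range_from_round_trip(1e-6):.1f} m")
# A Doppler shift of ~15.4 kHz at 77 GHz corresponds to ~30 m/s closing speed.
print(f"speed: {speed_from_doppler(15.4e3):.1f} m/s")
```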
Lidar sensors
The lidar sensor is an equally important sensor. The abbreviation stands for light detection and ranging – an optical measuring system used to detect objects. The position of an object is determined by measuring the time the emitted light takes to be reflected off the object and return to the receiver. In principle, it is therefore a laser scanner that can also create a three-dimensional image of the surroundings. Lidar systems do not work with microwaves but with light pulses from the non-visible part of the spectrum, i.e. near-infrared light. They usually operate at a wavelength of 905 nm and offer a range of 200 m in good weather conditions, high angular resolution and 360° coverage. However, dazzling light and poor visibility conditions, such as fog, rain or spray, reduce the range. Lidar is therefore mostly used as an auxiliary system.
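The same time-of-flight principle, combined with the beam angles of the scanner, yields the 3D position of every scan point. A minimal sketch, assuming idealised round-trip times and angles:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def lidar_range(time_of_flight_s: float) -> float:
    """Distance from the round-trip time of a light pulse."""
    return C * time_of_flight_s / 2.0

def polar_to_cartesian(r_m: float, azimuth_rad: float, elevation_rad: float):
    """One scan point (range + beam angles) as an (x, y, z) coordinate."""
    x = r_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = r_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = r_m * math.sin(elevation_rad)
    return x, y, z

# A pulse returning after ~667 ns corresponds to a target ~100 m away.
r = lidar_range(667e-9)
print(polar_to_cartesian(r, math.radians(10.0), math.radians(-1.0)))
```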
Camera sensors
Modern cameras can also recognise and even distinguish between obstacles in front of the vehicle. Both mono and stereo cameras are used. The latter can detect obstacles in 3D without additional sensor technology. With a stereo camera, however, the installation space limits 3D imaging: the smaller the distance between the two camera lenses, the smaller the effective three-dimensional measuring range. In practice, stereo cameras can see in 3D up to about 50 m in front of the vehicle; beyond that, the differences in perspective between the two images become too small to derive 3D information from them, and the camera behaves like a mono camera.
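This depth limit follows directly from the stereo geometry: depth equals focal length times baseline divided by disparity, so the disparity shrinks with distance until it disappears into the matching noise. A small sketch with assumed, purely hypothetical camera parameters:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from a stereo pair: Z = f * B / d (pinhole camera model)."""
    return focal_px * baseline_m / disparity_px

# Illustrative values: 1000 px focal length, 20 cm lens baseline.
F, B = 1000.0, 0.20
# At 10 m the disparity is a comfortable 20 px ...
print(F * B / 10.0)   # -> 20.0 px
# ... but at 50 m it has shrunk to 4 px, so a 1 px matching error
# already means more than 10 m of depth uncertainty.
print(F * B / 50.0)   # -> 4.0 px
```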
The range of a mono camera is around 250 m, regardless of the installation space. By combining the images from several cameras and sensors, a three-dimensional image can still be created. Cameras inside the vehicle can also detect whether the driver is tired or distracted, while cameras outside the vehicle (front and rear) record the car's immediate surroundings and point out obstacles.
Infrared sensors
Infrared cameras, by contrast, are used for night-vision assistants. They respond to the heat radiation emitted by objects. Converted to black-and-white images, this information is shown on the instrument cluster display: cooler surroundings appear dark, while people and animals appear conspicuously bright. Modern systems detect people and larger wild animals at distances of up to 300 m, and a warning signal sounds in dangerous situations. Depending on the headlamp system, it is also possible, for example, to warn pedestrians with short light pulses.
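In the simplest reading, such a system flags image regions that are clearly warmer than the background. A toy sketch, assuming a calibrated thermal image with per-pixel temperatures; real night-vision assistants run trained classifiers on top of this kind of raw signal:

```python
import numpy as np

def warm_object_mask(thermal_image: np.ndarray, threshold_c: float = 30.0) -> np.ndarray:
    """Flag pixels warmer than the threshold as potential people or animals.

    `thermal_image` is assumed to hold per-pixel temperatures in deg C,
    as delivered by a calibrated far-infrared camera (an assumption here).
    """
    return thermal_image > threshold_c

# Toy 3x4 "frame": cool road surface with one warm region (a pedestrian).
frame = np.array([
    [12.0, 12.5, 13.0, 12.0],
    [12.0, 34.0, 35.5, 12.5],
    [12.0, 33.5, 34.0, 12.0],
])
if warm_object_mask(frame).any():
    print("warning: warm object detected")  # would trigger the audible alert
```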
Merging sensor data
Sensor fusion links all relevant data from ultrasonic, radar, lidar, camera and other systems intelligently and in real time. Looking ahead, this is what makes automated driving possible in the first place. Redundancies – partially overlapping results in the recognition of the surroundings – are an express requirement: together with plausibility checks, i.e. checking within the system whether the data about the surroundings has been recorded correctly, they are the main safeguard against the data being misinterpreted. Depending on the driver assistance systems, the degree of automation and the vehicle class, we are therefore dealing with an individual mix of sensors and information, and with ever more data that must be processed in real time. It is already a technological masterpiece!
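One common textbook approach to this kind of fusion is inverse-variance weighting combined with a simple plausibility gate. The sketch below illustrates the idea only and is not the method of any particular ADAS supplier:

```python
def fuse_ranges(measurements: list[tuple[float, float]],
                max_spread_m: float = 5.0) -> float | None:
    """Inverse-variance fusion of range estimates from different sensors.

    `measurements` holds (range_m, variance) pairs, e.g. from radar and
    lidar. A crude plausibility check rejects the fusion when the sensors
    disagree by more than `max_spread_m`; the thresholds are illustrative.
    """
    ranges = [r for r, _ in measurements]
    if max(ranges) - min(ranges) > max_spread_m:
        return None  # implausible: defer to a higher-level check
    weights = [1.0 / var for _, var in measurements]
    return sum(w * r for (r, _), w in zip(measurements, weights)) / sum(weights)

# Radar (low variance, i.e. trusted more) and lidar agree: fuse them.
print(fuse_ranges([(101.2, 0.25), (100.4, 1.0)]))           # -> ~101.0 m
# A third estimate that is far off fails the plausibility check.
print(fuse_ranges([(101.2, 0.25), (100.4, 1.0), (140.0, 4.0)]))  # -> None
```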