Denso Corporation, one of the world’s largest automotive suppliers, today released a retrofittable driver status monitor to help reduce the number of traffic accidents involving commercial vehicles, such as trucks and buses.
The safety product checks for distraction, drowsiness, sleep, and inappropriate posture based on the driver’s facial image, captured by a camera installed in the cabin. If drowsy or distracted driving is detected, the monitor issues a voice alert. The product is now sold through Denso sales and service stations across Japan and will be released overseas later this year.
Large commercial vehicles can cause serious damage in an accident, yet many trucks and buses lack the latest safety devices because they have been in service for many years. Because the driver status monitor is retrofittable to vehicles already on the road, it can accelerate the introduction of safety devices to large commercial vehicles.
The product detects the driver’s condition and records it on an SD card. An operation manager and the driver can review the driving status, including the number of voice alerts and an image of the driver at the moment each alert was triggered, providing additional guidance for safe driving. The driver’s condition can also be reported to the operation manager in real time, making it possible to caution the driver and act quickly in an emergency. In addition, the monitor can be linked with a new telematics device for commercial vehicles released by Denso earlier this year and with a cloud-based digital tachograph manufactured by Fujitsu Limited.
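To illustrate the kind of record-and-review workflow described above, the sketch below shows a minimal, hypothetical alert log of the sort a monitor might write to an SD card: one row per alert, with a timestamp, the detected driver state, and a reference to the captured driver image. All field names, states, and file names here are illustrative assumptions, not Denso’s actual data format.

```python
# Hypothetical sketch of an alert log like the one described in the press
# release. Field names and states are assumptions for illustration only.
import csv
import io
from datetime import datetime, timezone

# Hypothetical driver states the monitor distinguishes
STATES = ("distraction", "drowsiness", "sleep", "inappropriate_posture")

def log_alert(writer, state, image_path):
    """Append one alert event: UTC timestamp, detected state, driver image."""
    if state not in STATES:
        raise ValueError(f"unknown state: {state}")
    writer.writerow([datetime.now(timezone.utc).isoformat(), state, image_path])

# An in-memory buffer stands in for the SD card in this sketch
buf = io.StringIO()
w = csv.writer(buf)
w.writerow(["timestamp_utc", "state", "image"])  # header row
log_alert(w, "drowsiness", "IMG_0001.jpg")
log_alert(w, "distraction", "IMG_0002.jpg")

rows = buf.getvalue().splitlines()
print(len(rows) - 1)  # number of alerts recorded (excluding the header)
```

An operation manager’s review tool could then count alerts per state or pull up the image referenced in each row, matching the review feature the release describes.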
Denso has been developing safety technologies and products for passenger cars and commercial vehicles to help create a society free from traffic accidents. In 2014, the company developed its first driver status monitor and currently offers products for heavy trucks and large sightseeing buses. In 2017, Denso collaborated with FotoNation, a wholly owned subsidiary of Xperi Corporation that creates facial image recognition and neural network technologies, to further improve the performance of driver status detection and accelerate the development of products that will impact the future of mobility.