BrainChip partners with emotion3D for ADAS
emotion3D offers state-of-the-art computer vision and machine learning software for image-based analysis of in-cabin environments.
This analysis enables a comprehensive understanding of humans and objects inside a vehicle.
The partnership will allow emotion3D to leverage BrainChip’s technology to achieve an ultra-low-power working environment with on-chip learning, processing everything locally on device within the vehicle to ensure data privacy.
“We are committed to setting the standard in driving safety and user experience through the development of camera-based, in-cabin understanding,” says Florian Seitner, CEO at emotion3D. “By combining our in-cabin analysis software with BrainChip’s on-chip compute, we are able to elevate that standard in a faster, safer and smarter way. This partnership will provide a cascading number of benefits that will continue to disrupt the mobility industry.”
The situations covered by this optimised driver monitoring functionality include warnings for driver distraction and drowsiness, device personalisation, gesture recognition, passenger detection, and more.
“Processing in-cabin data requires significant compute and associated power,” said Sean Hehir, BrainChip CEO. “By leveraging BrainChip’s Akida processor IP, emotion3D is able to improve intelligent safety and user experience functions by analysing the data in real time and forwarding inference data to the automobile’s central processor. Together, we improve the next generation of intelligent vehicles and give drivers a safer, enhanced user experience.”