AI Sensor Fusion Algorithm


Target Specifications:

1. Object fusion rate: over 95%
2. Average distance error between the fused coordinate point and the lidar coordinate point: within 0.6 m
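These two targets can be evaluated offline by matching fused objects against lidar ground truth. The sketch below is a minimal, illustrative evaluation loop; the function names and the nearest-neighbor matching radius are assumptions, not part of the product's actual test procedure:

```python
import math

def evaluate_fusion(fused_points, lidar_points, match_radius=2.0):
    """Match each lidar object to its nearest fused detection and compute
    the fusion rate and the mean distance error of the matched pairs.
    fused_points / lidar_points: lists of (x, y) coordinates in metres."""
    matched_errors = []
    for lx, ly in lidar_points:
        # distance to the nearest fused detection for this lidar object
        best = min((math.hypot(fx - lx, fy - ly) for fx, fy in fused_points),
                   default=math.inf)
        if best <= match_radius:          # counted as successfully fused
            matched_errors.append(best)
    fusion_rate = len(matched_errors) / len(lidar_points)
    mean_error = (sum(matched_errors) / len(matched_errors)
                  if matched_errors else float("nan"))
    return fusion_rate, mean_error

# Example: 4 lidar objects, 4 fused detections, one clearly missed
lidar = [(10.0, 0.0), (20.0, -1.0), (35.0, 2.0), (50.0, 0.5)]
fused = [(10.3, 0.1), (20.2, -0.8), (34.8, 2.4), (80.0, 9.0)]
rate, err = evaluate_fusion(fused, lidar)
print(f"fusion rate {rate:.0%}, mean error {err:.2f} m")
# -> fusion rate 75%, mean error 0.35 m
```

Against the targets above, a run would pass when the fusion rate exceeds 0.95 and the mean error stays within 0.6 m.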


AI Sensor Fusion HIL (Hardware-in-the-Loop) testing platform

This real-vehicle verification platform can verify:

(1) AI sensor fusion performance for ACC/AEB

(2) AI lane-marking recognition for LKA/LFS


Camera-based recognition system – front ADAS for LSS (Lane Support System)


Applied to the front ADAS of autonomous driving vehicles at Level 2 and above with LSS (Lane Support System) applications, e.g. FCW, LDW, PDW, HMW, SCW, FMW, LSS, and AEB support.


Features of our new camera-based edge AI recognition application platform:

  1. Uses high-availability microchips with minimal delay to process large amounts of data in real time, performing multiple video-recognition tasks simultaneously under limited hardware computing resources.
  2. Meets the automotive requirements for embedded safety systems: power saving, low power consumption, fast transmission, low latency, reliability, and privacy.
  3. Compatible with the input/output interface formats commonly used in vehicle camera-based safety systems (TVI/AHD/CVBS/SerDes/Ethernet), so recognition can be performed with a wide range of cameras.
  4. Covers the common in-vehicle transmission interfaces GPIO/CANBUS/RS232/RS485 and outputs object type and event status in real time; it can interact with other systems over in-vehicle communication and can also connect to a cloud big-data collection system through the Internet.
  5. Supports Wi-Fi or 4G communication and OTA (over-the-air) software updates.
  6. Integration with our existing mass-production systems (such as FCWS/LDWS/PDW/DMS/BSIS) has been completed, seamlessly serving our existing customers.
  7. Has passed automotive-grade reliability test specifications (ISO 7637, ISO 16750, ISO 11452-4, EMC electromagnetic compatibility, and electrostatic protection).
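Point 4's real-time output of object type and event status over CANBUS could be sketched as a frame-packing routine like the one below. The frame layout, type codes, and event flags here are illustrative assumptions only, not the product's actual protocol:

```python
import struct

# Hypothetical codes for a CAN data field (not the product's real protocol)
OBJECT_TYPES = {"pedestrian": 0x01, "vehicle": 0x02, "cyclist": 0x03}
EVENT_FLAGS = {"FCW": 0x01, "LDW": 0x02, "AEB": 0x04}

def pack_detection(obj_type, events, distance_m, ttc_s):
    """Pack one detection into a hypothetical 8-byte big-endian CAN
    data field: type (1B), event flags (1B), distance in cm (2B),
    time-to-collision in ms (2B), 2 padding bytes."""
    flags = 0
    for e in events:
        flags |= EVENT_FLAGS[e]
    return struct.pack(">BBHHxx",
                       OBJECT_TYPES[obj_type],
                       flags,
                       int(distance_m * 100),   # centimetres
                       int(ttc_s * 1000))       # milliseconds

payload = pack_detection("pedestrian", ["FCW"], 12.5, 1.8)
print(payload.hex())  # -> 010104e207080000
```

A receiving ECU would unpack the same layout with `struct.unpack(">BBHHxx", payload)` and react to the event flags; in a real deployment the payload would be sent as the data field of a CAN frame with an agreed arbitration ID.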