Case ID: M23-200P

Published: 2024-04-10 08:21:41

Last Updated: 2024-04-10 08:21:41


Inventor(s)

Edward Andert
Aviral Shrivastava

Technology categories

Applied Technologies
Manufacturing/Construction/Mechanical
Physical Science
Wireless & Networking

Licensing Contacts

Physical Sciences Team

Accurate Cooperative Sensor Fusion by Parameterized Covariance Generation for Sensing and Localization Pipelines in CAVs

Background

Cooperative sensing is used to mitigate sensor coverage and obstruction issues in autonomous vehicles. It occurs when multiple connected autonomous vehicles (CAVs) combine data to build a more accurate picture of the world around each individual CAV. Additional connected infrastructure sensors (CISs) placed throughout a region (e.g., traffic cameras) can contribute further data to the cooperative sensor fusion and strengthen its robustness.

One major challenge in cooperative sensing is how to weight the measurements taken from the various sources so that the fused result is accurate; ideally, each weight should be inversely proportional to the error in that sensor's information. However, previous cooperative sensor fusion approaches for autonomous vehicles use a fixed error model, in which the covariance of a sensor and its recognizer pipeline is taken to be the mean of the measured covariance across all sensing scenarios.
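To illustrate why the weighting matters, the minimal sketch below fuses position observations of the same object from several sensors by weighting each with the inverse of its covariance. All function names, sensor values, and covariances here are illustrative assumptions, not part of the disclosed method.

```python
import numpy as np

def fuse_observations(observations):
    """Fuse 2D position observations by inverse-covariance weighting.

    observations: list of (position, covariance) pairs, where position is a
    length-2 array and covariance is the 2x2 error covariance assigned to
    that measurement. Returns the fused position and its covariance.
    """
    info = np.zeros((2, 2))   # accumulated information (sum of inverse covariances)
    info_vec = np.zeros(2)    # accumulated information-weighted positions
    for pos, cov in observations:
        cov_inv = np.linalg.inv(cov)
        info += cov_inv
        info_vec += cov_inv @ np.asarray(pos)
    fused_cov = np.linalg.inv(info)
    fused_pos = fused_cov @ info_vec
    return fused_pos, fused_cov

# Hypothetical example: a close-range CAV detection (low error) and a
# distant traffic-camera detection (high error) of the same object.
obs = [
    (np.array([10.2, 4.9]), np.diag([0.05, 0.05])),
    (np.array([10.9, 5.6]), np.diag([0.80, 0.80])),
]
position, covariance = fuse_observations(obs)
print(position)  # dominated by the lower-error observation
```

With a fixed error model, both observations above would receive the same covariance and therefore the same weight, even though the distant detection is far less reliable; per-observation covariance estimation is what allows the fusion to down-weight it.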

Invention Description

Researchers at Arizona State University have developed a novel method for cooperative sensor fusion across multiple robots, connected autonomous vehicles (CAVs), or other mounted and moving sensors. The method estimates error using key predictor terms that correlate strongly with sensing and localization accuracy, yielding a precise covariance estimate for each sensor observation. It also uses a low-overhead scheme for transmitting detections along with the relevant accuracy parameters, so that sensor fusion can be performed with a significant precision improvement over current methods.
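A minimal sketch of this idea is shown below, assuming a hypothetical error model in which detection error grows linearly with distance to the object; the actual predictor terms, fitted functional form, and message format are described in the related publication, and none of the names or parameter values here come from the disclosure.

```python
import numpy as np

def parameterized_covariance(distance_m, base_std, growth_per_m):
    """Generate a per-observation covariance from a predictor term.

    Illustrative parametric model only: the standard deviation of the
    position error is assumed to grow linearly with distance to the
    detected object.
    """
    std = base_std + growth_per_m * distance_m
    return np.diag([std ** 2, std ** 2])

# Each CAV or CIS transmits only the recognized detection plus its accuracy
# parameters, not raw sensor data, which keeps the message small.
detection_message = {
    "object_id": 7,
    "position": [10.2, 4.9],                # recognized object position
    "bounding_box": [9.8, 4.5, 10.6, 5.3],  # recognized bounding box
    "covariance": parameterized_covariance(
        distance_m=18.0, base_std=0.1, growth_per_m=0.02
    ).tolist(),
}
```

A receiving vehicle can then combine such messages from many sources using inverse-covariance weighting, as sketched in the Background section, with each observation weighted according to its own estimated error.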

Potential Applications

  • Autonomous vehicles in high-risk & crowded driving areas (e.g., traffic lights, blind corners, shipping yards)
  • Multi-robot warehouse operations

Benefits and Advantages

  • Small data transmission sizes – only recognized detections (positions and bounding boxes) are transmitted, not raw sensor data
  • Higher precision – more accurate estimation of the expected error of each sensor detection
  • Better local fusion – can be used to tune the precision of sensor recognition pipelines and improve fusion with the other sensors onboard the CAV

Related Publication: Accurate Cooperative Sensor Fusion by Parameterized Covariance Generation for Sensing and Localization Pipelines in CAVs.