Daily Arxiv

This page curates AI-related papers published worldwide.
All content is summarized using Google Gemini, and the site is operated on a non-profit basis.
Copyright for each paper belongs to its authors and their institutions; please credit the source when sharing.

Augmenting End-to-End Steering Angle Prediction with CAN Bus Data

Created by
  • Haebom

Author

Amit Singh

Outline

This paper proposes a novel method to improve the end-to-end steering angle prediction accuracy of autonomous vehicles. Whereas conventional approaches mainly rely on sensor fusion with expensive LiDAR and radar sensors, this work proposes fusing camera images with cost-effective CAN bus data. CAN bus data contains signals such as vehicle speed, steering angle, and acceleration, and fusing them with image data can improve the accuracy of computer vision models. Experimental results show that models using CAN bus data reduce RMSE by 20% compared to baseline models, with some models achieving a reduction of up to 80%.
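This summary does not specify the exact network layout, but the fusion idea can be sketched as a small PyTorch model: encode the camera frame with a CNN, embed the CAN bus signals (speed, steering angle, acceleration) with an MLP, concatenate the two feature vectors, and regress the steering angle. All module names, layer sizes, and input shapes below are illustrative assumptions, not the authors' architecture.

```python
# Hypothetical sketch of image + CAN bus fusion for steering angle regression.
# Layer sizes and input shapes are assumptions, not the paper's architecture.
import torch
import torch.nn as nn

class CanBusFusionModel(nn.Module):
    def __init__(self, can_dim: int = 3, can_embed_dim: int = 32):
        super().__init__()
        # Small CNN backbone that turns a front-camera frame into a feature vector.
        self.image_encoder = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
        )
        # MLP that embeds the CAN bus signals (e.g., speed, steering angle, acceleration).
        self.can_encoder = nn.Sequential(
            nn.Linear(can_dim, can_embed_dim), nn.ReLU(),
        )
        # Regression head over the concatenated image + CAN features.
        self.head = nn.Sequential(
            nn.Linear(48 + can_embed_dim, 64), nn.ReLU(),
            nn.Linear(64, 1),  # predicted steering angle
        )

    def forward(self, image: torch.Tensor, can: torch.Tensor) -> torch.Tensor:
        img_feat = self.image_encoder(image)           # (B, 48)
        can_feat = self.can_encoder(can)               # (B, can_embed_dim)
        fused = torch.cat([img_feat, can_feat], dim=1) # simple late fusion by concatenation
        return self.head(fused).squeeze(1)             # (B,)

# Example usage with dummy tensors.
model = CanBusFusionModel()
frames = torch.randn(8, 3, 66, 200)   # batch of camera frames
can_signals = torch.randn(8, 3)       # speed, steering angle, acceleration
pred_angles = model(frames, can_signals)
print(pred_angles.shape)              # torch.Size([8])
```

Concatenating the two feature vectors is a simple late-fusion choice; the same skeleton would accommodate other fusion points or richer CAN signal sets.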

Takeaways, Limitations

Takeaways:
We present a novel method to improve steering prediction accuracy in autonomous vehicles by leveraging inexpensive CAN bus data.
This could make autonomous driving technology more economically accessible by reducing the need for expensive LiDAR and radar sensors.
We experimentally demonstrate that CAN bus data fusion can significantly reduce the prediction error (RMSE) of computer vision models; see the metric sketch after this list.
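As a concrete illustration of the reported metric, the sketch below computes RMSE for a vision-only baseline and a CAN-fused model. The prediction arrays are placeholder values, included only to show how the percentage reduction would be measured.

```python
# Minimal RMSE comparison sketch; all numbers below are placeholders.
import numpy as np

def rmse(pred: np.ndarray, target: np.ndarray) -> float:
    # Root mean squared error between predicted and ground-truth steering angles.
    return float(np.sqrt(np.mean((pred - target) ** 2)))

y_true = np.array([0.10, -0.05, 0.20, 0.00])       # ground-truth angles (rad)
y_baseline = np.array([0.18, -0.12, 0.30, 0.06])   # vision-only predictions
y_fused = np.array([0.13, -0.07, 0.24, 0.02])      # vision + CAN predictions

baseline_rmse = rmse(y_baseline, y_true)
fused_rmse = rmse(y_fused, y_true)
print(f"baseline RMSE: {baseline_rmse:.4f}")
print(f"fused RMSE:    {fused_rmse:.4f}")
print(f"reduction:     {100 * (1 - fused_rmse / baseline_rmse):.1f}%")
```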
Limitations:
The reliability and accuracy of the CAN bus data need to be reviewed; errors or omissions in the data can affect model performance.
Additional evaluation of generalization across different vehicle types and driving environments is needed; the current experimental results may be limited to specific conditions.
Further research is needed to determine whether additional gains can be achieved by fusing sensor data other than the CAN bus.