Journal of Chongqing University of Technology (Natural Science) ›› 2023, Vol. 37 ›› Issue (10): 81-88.

• Vehicle Engineering •

A target detection algorithm based on camera and LiDAR data fusion

SHEN Caiying, ZHU Siyao, HUANG Xingchi

  1. School of Automobile and Traffic Engineering, Liaoning University of Technology, Jinzhou 121001, Liaoning, China
  • Online: 2023-11-20  Published: 2023-11-20
  • About the authors: SHEN Caiying, female, Ph.D., associate professor, whose research focuses on key technologies of intelligent driving vehicles, Email: 894406103@qq.com; corresponding author ZHU Siyao, male, master's candidate, whose research focuses on environmental perception for intelligent driving vehicles, Email: 1539151048@qq.com



Abstract: Environmental perception is a key research area for self-driving cars, and traffic participants (such as cars, pedestrians, and cyclists) are its primary detection targets. To address the low accuracy of point-cloud-only algorithms in recognizing small targets (such as pedestrians and cyclists), which is caused by point-cloud sparsity, a multi-sensor-fusion target detection algorithm, PointPainting+, is proposed that combines the strengths of LiDAR and camera images in target recognition. Using the PointPainting algorithm as the base framework, its semantic segmentation stage is improved by adding strip pooling, which gives the algorithm better recognition of long, bar-shaped objects. Experiments show that, relative to the PointPillars baseline, the improved algorithm raises average precision by 9.14% for cyclist detection and by 9.71% for pedestrian detection. The detection speed reaches 43 fps, meeting real-time requirements and effectively mitigating the poor detection of distant small targets, such as pedestrians and cyclists, caused by point-cloud sparsity.
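The PointPainting framework the paper builds on decorates each LiDAR point with the semantic-segmentation scores of the image pixel it projects to. As an illustrative sketch only (the function name, array layouts, and the convention that class 0 is background are assumptions for this example, not details from the paper):

```python
import numpy as np

def paint_points(points, seg_scores, proj_matrix, img_shape):
    """Append per-pixel semantic scores to each LiDAR point (PointPainting-style sketch).

    points      : (N, 3) LiDAR points, assumed already in the camera frame
    seg_scores  : (H, W, C) softmax scores from a semantic segmentation network
    proj_matrix : (3, 4) camera projection matrix
    img_shape   : (H, W) of the image
    """
    H, W = img_shape
    N = points.shape[0]
    C = seg_scores.shape[2]

    # Homogeneous coordinates, projected into the image plane.
    pts_h = np.hstack([points, np.ones((N, 1))])   # (N, 4)
    uvw = pts_h @ proj_matrix.T                    # (N, 3)
    u = (uvw[:, 0] / uvw[:, 2]).round().astype(int)
    v = (uvw[:, 1] / uvw[:, 2]).round().astype(int)

    # Keep only points in front of the camera that land inside the image.
    valid = (uvw[:, 2] > 0) & (u >= 0) & (u < W) & (v >= 0) & (v < H)

    # Decorated output: xyz plus C class scores per point.
    # Assumption for this sketch: class 0 is background, so points that
    # project outside the image get background probability 1.
    painted = np.zeros((N, 3 + C))
    painted[:, :3] = points
    painted[:, 3] = 1.0
    painted[valid, 3:] = seg_scores[v[valid], u[valid]]
    return painted
```

The painted points can then be fed to any point-cloud detector (here, PointPillars) in place of the raw xyz input.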

Keywords: self-driving cars, environmental perception, multimodal fusion, machine vision, deep learning

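The strip-pooling idea the paper adds to the segmentation stage can be illustrated in a minimal form: average-pool each row strip (H, 1) and each column strip (1, W), broadcast both back to (H, W), and use the sigmoid of their sum to re-weight the input. This is a single-channel sketch without the learned 1D convolutions a real strip-pooling module contains, so it shows the mechanism, not the paper's implementation:

```python
import numpy as np

def strip_pooling(x):
    """Single-channel strip-pooling attention sketch (no learned convs).

    x : (H, W) feature map. Row and column strips are average-pooled,
    broadcast back to (H, W), summed, squashed with a sigmoid, and used
    to gate the input. Unlike square pooling windows, strips let context
    propagate along long, banded regions such as pedestrians and cyclists.
    """
    row_pool = x.mean(axis=1, keepdims=True)   # (H, 1): one value per row strip
    col_pool = x.mean(axis=0, keepdims=True)   # (1, W): one value per column strip
    combined = row_pool + col_pool             # broadcasts to (H, W)
    gate = 1.0 / (1.0 + np.exp(-combined))     # sigmoid gate in (0, 1)
    return x * gate
```

In the full module (Hou et al.'s strip pooling), each pooled strip passes through a 1D convolution before the broadcast, and the operation runs per channel on the segmentation backbone's feature maps.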

CLC number: 

  • U463.6