Articles | Volume 13
https://doi.org/10.5194/ars-13-209-2015
03 Nov 2015

Multi-view point cloud fusion for LiDAR based cooperative environment detection

B. Jaehn, P. Lindner, and G. Wanielik

Related authors

Extending the vehicular network simulator Artery in order to generate synthetic data for collective perception
Christoph Allig and Gerd Wanielik
Adv. Radio Sci., 17, 189–196, https://doi.org/10.5194/ars-17-189-2019, 2019
Short summary
TOF-LIDAR signal processing using the CFAR detector
Takashi Ogawa and Gerd Wanielik
Adv. Radio Sci., 14, 161–167, https://doi.org/10.5194/ars-14-161-2016, 2016
Local Dynamic Map as a modular software framework for driver assistance systems
P. Reisdorf, A. Auerswald, and G. Wanielik
Adv. Radio Sci., 13, 203–207, https://doi.org/10.5194/ars-13-203-2015, 2015
Object tracking for full vehicle environment surveillance using radar
M. Schuster, T. Pech, J. Reuter, and G. Wanielik
Adv. Radio Sci., 12, 155–159, https://doi.org/10.5194/ars-12-155-2014, 2014

Cited articles

Besl, P. and McKay, N. D.: A method for registration of 3-D shapes, IEEE Transactions on Pattern Analysis and Machine Intelligence, 14, 239–256, https://doi.org/10.1109/34.121791, 1992.
Chen, Y. and Medioni, G.: Object modeling by registration of multiple range images, in: Proceedings of the 1991 IEEE International Conference on Robotics and Automation, Vol. 3, 2724–2729, https://doi.org/10.1109/ROBOT.1991.132043, 1991.
Gabriel, M.: LiDAR signature computation of spatially extended targets in the vehicle environment (LiDAR-Signaturberechnung von räumlich ausgedehnten Zielen im Fahrzeugumfeld), Bachelor thesis, Chemnitz University of Technology, 2010.
Ibeo Automotive Systems GmbH: Operating Manual ibeo LUX® Laser scanner, ibeo Automobile Sensor GmbH, Merkurring 20, 22143 Hamburg, 2.5 Edn., 2008.
Jähn, B.: Fusion of multi-view point cloud data for cooperative object detection, Master thesis, Chemnitz University of Technology, 2014.
Short summary
In the future autonomous robots will share their environment information captured by range sensors like LiDAR or ToF cameras. In this paper it is shown that a two dimensional position and heading information, e.g. obtained by GPS tracking methods, is enough to initialize a 3D registration method using the range images from different perspectives of different platforms (e.g. car & infrastructure). Thus they will be able to explore their surrounding in a cooperative manner.