
Perception platform for autonomous vehicles

Basic information

Project ID: AEE-2017-16

Students: Ville Kukkonen, Amin Modabberian, Pietu-Pekka Roisko, Pyry Viita-aho, Ilmari Vikström

Project manager: Ville Kukkonen

Instructor: Arto Visala

Other advisors: Andrei Sandru

Starting date: 5.1.2017

Completion date: 22.5.2017


The objective of the project was to further develop the perception capabilities of an all-terrain vehicle fitted with a LiDAR, an omnidirectional camera, and optical encoders on the wheels and the steering wheel. The basic groundwork for communicating with most of the sensors via ROS had been done in previous projects; our task was to complete the missing integration and calibration, and to lay the foundations for autonomous capabilities such as path planning and obstacle avoidance.

Summary of results

The omnidirectional camera was integrated into ROS, and the spherical image it produced was transformed into panoramic form. A kinematic model was implemented using the wheel and steering encoders, and combined with results from the inertial navigation system (INS) to provide fused localization information. A SLAM solution was implemented and calibrated using the LiDAR and the fused localization.
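The summary does not spell out the kinematic model, but dead reckoning from wheel and steering encoders is commonly done with a planar bicycle model; the sketch below is a minimal illustration of that approach, not the project's actual implementation. The wheelbase value and the update order are assumptions.

```python
import math

# Hypothetical wheelbase in metres; the real value would come from
# measuring or calibrating the vehicle.
WHEELBASE = 2.0

def update_pose(x, y, theta, ds, steering_angle):
    """Advance a planar bicycle-model pose by one encoder update.

    ds             -- distance travelled since the last update (m),
                      derived from the wheel encoder ticks
    steering_angle -- front-wheel angle (rad), from the steering encoder
    """
    # Heading change for a bicycle model: d_theta = ds * tan(delta) / L
    theta += (ds / WHEELBASE) * math.tan(steering_angle)
    # Translate along the updated heading
    x += ds * math.cos(theta)
    y += ds * math.sin(theta)
    return x, y, theta
```

In a fused setup such as the one described above, a pose estimate like this would typically be published as odometry and combined with the INS output in a filter (e.g. an extended Kalman filter) before being fed to SLAM.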



Final report

  • Clustering.jpeg (JPEG), modified Jun 30, 2017 by Ville Kukkonen
  • Final_report_16_2017.pdf (PDF), modified Jun 30, 2017 by Ville Kukkonen
  • Poster_AEE_2017_16.pdf (PDF), modified May 14, 2017 by Timo Oksanen