Analyzing self-driving cars using Autoware and Carla Simulator
Basic Information
Project ID: AEE-2020-16
Students: Rasmus Kähärä, Paavo Lehtelä, Robin Nyman, Pontus Prinsén, Sopitta Thurachen
Project manager: Pontus Prinsén
Instructor: Eshagh Kargar
Other advisors:
Starting date: 16.1.2020
Completion date: 29.5.2020
Overview
Self-driving cars are a new field with vast potential, as they promise to increase traffic safety and reduce emissions; to this day, no complete solution exists on the market. Autoware is a state-of-the-art open-source platform for developing self-driving cars, and it is used by companies in the field. The objective of this project was to become familiar with Autoware and with the Carla simulator, which is built specifically for autonomous driving research and which we used to visualise our work. Furthermore, we were to create a machine-learning-based model for motion planning and compare its performance to Autoware's algorithm. Motion planning in Autoware is currently implemented with a state-based approach, and our goal was to see how this compares against an approach based on machine learning.
Results
We created an agent using the gym-carla wrapper, which we used to run simulations in Carla. The agent was trained with reinforcement learning, using a deep Q-network as the function approximator. The model was able to handle a range of driving scenarios, such as various types of curves, roundabouts and scenarios with obstacles. These scenarios were trained using a bird's-eye view of the environment and tested using both the bird's-eye view and the full simulator. Reinforcement learning for motion planning shows potential because the model can adapt to situations it has not been explicitly programmed for, without changes to the software, which could give it an advantage over a state-based approach. However, it would require further training and development before it could be deployed on real hardware. The video below shows how our model behaves in a roundabout.
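The snippet below is a minimal sketch of this kind of training setup, not our actual code: the environment id 'carla-v0', the configuration keys, the 'birdeye' observation key and the network architecture are assumptions based on the public gym-carla wrapper and standard deep Q-learning practice.

```python
# Sketch of a DQN agent trained on a bird's-eye-view observation from a
# Gym-style Carla environment. Environment id, config keys and the 'birdeye'
# observation key are assumptions based on the gym-carla project.
import random
from collections import deque

import gym
import gym_carla  # registers 'carla-v0' (assumed)
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F


class QNetwork(nn.Module):
    """Small convolutional Q-network over the bird's-eye-view image."""

    def __init__(self, n_actions):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=4, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((6, 6)),  # fixed-size features for any input
        )
        self.head = nn.Linear(32 * 6 * 6, n_actions)

    def forward(self, x):
        return self.head(self.conv(x).flatten(start_dim=1))


def to_tensor(obs):
    """Convert the HxWx3 bird's-eye image to a normalized NCHW float tensor."""
    img = np.asarray(obs["birdeye"], dtype=np.float32) / 255.0
    return torch.from_numpy(img).permute(2, 0, 1).unsqueeze(0)


# Abbreviated gym-carla configuration; the full parameter dict (ports, sensor
# ranges, traffic density, ...) is assumed to follow the gym-carla README.
params = {"discrete": True, "town": "Town03", "task_mode": "roundabout", "port": 2000}
env = gym.make("carla-v0", params=params)
n_actions = env.action_space.n

policy_net = QNetwork(n_actions)
target_net = QNetwork(n_actions)
target_net.load_state_dict(policy_net.state_dict())
optimizer = torch.optim.Adam(policy_net.parameters(), lr=1e-4)
replay = deque(maxlen=50_000)
gamma, epsilon, batch_size = 0.99, 0.1, 32

for episode in range(500):
    obs = env.reset()
    done = False
    while not done:
        # Epsilon-greedy action selection from the current Q-estimates.
        if random.random() < epsilon:
            action = env.action_space.sample()
        else:
            with torch.no_grad():
                action = int(policy_net(to_tensor(obs)).argmax())

        next_obs, reward, done, _ = env.step(action)
        replay.append((to_tensor(obs), action, reward, to_tensor(next_obs), done))
        obs = next_obs

        if len(replay) >= batch_size:
            # Sample a minibatch and take one gradient step on the TD error.
            batch = random.sample(replay, batch_size)
            states = torch.cat([b[0] for b in batch])
            actions = torch.tensor([b[1] for b in batch])
            rewards = torch.tensor([b[2] for b in batch], dtype=torch.float32)
            next_states = torch.cat([b[3] for b in batch])
            dones = torch.tensor([b[4] for b in batch], dtype=torch.float32)

            q = policy_net(states).gather(1, actions.view(-1, 1)).squeeze(1)
            with torch.no_grad():
                q_next = target_net(next_states).max(1).values
            target = rewards + gamma * (1.0 - dones) * q_next
            loss = F.smooth_l1_loss(q, target)

            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

    # Periodically sync the target network with the online network.
    if episode % 10 == 0:
        target_net.load_state_dict(policy_net.state_dict())
```

The replay buffer and the separate target network in the sketch are the standard ingredients that keep deep Q-learning stable; the reward function, action discretisation and network size would all need tuning for the driving scenarios described above.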