Thesis topics

So you’re thinking of making the culmination of your studies about self-driving?

Good, because we need all the help we can get. See all the available thesis topics and don’t hesitate to contact us if your idea for a self-driving topic is not listed!

Snow plow driver assistance

Bachelor or Masters
Tambet Matiisen
Edgar Sepp
Link

Curbs and road edges can be hard for snow plow operators to detect. Mistakes can be costly, both in terms of repairing the road and the machinery. We have proposed an assistance system for snow plow operators that notifies the operator when the plow is too close to the road edge. The system is based on a centimeter-level accurate RTK GPS system and snow plow area maps from Tartu Geohub. The goal of this project is to validate this idea on a real snow plow machine.
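
To make the idea concrete, below is a minimal sketch of the distance check such an assistance system might perform, using the shapely package. The road edge, the local coordinate conversion, and the 0.5 m warning threshold are all placeholder assumptions for illustration, not values from the actual system, which would use RTK GPS positions and Tartu Geohub maps.

```python
import math
from shapely.geometry import Point, LineString

# Hypothetical illustration: the real system would read plow area maps from
# Tartu Geohub and positions from the RTK GPS receiver; here both are hard-coded.
WARNING_DISTANCE_M = 0.5  # assumed threshold, not a value from the project

def to_local_xy(lat, lon, lat0, lon0):
    """Approximate conversion of WGS84 degrees to local metric coordinates."""
    x = (lon - lon0) * 111_320 * math.cos(math.radians(lat0))
    y = (lat - lat0) * 111_320
    return x, y

lat0, lon0 = 58.3780, 26.7290                      # local origin somewhere in Tartu
road_edge = LineString([(0.0, 0.0), (50.0, 0.0)])  # road edge in local metres

def check_plow_position(lat, lon):
    """Warn if the plow position is closer to the road edge than the threshold."""
    plow = Point(to_local_xy(lat, lon, lat0, lon0))
    distance = plow.distance(road_edge)
    if distance < WARNING_DISTANCE_M:
        print(f"WARNING: plow is {distance:.2f} m from the road edge")
    return distance

check_plow_position(58.37801, 26.72905)
```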

Disengagement probability estimation

Testing of autonomous vehicles on public streets is performed with safety drivers. The task of a safety driver is to monitor the car's behavior and take over control if the car behaves erratically. This is called a disengagement, and all disengagements are meticulously logged and analyzed.

The task in this project is to estimate the probability of disengagement on a specific section of a road. From the disengagement probabilities of individual road sections, a total disengagement probability for a given route can be calculated. If the total probability of disengagement along a route is below a certain threshold, a driverless vehicle is dispatched; if not, an autonomous vehicle with a safety driver is dispatched.

In theory, calculating the disengagement probability for a road section is simple: just no_of_disengagements_in_this_section / no_of_times_the_vehicle_passed_this_section. In practice, however, the vehicle may have passed a given section only a handful of times, so this estimate is very noisy. The key idea is to group or cluster similar road sections to increase the denominator. A minimal sketch of this idea follows below.
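
As a rough sketch of how the pooling could work, the snippet below clusters road sections by some hand-picked features, pools disengagement and pass counts within each cluster, and then combines per-section probabilities into a route-level probability (assuming independence between sections). All features, counts, and the number of clusters are placeholders.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Placeholder data: per-section features (e.g. speed limit, curvature, intersection
# density), disengagement counts, and pass counts. Real values come from drive logs.
n_sections = 100
features = rng.random((n_sections, 3))
disengagements = rng.poisson(0.2, size=n_sections)
passes = rng.integers(1, 10, size=n_sections)

# Cluster similar road sections so that counts can be pooled within each cluster.
n_clusters = 5
labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(features)
p_section = np.empty(n_sections)
for c in range(n_clusters):
    mask = labels == c
    p_section[mask] = disengagements[mask].sum() / passes[mask].sum()

# Probability of at least one disengagement on a route made of sections 0..9,
# assuming sections are independent.
route = np.arange(10)
p_route = 1.0 - np.prod(1.0 - p_section[route])
print(f"Estimated route disengagement probability: {p_route:.3f}")
```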

Digital twin of an Estonian town

Bachelor or Masters
Tambet Matiisen
Allan Mitt
Link

A digital twin is a virtual copy of a town that can be used for various purposes. The Autonomous Driving Lab has created a digital twin of Tartu that is used for testing the autonomous car before letting it drive on public streets. The task in this project is to create a similar digital twin of an Estonian town of your choice. The methodology will be based on Allan Mitt's master's thesis.

Teleoperation situational awareness testing

Bachelor or Masters
Tambet Matiisen
Karl Kruusamäe
Link

Remote control, a.k.a. teleoperation, is used to assist self-driving cars in situations that they cannot handle yet, e.g. roadworks or accidents. Simple direct teleoperation means that the remote driver sees the video feed from the car's cameras and controls the car using a remote steering wheel and pedals, similar to a racing game. We have set up such a teleoperation system for our lab car together with the Estonian company Clevon. We want to evaluate the situational awareness of the remote driver and their ability to perform common maneuvers.

Evaluation of RTK base station accuracy

Autonomous vehicles need to know their position in the world precisely. While a normal phone GPS gives 2-5 meter accuracy, the RTK positioning system in our lab car can achieve 2-10 centimeter accuracy. RTK relies on correction information from nearby base stations. Currently we use the ESTPOS base station network provided by the Estonian Land Board (Maa-amet). We would like to evaluate how complicated it is to set up your own base station network and whether its accuracy is comparable to the ESTPOS network.
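
One way to make "comparable accuracy" measurable is to log fixes with the antenna on a surveyed point, once with corrections from your own base station and once with ESTPOS corrections, and compare horizontal error statistics. Below is a minimal sketch with made-up coordinates; the flat-earth conversion is an approximation that is adequate at centimeter scales.

```python
import math
import numpy as np

def horizontal_errors_m(fixes, ref):
    """Horizontal distance in metres from each (lat, lon) fix to the reference point."""
    lat0, lon0 = ref
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(lat0))
    fixes = np.asarray(fixes)
    dx = (fixes[:, 1] - lon0) * m_per_deg_lon
    dy = (fixes[:, 0] - lat0) * m_per_deg_lat
    return np.hypot(dx, dy)

# Placeholder fixes logged while the antenna sits on a surveyed reference point.
reference = (58.378000, 26.729000)
fixes = [(58.3780001, 26.7290002), (58.3780002, 26.7289999), (58.3779999, 26.7290001)]

errors = horizontal_errors_m(fixes, reference)
print(f"RMS error: {np.sqrt(np.mean(errors**2)) * 100:.1f} cm, "
      f"95th percentile: {np.percentile(errors, 95) * 100:.1f} cm")
```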

Prototyping vision-based localization using milestone-board information

Context: Autonomous vehicles, like other robots, need to localize themselves in order to navigate. While satellite navigation systems (GNSS) such as GPS can provide such vehicles with localization information, the GNSS information might not always be available. One robust technique for vehicles to localize is using particle filters (e.g. [1], [2], [3]), given a map of the environment.

The project: During the project, you will prototype the use of milestone-board information for map-based localization. The investigation includes the following steps:
1. You will begin by creating a scenario (e.g. as a Python or Matlab program) that contains some highways, roads, cities, and virtual milestone boards.
2. You will then implement a particle filter (one of the most intuitive techniques for autonomous localization) for localization within the map, i.e. inside the created scenario. This also includes exploring multiple possible routes that lead to the same city. A minimal particle-filter sketch follows this list.
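
To give a flavour of step 2, here is a minimal 1-D particle filter sketch: the map is a single road, the milestone boards report the rounded distance to one city, and the filter estimates the vehicle position from those readings. All numbers are placeholders; the real scenario would of course contain multiple roads and cities.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D scenario: a single 100 km road with milestone boards
# showing the remaining distance to the city at the end of the road.
ROAD_LENGTH = 100.0
CITY_POS = 100.0

def observe(position):
    """Reading of the nearest milestone board: distance to the city, rounded to 1 km."""
    return round(CITY_POS - position)

# Particle filter over the unknown vehicle position.
N = 1000
particles = rng.uniform(0.0, ROAD_LENGTH, size=N)
weights = np.full(N, 1.0 / N)

true_pos = 20.0
for step in range(30):
    # Motion update: the vehicle drives ~1 km per step, with some noise on the particles.
    true_pos += 1.0
    particles += 1.0 + rng.normal(0.0, 0.1, size=N)

    # Measurement update: weight particles by how well they explain the board reading.
    z = observe(true_pos)
    expected = CITY_POS - particles
    weights = np.exp(-0.5 * ((z - expected) / 1.0) ** 2) + 1e-300
    weights /= weights.sum()

    # Resampling (simple multinomial resampling for brevity).
    idx = rng.choice(N, size=N, p=weights)
    particles = particles[idx]
    weights = np.full(N, 1.0 / N)

print(f"true position: {true_pos:.1f} km, estimate: {particles.mean():.1f} km")
```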

Air-flow sensing for perception in autonomous driving

Bachelor or Masters
Naveed Muhammad
Link
Context: The usual sensing modalities available to autonomous vehicles, i.e. vision, radar, and lidar, have their limitations. For example, none of them is well suited for estimating the length of a leading vehicle. Additional sensing modalities, such as air-flow sensing, might have the potential to meaningfully complement the usual modalities in autonomous driving.

Description: You will build upon the innovative work on flow-sensing applications in autonomous driving by Roman Matvejev [1] and Matis Ottan [2]. It is possible to investigate one of the following avenues: (i) working in simulation and proposing new applications of flow sensing in autonomous driving, (ii) investigating new feature extraction and classification/regression methods, or (iii) expanding out of simulation with physical validation of flow sensing in autonomous driving.
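
As a loose illustration of avenue (ii), the sketch below fits a standard regressor to synthetic flow-sensor features to predict the length of a leading vehicle. The features, data, and model choice are placeholders for illustration only and are not taken from [1] or [2].

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Placeholder data: each row holds simulated flow-pressure readings from a handful
# of virtual probes on the ego vehicle; the target is the leading vehicle's length.
# Real features would come from CFD or CARLA-based flow simulations.
n_samples, n_probes = 500, 8
X = rng.normal(size=(n_samples, n_probes))
vehicle_length = 3.5 + 0.8 * X[:, 0] - 0.5 * X[:, 3] + rng.normal(scale=0.2, size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(X, vehicle_length, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("MAE [m]:", mean_absolute_error(y_test, model.predict(X_test)))
```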

Vision-based localization on city scale using Open Street Map

Bachelor or Masters
Naveed Muhammad
Link
Autonomous vehicles, like other robots, need to localize themselves in order to navigate. While satellite navigation systems (GNSS) such as GPS can provide such vehicles with localization information, the GNSS information might not always be available. One robust technique for vehicles to localize is using particle filters (e.g. [1], [2], [3]), given a map of the environment.

During the project, you will investigate the use of vision sensing for map-based localization. The investigation includes the following steps:
1. Recognition of street names (and possibly also road direction signs) using vision
2. Matching of the perceived street-name information with streets in a given map, such as the Open Street Map (a small matching sketch follows this list)
3. Implementing a particle filter for localization within the map using the above-mentioned matches
4. Incorporation of the developed solution into the Autoware Mini software stack — https://adl.cs.ut.ee/lab/software
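
As a sketch of step 2, the snippet below uses the osmnx package (one possible choice, not required by the project) to fetch street names of Tartu from the Open Street Map and fuzzy-match a detected street name against them. The detection string is hypothetical and would come from step 1 in the actual pipeline.

```python
import difflib
import osmnx as ox

# Download the drivable street network of Tartu from the Open Street Map.
G = ox.graph_from_place("Tartu, Estonia", network_type="drive")
edges = ox.graph_to_gdfs(G, nodes=False)

# Collect the set of street names present in the map.
street_names = set()
for name in edges["name"].dropna():
    if isinstance(name, list):          # an edge can carry several names
        street_names.update(name)
    else:
        street_names.add(name)

# A hypothetical noisy street-name detection coming from the vision pipeline (step 1).
detected = "Riia tn"
matches = difflib.get_close_matches(detected, sorted(street_names), n=3, cutoff=0.5)
print("Best map matches:", matches)
```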

Map-based localization for autonomous vehicles using lidar

Bachelor or Masters
Naveed Muhammad
Link
Context: Autonomous vehicles, like other robots, need to localize themselves in order to navigate. While satellite navigation systems (GNSS) such as GPS can provide such vehicles with localization information, the GNSS information might not always be available. One robust technique for vehicles to localize is using particle filters [1, 2, 3], given a map of the environment. Recently, deep-learning approaches to map-based localization have also been proposed in the literature [4, 5].

Project: During the project, you will investigate lidar sensors for map-based localization. You can start by experimenting with lidar data acquired using the ADL Lexus vehicle and then move to real-time implementation on-board the vehicle (within our Autoware Mini software stack - https://adl.cs.ut.ee/lab/software). The investigation includes aspects such as:

(i) Creating a map using data from the Estonian Land Board and implementing map-based localization using particle filtering. The validation can be done in two ways:
- In the CARLA simulation of Tartu city.
- Using real lidar data from the streets of Tartu (we already have this data at the lab).

(ii) Using the above implementation, investigate aspects such as: how rich does the lidar sensor data need to be in order to perform localization? The vehicle is equipped with multi-beam lidars from Velodyne and Ouster, so the natural question is whether we need all of the laser beams. Do we need 16, 8, or is even a single beam enough? A minimal beam-subsetting sketch follows this list.

(iii) Incorporation of your proposed solution into Autoware Mini.
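
As a small illustration of the beam-count question in (ii), the sketch below keeps only a subset of lidar rings from a point cloud. It assumes the cloud is a structured NumPy array with a per-point 'ring' field, similar to what Velodyne and Ouster ROS drivers publish; the field layout and the chosen rings are placeholder assumptions.

```python
import numpy as np

def select_rings(points, rings_to_keep):
    """Keep only the points that belong to the given lidar rings (beams).

    `points` is assumed to be a structured array with at least the fields
    'x', 'y', 'z' and 'ring', as typically published by Velodyne/Ouster drivers.
    """
    mask = np.isin(points["ring"], rings_to_keep)
    return points[mask]

# Placeholder cloud: a 32-beam lidar with random points.
dtype = np.dtype([("x", np.float32), ("y", np.float32), ("z", np.float32), ("ring", np.uint16)])
cloud = np.zeros(10_000, dtype=dtype)
cloud["x"], cloud["y"], cloud["z"] = np.random.randn(3, 10_000)
cloud["ring"] = np.random.randint(0, 32, size=10_000)

# Simulate 16-, 8-, and 1-beam sensors by keeping every 2nd ring, every 4th ring, or a single ring.
for rings in ([*range(0, 32, 2)], [*range(0, 32, 4)], [16]):
    subset = select_rings(cloud, rings)
    print(f"{len(rings)} rings -> {len(subset)} points")
```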