Sensor Fusion (TSRT14)


Sensor fusion is about how to extract information from the available sensors. The basis for this is estimation and filtering theory from statistics. The examples and applications studied focus on localization, either of the sensor platform itself (navigation) or of other mobile objects (target tracking).

Engineers with knowledge of sensor fusion are in high demand on the job market. Each year we host numerous Master's thesis projects with students working at companies. The content of this course is inspired by the more than 150 Master's theses that we have examined over the last few years.

By attending this course you will gain a solid foundation, covering everything from the mathematical fundamentals to applications. You will learn both how to approach problems from a theoretical point of view and how to solve practical problems in MATLAB. Laboratory exercises and computer exercises are central to the course, and you get to use a computer during the exam.

Course Content

Applications

Laboratory Work

Localization in Acoustic Sensor Networks

A ground vehicle moves autonomously and is observed by a network of microphones. Your task is to detect the sound of the vehicle in the background noise, localize it, and finally predict its motion.
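To give a flavor of the localization and prediction parts of this lab, here is a minimal MATLAB sketch, not the lab's required solution: it localizes a source from time-difference-of-arrival (TDOA) measurements in a small microphone network and then predicts its motion with a constant-velocity model. The microphone positions, noise levels, and sampling interval are made-up assumptions, and the detection step is omitted.

```matlab
% Sketch: TDOA localization in a microphone network plus one-step
% motion prediction with a constant-velocity (CV) model.
% All numbers below are illustrative assumptions, not lab parameters.

c = 343;                                   % speed of sound [m/s]
mics = [0 0; 10 0; 10 10; 0 10]';          % 2-D microphone positions [m]
pTrue = [3; 4];                            % true source position (for simulation only)

% Simulated TDOA measurements relative to microphone 1
r = sqrt(sum((mics - pTrue).^2, 1));       % ranges to each microphone
tdoa = (r(2:end) - r(1)) / c + 1e-4*randn(1, 3);

% Localize by nonlinear least squares on the TDOA residuals
resid = @(p) (sqrt(sum((mics(:,2:end) - p).^2, 1)) ...
            - sqrt(sum((mics(:,1) - p).^2, 1))) / c - tdoa;
pHat = fminsearch(@(p) sum(resid(p).^2), [5; 5]);

% Predict the next position with a CV model (zero initial velocity assumed)
T = 0.5;                                   % sampling interval [s]
F = [eye(2) T*eye(2); zeros(2) eye(2)];    % CV state transition matrix
x = [pHat; 0; 0];                          % state: position and velocity
xPred = F * x;                             % predicted state at the next time step
fprintf('Estimated position: (%.2f, %.2f)\n', pHat(1), pHat(2));
```

In the lab, the single-shot least-squares fit above would typically be replaced by a filter that processes the TDOA measurements recursively, which is exactly the kind of design choice the course covers.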

Orientation Estimation Using Smartphone Sensors

A modern smartphone contains a large number of sensors that can be accessed by the user. Among other things, they are used to estimate the orientation of the phone. Your task is to design and implement a filter that estimates the phone's orientation and handles naturally occurring disturbances.
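As a rough illustration of the idea (not the filter you will design in the lab), the MATLAB sketch below fuses gyroscope and accelerometer data with a simple complementary filter to estimate roll and pitch. The sensor logs and sampling rate are made-up placeholders; in the lab you would stream real data from the phone.

```matlab
% Sketch: roll/pitch estimation by blending integrated gyro rates with an
% accelerometer attitude reference (complementary filter).
% gyr, acc, fs, and alpha below are illustrative assumptions.

fs = 100; T = 1/fs; N = 500;
gyr = 0.01*randn(N, 3);                            % angular rates [rad/s] (placeholder)
acc = repmat([0 0 9.81], N, 1) + 0.1*randn(N, 3);  % accelerometer [m/s^2] (placeholder)

alpha = 0.98;                              % trust in gyro integration vs accelerometer
roll = zeros(N, 1); pitch = zeros(N, 1);

for k = 2:N
    % Accelerometer-only attitude (valid when external acceleration is small)
    rollAcc  = atan2(acc(k,2), acc(k,3));
    pitchAcc = atan2(-acc(k,1), sqrt(acc(k,2)^2 + acc(k,3)^2));

    % Blend integrated gyro with the accelerometer reference: the
    % accelerometer corrects gyro drift, while the gyro suppresses the
    % effect of short accelerations (the "naturally occurring disturbances").
    roll(k)  = alpha*(roll(k-1)  + T*gyr(k,1)) + (1 - alpha)*rollAcc;
    pitch(k) = alpha*(pitch(k-1) + T*gyr(k,2)) + (1 - alpha)*pitchAcc;
end

plot((0:N-1)*T, [roll pitch]); legend('roll', 'pitch'); xlabel('time [s]');
```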