Facilitating Sensor Fusion for Autonomous Vehicles

Autonomous vehicle applications rely on a range of sensors for positioning and navigation. Fusing data from these sensors is a highly complex operation that requires sophisticated software stacks. Considerable research and development has already taken place in this realm, so modular solutions are readily available.

On October 27th, NovAtel is sponsoring a webinar focusing on exciting new developments in sensor fusion for autonomous vehicles. Register Here.

Fusion with Installed Sensors

Sensors already installed on most modern vehicles can be exploited via the CAN bus for positioning. These sensors include low-resolution odometry (DMI) and consumer-grade IMUs currently used for dynamic stability control and wheel slip detection. A novel approach for combining vehicle speed, steering angles, transmission settings and multiple odometry inputs demonstrates achievable results while operating in a GNSS-denied environment. A 90% performance improvement compared to a standalone GNSS/INS solution shows promise for future production models.
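The core of such CAN-bus dead reckoning can be sketched with a kinematic bicycle model that integrates vehicle speed and steering angle into a pose estimate. This is a minimal illustration, not NovAtel's actual algorithm: the function name and the wheelbase value are assumptions, and a production system would fuse these inputs with an IMU in a filter rather than integrate them open-loop.

```python
import math

def dead_reckon(x, y, heading, speed, steer_angle, wheelbase, dt):
    """Propagate the vehicle pose one time step with a kinematic
    bicycle model (illustrative sketch).

    speed:       vehicle speed read from the CAN bus (m/s)
    steer_angle: front-wheel steering angle (rad)
    wheelbase:   axle-to-axle distance (m); vehicle-specific
    dt:          time step (s)
    """
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    # Yaw rate of a bicycle model: v * tan(delta) / L
    heading += (speed / wheelbase) * math.tan(steer_angle) * dt
    return x, y, heading

# Example: drive straight for 10 s at 10 m/s from the origin.
pose = (0.0, 0.0, 0.0)
for _ in range(100):
    pose = dead_reckon(*pose, speed=10.0, steer_angle=0.0,
                       wheelbase=2.7, dt=0.1)
```

Integrated alone, such a model drifts with wheel slip and sensor bias, which is why the approach described above blends multiple odometry inputs and transmission state rather than trusting any single channel.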

Radar-based Parking

An integrated radar-based localization system that supports Level 4 autonomous driving performs well in automated parking inside covered parking garages. The system integrates automotive radars and dead reckoning technologies supported by high-definition (HD) maps to offer decimeter-level positioning accuracy.
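One way such a system can anchor a drifting dead-reckoned pose to an HD map is by associating radar landmark detections with mapped landmarks and averaging the residuals into a position correction. The sketch below is a hypothetical illustration of that idea, not the described system's actual method; the function name, gating threshold and flat averaging are all assumptions.

```python
def map_match_correction(predicted_pos, detections, map_landmarks, gate=1.0):
    """Correct a dead-reckoned 2-D position using radar detections.

    detections:    radar landmark positions already transformed into map
                   coordinates using the (drifted) dead-reckoned pose
    map_landmarks: known landmark positions from the HD map
    gate:          max association distance (m), to reject clutter
    """
    dx_sum = dy_sum = 0.0
    n = 0
    for det in detections:
        # Nearest-neighbour association against the HD map.
        nearest = min(map_landmarks,
                      key=lambda lm: (lm[0] - det[0]) ** 2 + (lm[1] - det[1]) ** 2)
        rx, ry = nearest[0] - det[0], nearest[1] - det[1]
        if rx * rx + ry * ry <= gate * gate:  # gate out unmatched returns
            dx_sum += rx
            dy_sum += ry
            n += 1
    if n == 0:
        return predicted_pos  # no usable matches; keep the prediction
    return (predicted_pos[0] + dx_sum / n, predicted_pos[1] + dy_sum / n)

# Example: detections shifted by drift of (+0.3, -0.2) relative to the map.
corrected = map_match_correction(
    predicted_pos=(5.3, -0.2),
    detections=[(0.3, -0.2), (10.3, -0.2)],
    map_landmarks=[(0.0, 0.0), (10.0, 0.0)],
)
```

A real decimeter-level system would weight each match by range accuracy and fold the correction into a filter alongside the dead-reckoning states, but the map-matching step follows this basic pattern.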

What’s Out There

Several sources of applicable data and versatile software are available for use in autonomous vehicle applications, including full datasets, full software stacks, convolutional neural networks (CNNs) to be integrated into those stacks, data on which to train a new CNN, simulators, simulation environments, recorded data, annotated data, and more. For example, Autoware is a well-supported open-source project, with many large companies involved in and collaborating on efforts to improve and further develop it.

Expert Panel

Our three expert panelists provide diverse, complementary perspectives on this dynamic, rewarding environment.

Ryan Dixon is the Sensor Fusion and Autonomy Lead in NovAtel’s Applied Research group. In this role he is responsible for exploring sensor fusion methods and relating them to autonomy applications. Prior to this he was Chief Engineer of the SPAN GNSS/INS products group at NovAtel, responsible for the dedicated team maintaining and enhancing NovAtel’s inertial product portfolio.


Aboelmagd Noureldin received bachelor’s and master’s degrees in engineering physics from Cairo University and a Ph.D. in electrical and computer engineering from the University of Calgary. He is currently a Cross-Appointment Professor with the Departments of Electrical and Computer Engineering at the Royal Military College of Canada and Queen’s University, Kingston, Ontario, Canada.


David Van Geyn is the Open Autonomy Engineering Manager in the Products and Services group at Hexagon | AutonomouStuff, based in Ottawa, Ontario, Canada. He manages a team of software engineers that work on multiple open-source autonomous vehicle stacks, customer deployments/projects using those stacks, and sensor drivers, as well as other products such as the AutonomouStuff Shuttle. David has a B.Cmp.H. and M.Sc. from Queen’s University at Kingston and over 10 years of experience in the industry, having worked on automotive software and research for autonomous vehicles.

The post Facilitating Sensor Fusion for Autonomous Vehicles appeared first on Inside Unmanned Systems.