
Robust Multiple-Sensing-Modality Data Fusion for Reliable Perception

Speaker: Marcos Paul Gerardo Castro
Date: Thursday, 24 July 2014, 11:00
Location: Aula Magna DIS

The ability to build high-fidelity 3D representations of the environment from sensor data is critical for autonomous robots. Multi-sensor data fusion allows for more complete and accurate representations. Furthermore, using distinct sensing modalities (i.e., sensors relying on different physical processes and/or operating at different electromagnetic frequencies) usually leads to more reliable perception, especially in challenging environments, as modalities may complement each other. However, modalities may also react differently to certain materials or environmental conditions, which can lead to catastrophic fusion.

In this presentation, we propose a new method to reliably fuse data from multiple sensing modalities, including in situations where they detect different targets. First, we compute a distinct continuous surface representation for each sensing modality, with associated uncertainty, using Gaussian Process Implicit Surfaces (GPIS). Second, we perform a local consistency test between these representations to separate consistent data (i.e., data corresponding to the detection of the same target by the sensors) from inconsistent data. The consistent data can then be fused together, using a further GPIS, and the rest of the data can be combined as appropriate.
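To make the pipeline concrete, the sketch below illustrates the general idea in Python on a 1D profile, with standard Gaussian process regression (scikit-learn) standing in for full GPIS. The kernel, the gating threshold k, and the synthetic data are illustrative assumptions, not details from the talk; the consistency test shown (agreement of the two surface estimates within their combined uncertainty) is one plausible instantiation of the local test described above.

# Minimal sketch, assuming a 1D surface profile and two noisy modalities;
# GP regression stands in for GPIS, and all parameters are illustrative.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic measurements from two modalities A and B.
# Modality B detects a different target on part of the domain.
x_a = rng.uniform(0, 10, 40)[:, None]
y_a = np.sin(x_a).ravel() + 0.05 * rng.standard_normal(40)
x_b = rng.uniform(0, 10, 40)[:, None]
y_b = np.sin(x_b).ravel() + 0.05 * rng.standard_normal(40)
y_b[(x_b.ravel() > 4) & (x_b.ravel() < 6)] += 1.5  # inconsistent region

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gp_a = GaussianProcessRegressor(kernel=kernel).fit(x_a, y_a)
gp_b = GaussianProcessRegressor(kernel=kernel).fit(x_b, y_b)

# Local consistency test: at each of B's sample points, check whether
# the two surface estimates agree within their combined uncertainty.
mu_a, sd_a = gp_a.predict(x_b, return_std=True)
mu_b, sd_b = gp_b.predict(x_b, return_std=True)
k = 2.0  # gating threshold in standard deviations (assumed)
consistent = np.abs(mu_a - mu_b) <= k * np.sqrt(sd_a**2 + sd_b**2)

# Fuse only the consistent data with a further GP; set the rest aside
# to be combined as appropriate.
x_fused = np.vstack([x_a, x_b[consistent]])
y_fused = np.concatenate([y_a, y_b[consistent]])
gp_fused = GaussianProcessRegressor(kernel=kernel).fit(x_fused, y_fused)
print(f"kept {consistent.sum()} of {len(consistent)} modality-B points")

In this toy setup, the gate discards most of modality B's points in the misdetection region, so the fused surface is not pulled toward the wrong target; that is the failure mode the consistency test is designed to avoid.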

We will show that in challenging environmental conditions, where differences of perception between distinct sensing modalities are common, our proposed method, with its integrated consistency test, avoids catastrophic fusion and thereby substantially improves object representations, both in accuracy and in certainty.
