Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Sensor-fusion Location Tracking System using Hybrid Multimodal Deep Neural Network

Version 1 : Received: 16 September 2021 / Approved: 17 September 2021 / Online: 17 September 2021 (09:43:06 CEST)
Version 2 : Received: 13 October 2021 / Approved: 13 October 2021 / Online: 13 October 2021 (12:14:39 CEST)

A peer-reviewed article of this Preprint also exists.

Wei, X.; Wei, Z.; Radu, V. Sensor-Fusion for Smartphone Location Tracking Using Hybrid Multimodal Deep Neural Networks. Sensors 2021, 21, 7488.

Abstract

Many engineered approaches have been proposed over the years to solve the hard problem of indoor localization, yet specialising these solutions for edge cases remains challenging. Here we propose to build the solution with zero hand-engineered features, learning everything directly from data. We use a modality-specific neural architecture to extract preliminary features from each sensing modality, which are then integrated by cross-modality neural network structures. We show that each modality-specific branch is capable of estimating the location with good accuracy on its own, but that a cross-modality network fusing these early modality-specific representations achieves better accuracy. Our multimodal neural network, MM-Loc, is effective because it allows gradients to flow uniformly across modalities during training. Because it is a data-driven approach, complex feature representations are learned from data rather than relying heavily on hand-engineered features.
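
To make the described architecture concrete, the sketch below shows one way such a hybrid multimodal network could be wired up in PyTorch: a Wi-Fi fingerprinting branch (an MLP over an RSSI vector) and an inertial branch (a recurrent network over an accelerometer/gyroscope sequence) are fused by a cross-modality head that regresses a 2D location. The class name MMLocSketch, the layer sizes, the number of access points and the choice of exactly these two modalities are illustrative assumptions, not the authors' exact MM-Loc design.

```python
import torch
import torch.nn as nn

class MMLocSketch(nn.Module):
    """Hedged sketch of a hybrid multimodal fusion network in the spirit of
    MM-Loc: one branch per modality, fused by a cross-modality head.
    Branch depths, layer sizes and the two-modality choice (Wi-Fi RSSI
    fingerprints + an inertial time series) are illustrative assumptions."""

    def __init__(self, n_access_points=520, imu_channels=6, hidden=128):
        super().__init__()
        # Wi-Fi fingerprinting branch: a small MLP over the RSSI vector.
        self.wifi_branch = nn.Sequential(
            nn.Linear(n_access_points, 256), nn.ReLU(),
            nn.Linear(256, hidden), nn.ReLU(),
        )
        # Inertial branch: an LSTM over the accelerometer/gyroscope sequence.
        self.imu_branch = nn.LSTM(imu_channels, hidden, batch_first=True)
        # Cross-modality fusion head: concatenated branch features -> (x, y).
        self.fusion_head = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),
        )

    def forward(self, wifi_rssi, imu_seq):
        wifi_feat = self.wifi_branch(wifi_rssi)           # (B, hidden)
        _, (h_n, _) = self.imu_branch(imu_seq)            # h_n: (1, B, hidden)
        imu_feat = h_n[-1]                                # (B, hidden)
        fused = torch.cat([wifi_feat, imu_feat], dim=-1)  # (B, 2*hidden)
        return self.fusion_head(fused)                    # predicted (x, y)

# Dummy batch: 4 samples, 520-AP RSSI vectors, 100-step 6-channel IMU windows.
model = MMLocSketch()
loc = model(torch.randn(4, 520), torch.randn(4, 100, 6))
loss = nn.functional.mse_loss(loc, torch.zeros(4, 2))
loss.backward()
```

Because both branches feed a single loss, one backward pass propagates gradients through every modality, which is the training property the abstract highlights.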

Keywords

Indoor Localization; Sensor Fusion; Multimodal Deep Neural Network; Multimodal Sensing; Wi-Fi Fingerprinting; Recurrent Neural Network

Subject

Computer Science and Mathematics, Computer Science

