Velocity-Based Channel Charting With Spatial Distribution Map Matching


Bibliographic Details
Main Authors: Maximilian Stahlke, George Yammine, Tobias Feigl, Bjoern M. Eskofier, Christopher Mutschler
Format: Article
Language: English
Published: IEEE 2024-01-01
Series: IEEE Journal of Indoor and Seamless Positioning and Navigation
Online Access: https://ieeexplore.ieee.org/document/10591331/
Description
Summary: Radio fingerprinting (FP) technologies improve localization performance in challenging non-line-of-sight environments. However, FP is expensive: its life-cycle management requires recording reference signals for initial training and again whenever the environment changes. Novel channel charting technologies are significantly cheaper because they implicitly assign relative coordinates to radio signals and therefore require only a few reference coordinates for localization. Still, channel charting requires data acquisition and reference signals, and its localization is slightly less accurate than FP. In this article, we propose a novel channel charting framework that does not require references and dramatically reduces life-cycle management effort. Using velocity information, e.g., from pedestrian dead reckoning or odometry, we model relative charts, and using topological map information, e.g., building floor plans, we transform them into real-world coordinates. In a large-scale study, we acquired two realistic datasets using 5G and single-input multiple-output distributed radio systems with noisy velocities and coarse map information. Our experiments show that we achieve the localization accuracy of FP, but without reference information.
ISSN: 2832-7322