Wi-Fi and radar fusion for head movement sensing through walls leveraging deep learning

Hameed, H., Tahir, A., Usman, M., Zhu, J., Lubna, Abbas, H., Ramzan, N., Cui, T. J., Imran, M. A. and Abbasi, Q. (2023) Wi-Fi and radar fusion for head movement sensing through walls leveraging deep learning. IEEE Sensors Journal (doi: 10.1109/JSEN.2023.3337515) (Early Online Publication)

310006.pdf - Accepted Version
Available under License Creative Commons Attribution.

29MB

Abstract

The detection of head movement plays a crucial role in human-computer interaction systems. These systems rely on control signals to operate a range of assistive and augmented technologies, including wheelchairs for quadriplegics, virtual/augmented reality, and assistive driving. Driver drowsiness detection and alert systems aided by head movement detection can prevent major accidents and save lives. Wearable devices such as MagTrack, which consists of magnetic tags and magnetic eyeglass clips, are intrusive, while vision-based systems suffer from ambient lighting, line-of-sight, and privacy issues. Contactless sensing has therefore become an essential part of next-generation sensing and detection technologies. Wi-Fi and radar provide contactless sensing; however, in assistive driving they must sit inside enclosures or dashboards, which for all practical purposes are treated in this paper as through-wall conditions. In this study, we propose a contactless system to detect human head movement with and without walls, using ultra-wideband (UWB) radar and Wi-Fi signals and leveraging machine and deep learning techniques. Our study analyzes four common head gestures: right, left, up, and down movements. Time-frequency multi-resolution analysis based on wavelet scalograms is used to extract features from channel state information (CSI) values, along with spectrograms from radar signals, for head movement detection. Feature-level fusion of the radar and Wi-Fi signals is performed with state-of-the-art deep learning models. Overall classification accuracies of 83.33% and 91.8% are achieved by fusing VGG16 and InceptionV3 model features trained on radar and Wi-Fi time-frequency maps with and without walls, respectively.
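To make the pipeline concrete, below is a minimal sketch (not the authors' code) of the two time-frequency representations named in the abstract: a wavelet scalogram computed from a Wi-Fi CSI amplitude stream and an STFT spectrogram computed from a radar return. The sampling rates, the Morlet wavelet, the scale range, and the STFT window sizes are illustrative assumptions, and the random arrays stand in for real captures.

```python
# Sketch of the two time-frequency maps described in the abstract.
# All parameters below are assumptions, not values from the paper.
import numpy as np
import pywt
from scipy.signal import spectrogram

fs_csi = 100     # assumed Wi-Fi CSI sampling rate (Hz)
fs_radar = 1000  # assumed radar slow-time sampling rate (Hz)

csi_amp = np.abs(np.random.randn(10 * fs_csi))  # placeholder CSI amplitude stream
radar_sig = np.random.randn(10 * fs_radar)      # placeholder radar return

# Continuous wavelet transform -> scalogram (time-scale energy map)
scales = np.arange(1, 128)
coeffs, freqs = pywt.cwt(csi_amp, scales, "morl", sampling_period=1 / fs_csi)
scalogram = np.abs(coeffs) ** 2

# Short-time Fourier transform -> spectrogram (time-frequency energy map)
f, t, Sxx = spectrogram(radar_sig, fs=fs_radar, nperseg=256, noverlap=192)

print(scalogram.shape, Sxx.shape)  # images fed to the CNN feature extractors
```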
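The fusion step can likewise be sketched with off-the-shelf Keras backbones. Everything specific here is an assumption rather than the paper's architecture: ImageNet weights, 224x224 and 299x299 input sizes, concatenation as the fusion operation, the dense-head dimensions, and a four-class softmax matching the listed gestures.

```python
# Sketch of feature-level fusion of VGG16 (Wi-Fi scalograms) and
# InceptionV3 (radar spectrograms); hyperparameters are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import VGG16, InceptionV3

wifi_in = layers.Input(shape=(224, 224, 3), name="wifi_scalogram")
radar_in = layers.Input(shape=(299, 299, 3), name="radar_spectrogram")

# Frozen ImageNet backbones used as fixed feature extractors
vgg = VGG16(include_top=False, weights="imagenet", pooling="avg")
inc = InceptionV3(include_top=False, weights="imagenet", pooling="avg")
vgg.trainable = False
inc.trainable = False

# Concatenate the pooled feature vectors from both modalities
fused = layers.Concatenate()([vgg(wifi_in), inc(radar_in)])
x = layers.Dense(256, activation="relu")(fused)
x = layers.Dropout(0.5)(x)
out = layers.Dense(4, activation="softmax", name="head_gesture")(x)

model = Model([wifi_in, radar_in], out)
model.compile(optimizer="adam", loss="categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

Concatenation is only one plausible fusion choice; averaging or attention-weighted combination of the two feature vectors would slot into the same place in this sketch.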

Item Type: Articles
Additional Information: This study received partial financial support from the Engineering and Physical Sciences Research Council (EPSRC) through grants: EP/T021063/1 (Q.H., M.I.), EP/T021020/1 (M.I.), and SAPHIRE2022 (Grant No: 2814).
Status: Early Online Publication
Refereed: Yes
Glasgow Author(s) Enlighten ID: Tahir, Dr Ahsen and Imran, Professor Muhammad and Usman, Dr Muhammad and Abbas, Dr Hasan and Abbasi, Professor Qammer and Hameed, Hira
Authors: Hameed, H., Tahir, A., Usman, M., Zhu, J., Lubna, Abbas, H., Ramzan, N., Cui, T. J., Imran, M. A., and Abbasi, Q.
College/School: College of Science and Engineering > School of Engineering
College of Science and Engineering > School of Engineering > Autonomous Systems and Connectivity
College of Science and Engineering > School of Engineering > Electronics and Nanoscale Engineering
Journal Name: IEEE Sensors Journal
Publisher: IEEE
ISSN: 1530-437X
ISSN (Online): 1558-1748
Published Online: 05 December 2023
Copyright Holders: Copyright © 2023 IEEE
First Published: First published in IEEE Sensors Journal 2023
Publisher Policy: Reproduced in accordance with the publisher copyright policy


Project Code: 307829
Award No:
Project Name: Quantum-Inspired Imaging for Remote Monitoring of Health & Disease in Community Healthcare
Principal Investigator: Jonathan Cooper
Funder's Name: Engineering and Physical Sciences Research Council (EPSRC)
Funder Ref: EP/T021020/1
Lead Dept: ENG - Biomedical Engineering