ContrastSense: domain-invariant contrastive learning for in-the-wild wearable sensing

Existing wearable sensing models often struggle with domain shifts and class label scarcity. Contrastive learning is a promising technique for addressing class label scarcity; however, it tends to capture domain-related features and suffers from low-quality negatives. To address both problems, we propose ContrastSense, a domain-invariant contrastive learning scheme for a realistic wearable sensing scenario in which domain shifts and class label scarcity occur simultaneously. To capture domain-invariant information, ContrastSense exploits unlabeled data together with domain labels specifying user IDs or devices to minimize the discrepancy across domains. To improve the quality of negatives, it leverages time and domain labels to select samples and refine negatives. In addition, ContrastSense introduces a parameter-wise penalty that preserves domain-invariant knowledge during fine-tuning, further maintaining model robustness. Extensive experiments show that ContrastSense outperforms state-of-the-art baselines by 8.9% on human activity recognition with inertial measurement units and by 5.6% on gesture recognition with electromyography under domain shifts across users. Moreover, under other kinds of domain shifts, across devices, on-body positions, and datasets, ContrastSense achieves consistent improvements over the best baselines.
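
The abstract describes, at a high level, a contrastive objective in which domain labels are used to refine negatives. As an illustration of that idea only, the sketch below shows a domain-aware InfoNCE loss where off-diagonal pairs from the anchor's own domain (same user or device) are masked out of the denominator; it assumes batches that mix several domains. This is a hypothetical reconstruction from the abstract, not the authors' implementation, and the names (`domain_aware_info_nce`, `tau`) are invented for the example.

```python
# Hedged sketch of a domain-aware InfoNCE loss; PyTorch assumed.
import torch
import torch.nn.functional as F

def domain_aware_info_nce(z1: torch.Tensor, z2: torch.Tensor,
                          domain_ids: torch.Tensor, tau: float = 0.1) -> torch.Tensor:
    """Contrastive loss over two augmented views of the same N samples.

    z1, z2:     (N, D) embeddings of the two views
    domain_ids: (N,) integer domain labels, e.g. user or device IDs

    Off-diagonal pairs that share the anchor's domain are dropped from the
    denominator, so the surviving negatives differ from the anchor in domain
    rather than only in domain-irrelevant ways.
    """
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau                         # (N, N) scaled cosine similarities
    same_domain = domain_ids[:, None] == domain_ids[None, :]
    positives = torch.eye(len(z1), dtype=torch.bool, device=z1.device)
    logits = logits.masked_fill(same_domain & ~positives, float("-inf"))
    targets = torch.arange(len(z1), device=z1.device)  # positives sit on the diagonal
    return F.cross_entropy(logits, targets)
```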

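The abstract also mentions a parameter-wise penalty that preserves domain-invariant knowledge during fine-tuning. One common way to realize such a penalty, again a hedged sketch in the spirit of elastic-weight-consolidation-style regularizers rather than the paper's exact formulation, is an importance-weighted quadratic term that discourages fine-tuned weights from drifting away from their pre-trained values (`parameter_wise_penalty`, `importance`, and `lam` are assumed names):

```python
# Hedged sketch of an importance-weighted quadratic parameter penalty.
import torch
import torch.nn as nn

def parameter_wise_penalty(model: nn.Module,
                           pretrained: dict,   # name -> snapshot of pre-trained weights
                           importance: dict,   # name -> per-parameter importance weights
                           lam: float = 1.0) -> torch.Tensor:
    """Penalize deviation of each fine-tuned parameter from its pre-trained
    value, weighted by a per-parameter importance estimate."""
    penalty = torch.zeros((), device=next(model.parameters()).device)
    for name, p in model.named_parameters():
        penalty = penalty + (importance[name] * (p - pretrained[name]) ** 2).sum()
    return lam * penalty

# Usage during fine-tuning: total = task_loss + parameter_wise_penalty(model, snap, imp)
```
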
Bibliographic Details
Main Authors: Dai, Gaole; Xu, Huatao; Yoon, Hyungjun; Li, Mo; Tan, Rui; Lee, Sung-Ju
Other Authors: College of Computing and Data Science
Format: Article
Language: English
Published: 2024
Subjects: Computer and Information Science; Wearable Sensing; Contrastive Learning
Online Access: https://hdl.handle.net/10356/182272
Institution: Nanyang Technological University
Published in: Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 8(4), Article 3699744 (2024)
Citation: Dai, G., Xu, H., Yoon, H., Li, M., Tan, R. & Lee, S. (2024). ContrastSense: domain-invariant contrastive learning for in-the-wild wearable sensing. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 8(4), 3699744. https://dx.doi.org/10.1145/3699744
ISSN: 2474-9567
DOI: 10.1145/3699744
Scopus ID: 2-s2.0-85210121825
Funding: This research is supported in part by the Ministry of Education, Singapore, under its Academic Research Fund Tier 2 (MOE-T2EP20220-0004), and in part by Hong Kong GRF 16204224.
Version: Published version
Rights: © 2024 the Owner/Author(s). This work is licensed under a Creative Commons Attribution 4.0 International License.
File Format: application/pdf
Collection: DR-NTU (NTU Library)
Record ID: sg-ntu-dr.10356-182272