Fusing mobile, wearable and infrastructure sensing for immersive daily lifestyle analytics

With the prevalence of sensors in public infrastructure as well as in personal devices, exploiting data from these sensors to monitor and profile basic activities (e.g., locomotive states such as walking, and gestural actions such as smoking) has gained popularity. Basic activities identified by these sensors will drive the next generation of lifestyle monitoring applications and services.


Bibliographic Details
Main Author: SEN, Sougata
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2017
Subjects:
Online Access:https://ink.library.smu.edu.sg/etd_coll_all/23
https://ink.library.smu.edu.sg/cgi/viewcontent.cgi?article=1031&context=etd_coll_all
Institution: Singapore Management University
id sg-smu-ink.etd_coll_all-1031
record_format dspace
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic wearable computing
lifestyle monitoring
eating detection
shopping analytics
activity recognition
sensor fusion
Infrastructure
Programming Languages and Compilers
Software Engineering
description With the prevalence of sensors in public infrastructure as well as in personal devices, exploiting data from these sensors to monitor and profile basic activities (e.g., locomotive states such as walking, and gestural actions such as smoking) has gained popularity. Basic activities identified by these sensors will drive the next generation of lifestyle monitoring applications and services. To provide more advanced and personalized services, these next-generation systems will need to capture and understand increasingly fine-grained details of various common daily life activities. In this dissertation, I demonstrate the possibility of building systems, using off-the-shelf devices, that not only identify activities but also provide fine-grained details about an individual's lifestyle by combining multiple sensing modes. These systems utilise sensor data from personal as well as infrastructure devices to unobtrusively monitor daily life activities. I use eating and shopping as two example daily life activities and show that it is possible to monitor fine-grained details of both. Additionally, I explore the possibility of utilising the sensor data to identify the cognitive state of an individual performing a daily life activity.

I first investigate the possibility of using multiple sensor classes on wearable devices to capture novel context about common gesture-driven activities. More specifically, I describe Annapurna, a system that utilises the inertial and image sensors in a single device to identify fine-grained details of the eating activity. Annapurna efficiently uses data from a smartwatch's inertial sensors to determine when a person is eating; the inertial sensors opportunistically trigger the smartwatch's camera to capture images of the food consumed, which are used to build a food journal. Across multiple user studies, we found that the system can capture fine-grained details of the eating activity (images of the food consumed) with false-positive and false-negative rates of 6.5% and 3.3%, respectively.

I next investigate the potential of combining sensing data not just from multiple personal devices but also from inexpensive ambient sensors and IoT platforms. More specifically, I describe I4S, a system that utilises multiple sensor classes across multiple devices to identify fine-grained in-store activities of an individual shopper. The goal of I4S is to identify all the items that a customer in a retail store interacts with. I4S utilises inertial sensor data from the smartwatch to identify the picking gesture as well as the shelf from which an item is picked, and BLE scan information from the customer's smartphone to identify the rack from which the item is picked. By analysing data collected in a user study involving 31 users, we found that I4S could identify pick gestures with a precision of over 92%, the rack where the pick occurred with an accuracy of over 86%, and the position within a 1-meter-wide rack with an accuracy of over 92%.
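The following is a minimal, hypothetical sketch of the Annapurna trigger pipeline described above: inertial windows from the watch are classified as eating gestures, and a short run of positive windows opportunistically fires the camera. The window length, run threshold, and all names (features, imu_stream, camera.capture, the classifier interface) are illustrative assumptions, not the dissertation's actual design.

```python
# Hypothetical sketch of an inertially triggered food-journal capture loop.
# Names and thresholds are assumptions; only the abstract's high-level flow
# (inertial eating detection -> opportunistic camera trigger) is from the source.
import numpy as np

WINDOW_S = 2.0     # assumed inertial window length in seconds
TRIGGER_RUN = 3    # assumed number of consecutive eating windows before capture

def features(window: np.ndarray) -> np.ndarray:
    """Simple per-axis statistics over a (samples, 6) accel+gyro window."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

def run(classifier, imu_stream, camera):
    """classifier: any fitted model with predict(); imu_stream yields windows."""
    run_len = 0
    for window in imu_stream:
        if classifier.predict([features(window)])[0] == 1:  # 1 = eating gesture
            run_len += 1
            if run_len >= TRIGGER_RUN:
                camera.capture()   # add an image of the food to the journal
                run_len = 0        # avoid capturing a burst for one bite sequence
        else:
            run_len = 0
```

Gating the camera on the cheap inertial classifier, rather than sampling images continuously, is what makes this kind of pipeline energy-efficient on a watch.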
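Similarly, a sketch of the I4S fusion step under stated assumptions: a pick detected from smartwatch inertial data is localised by pairing the watch's shelf-level estimate with the rack whose BLE beacon was strongest on the phone nearest the pick time. The beacon-to-rack map, the arm-pitch thresholds, and the nearest-scan heuristic are illustrative stand-ins, not the system's actual method.

```python
# Hypothetical BLE + inertial fusion for localising a pick gesture.
# Only the high-level idea (watch -> gesture and shelf, phone BLE -> rack)
# is from the abstract; all specifics below are assumptions.
from dataclasses import dataclass

@dataclass
class BleScan:
    t: float           # scan timestamp (s)
    beacon_rack: dict  # beacon_id -> rack_id (assumed known deployment map)
    rssi: dict         # beacon_id -> RSSI in dBm (higher = stronger)

def rack_at(t_pick: float, scans: list[BleScan]) -> str:
    """Rack = strongest beacon in the scan closest in time to the pick."""
    scan = min(scans, key=lambda s: abs(s.t - t_pick))
    best = max(scan.rssi, key=scan.rssi.get)
    return scan.beacon_rack[best]

def shelf_from_pitch(pitch_deg: float) -> str:
    """Toy mapping from arm pitch at pick time to shelf level (assumed cutoffs)."""
    if pitch_deg > 30:
        return "top"
    if pitch_deg > -10:
        return "middle"
    return "bottom"

def localise_pick(t_pick: float, pitch_deg: float, scans: list[BleScan]):
    """Fuse the two modalities into a (rack, shelf) estimate for one pick."""
    return rack_at(t_pick, scans), shelf_from_pitch(pitch_deg)
```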
Finally, I explore the possibility of using such fine-grained capture of an individual's physical activities to infer higher-level cognitive characteristics associated with those activities. As an exemplar, I describe CROSDAC, a technique for identifying the cognitive state and behavior of an individual during the shopping activity. To determine the shopper's behavior, CROSDAC analyses the shopper's trajectory in a store as well as the physical activities the shopper performs. Using an unsupervised approach, CROSDAC first discovers clusters (implicitly uncovering distinct shopping styles) from limited training data, and then builds a cluster-specific, but person-independent, classifier from the modest amount of training data available. Using data from two studies involving 52 users, conducted in two diverse locations, we found that it is indeed possible to identify the cognitive state of shoppers through the CROSDAC approach.

Through these three systems and techniques, this dissertation demonstrates the possibility of utilising data from sensors embedded in one or more off-the-shelf devices to derive fine-grained insights about an individual's lifestyle.
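Finally, a sketch of CROSDAC's cluster-then-classify strategy as the abstract describes it: discover shopping-style clusters without labels, then train one person-independent classifier per cluster and route each new trip through its cluster's model. The choice of k-means and logistic regression here is a stand-in; the dissertation does not specify these models or features.

```python
# Hypothetical cluster-then-classify pipeline in the spirit of CROSDAC.
# Model choices (k-means, logistic regression) are assumptions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

def fit_crosdac_style(X: np.ndarray, y: np.ndarray, n_styles: int = 3):
    """X: per-trip trajectory/activity features; y: cognitive-state labels."""
    styles = KMeans(n_clusters=n_styles, n_init=10).fit(X)   # unsupervised styles
    models = {}
    for c in range(n_styles):
        mask = styles.labels_ == c
        # Train a per-cluster, person-independent classifier when the cluster
        # has enough data and more than one label represented.
        if mask.sum() >= 2 and len(set(y[mask])) > 1:
            models[c] = LogisticRegression().fit(X[mask], y[mask])
    return styles, models

def predict(styles, models, x: np.ndarray):
    """Route a new trip to its style's classifier; None if that style lacks one."""
    c = int(styles.predict(x.reshape(1, -1))[0])
    model = models.get(c)
    return model.predict(x.reshape(1, -1))[0] if model is not None else None
```

Training one model per discovered style, rather than a single global classifier, is one way to stretch a small heterogeneous training set, which matches the abstract's emphasis on limited training data.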
format text
author SEN, Sougata
title Fusing mobile, wearable and infrastructure sensing for immersive daily lifestyle analytics
publisher Institutional Knowledge at Singapore Management University
publishDate 2017
url https://ink.library.smu.edu.sg/etd_coll_all/23
https://ink.library.smu.edu.sg/cgi/viewcontent.cgi?article=1031&context=etd_coll_all
last_indexed 2017-10-24T06:31:34Z
date 2017-06-01T07:00:00Z
media application/pdf
rights http://creativecommons.org/licenses/by-nc-nd/4.0/
collection Dissertations and Theses Collection