Conversational localization: Indoor human localization through intelligent conversation

We propose a novel sensorless approach to indoor localization by leveraging natural language conversations with users, which we call conversational localization. To show the feasibility of conversational localization, we develop a proof-of-concept system that guides users to describe their surroundings in a chat and estimates their position based on the information they provide. We devise a modular architecture for our system with four modules. First, we construct an entity database with available image-based floor maps. Second, we enable the dynamic identification and scoring of information provided by users through our utterance processing module. Then, we implement a conversational agent that can intelligently strategize and guide the interaction to elicit localizationally valuable information from users. Finally, we employ visibility catchment area and line-of-sight heuristics to generate spatial estimates for the user’s location. We conduct two user studies in designing and testing the system. We collect 800 natural language descriptions of unfamiliar indoor spaces in an online crowdsourcing study to learn the feasibility of extracting localizationally useful entities from user utterances. We then conduct a field study with 10 participants at 10 locations to evaluate the feasibility and performance of conversational localization. The results show that conversational localization can achieve within-10-meter localization accuracy at eight out of the ten study sites, showing the technique’s utility for classes of indoor location-based services.
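
The abstract outlines a four-module pipeline: an entity database built from floor maps, an utterance processing module, a conversational agent, and a spatial estimation step. As a rough, hypothetical sketch only, the modules could be wired together as below; every name, the naive keyword matching, and the centroid-based position estimate are illustrative assumptions for exposition, not the authors' implementation (the paper's system uses visibility catchment area and line-of-sight heuristics).

```python
# Hypothetical sketch of the four-module conversational localization
# pipeline described in the abstract. Requires Python 3.10+.
from dataclasses import dataclass, field

@dataclass
class Entity:
    name: str          # e.g. "elevator", "vending machine"
    x: float           # assumed position on the image-based floor map (metres)
    y: float
    visibility_radius: float = 10.0  # stand-in for a visibility catchment area

@dataclass
class EntityDatabase:
    """Module 1: entities extracted from image-based floor maps."""
    entities: list[Entity] = field(default_factory=list)

    def match(self, mention: str) -> list[Entity]:
        # Naive substring matching; a real system would score candidates.
        return [e for e in self.entities if mention in e.name]

def process_utterance(db: EntityDatabase, utterance: str) -> list[Entity]:
    """Module 2: identify entity mentions in a user utterance.
    Keyword matching stands in for the paper's NLP-based scoring."""
    matched: list[Entity] = []
    for word in utterance.lower().split():
        matched.extend(db.match(word))
    return matched

def next_prompt(matched: list[Entity]) -> str:
    """Module 3: the agent chooses what to ask next to elicit
    localizationally valuable information."""
    if not matched:
        return "What is the most prominent object you can see around you?"
    return f"Which direction is the {matched[0].name} from where you stand?"

def estimate_location(matched: list[Entity]) -> tuple[float, float] | None:
    """Module 4: a simple centroid of matched entities stands in for the
    paper's VCA intersection and line-of-sight heuristics."""
    if not matched:
        return None
    xs = [e.x for e in matched]
    ys = [e.y for e in matched]
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

A real deployment would replace the keyword matcher with the paper's utterance processing module and intersect visibility catchment areas rather than averaging entity positions.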

Bibliographic Details
Main Authors: SHESHADRI, Smitha; HARA, Kotaro
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2024
Subjects: Artificial Intelligence and Robotics; Software Engineering
Online Access: https://ink.library.smu.edu.sg/sis_research/8654
https://ink.library.smu.edu.sg/context/sis_research/article/9657/viewcontent/3631404_pvoa_cc_by.pdf
DOI: 10.1145/3631404
License: CC BY-NC-ND 4.0 (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Institution: Singapore Management University
Collection: Research Collection School Of Computing and Information Systems (InK@SMU)