Understanding Spatiotemporal-Aware Multimodal Conversational Search in the Outdoor Urban Space
| dc.contributor.author | Xu, Jiangnan | |
| dc.contributor.author | Seo, Suyeon | |
| dc.contributor.author | Salminen, Joni | |
| dc.contributor.author | Saker, Michael | |
| dc.contributor.author | Shin, Joongi | |
| dc.contributor.author | Chamberlain, Alan | |
| dc.contributor.author | Papangelis, Konstantinos | |
| dc.contributor.author | Kim, Dae Hyun | |
| dc.contributor.department | fi=Ei alustaa|en=No platform| | |
| dc.contributor.editor | Oliver, Nuria | |
| dc.contributor.editor | Shamma, David A. | |
| dc.contributor.editor | Candello, Heloisa | |
| dc.contributor.editor | Cesar, Pablo | |
| dc.contributor.editor | Lopes, Pedro | |
| dc.contributor.editor | Bozzon, Alessandro | |
| dc.contributor.editor | Kosch, Thomas | |
| dc.contributor.editor | Liao, Vera | |
| dc.contributor.editor | Ma, Xiaojuan | |
| dc.contributor.editor | Artizzu, Valentino | |
| dc.contributor.editor | Draxler, Fiona | |
| dc.contributor.editor | López, Gustavo | |
| dc.contributor.editor | Reinschluessel, Anke V. | |
| dc.contributor.editor | Tong, Xin | |
| dc.contributor.editor | Toups Dugas, Phoebe O. | |
| dc.contributor.orcid | https://orcid.org/0000-0003-3230-0561 | |
| dc.date.accessioned | 2026-04-22T09:50:00Z | |
| dc.date.issued | 2026 | |
| dc.description.abstract | Emerging multimodal conversational search (MCS) tools (e.g., Gemini Live) allow users to search for spatiotemporal information through natural language dialogues as they move through urban space. Despite the growing popularity of these tools, there is limited understanding of how people engage with this technology. To address this gap, we developed UrbanSearch, an MCS technology probe designed to capture the user’s current geolocation, time, and visual surroundings. A contextual inquiry (N=23) revealed that MCS tools provide two core values: requiring low effort in forming queries while offering highly relevant responses, and functioning as a central information gateway. As a promising technology, MCS supports environmental learning, in-situ decision making, and personalized navigation. Participants also revealed unmet needs for spatial reasoning and transparent integration of multi-source information, along with concerns related to peripheral awareness, social context, and personal space. Drawing from the findings, we discuss design implications for future MCS tools in urban spaces. | en |
| dc.description.notification | © 2026 Copyright held by the owner/author(s). This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 4.0 International License. | |
| dc.description.reviewstatus | fi=vertaisarvioitu|en=peerReviewed| | |
| dc.format.pagerange | 1-20 | |
| dc.identifier.isbn | 979-8-4007-2278-3 | |
| dc.identifier.uri | https://osuva.uwasa.fi/handle/11111/20161 | |
| dc.identifier.urn | URN:NBN:fi-fe2026042232048 | |
| dc.language.iso | en | |
| dc.publisher | ACM | |
| dc.relation.conference | ACM SIGCHI Annual Conference on Human Factors in Computing Systems | |
| dc.relation.doi | https://doi.org/10.1145/3772318.3790541 | |
| dc.relation.ispartof | CHI '26: Proceedings of the 2026 CHI Conference on Human Factors in Computing Systems | |
| dc.relation.url | https://doi.org/10.1145/3772318.3790541 | |
| dc.relation.url | https://urn.fi/URN:NBN:fi-fe2026042232048 | |
| dc.rights | https://creativecommons.org/licenses/by-nc-nd/4.0/ | |
| dc.source.identifier | 0c30cf40-b37b-47b4-8bc5-d3a5b5736901 | |
| dc.source.metadata | SoleCRIS | |
| dc.subject | Urban Space | |
| dc.subject | Conversational Search | |
| dc.subject | Contextual Inquiry | |
| dc.subject.discipline | fi=Markkinointi|en=Marketing| | |
| dc.title | Understanding Spatiotemporal-Aware Multimodal Conversational Search in the Outdoor Urban Space | |
| dc.type.okm | fi=A4 Vertaisarvioitu artikkeli konferenssijulkaisussa|en=A4 Article in conference proceedings (peer-reviewed)| | |
| dc.type.publication | article | |
| dc.type.version | publishedVersion |