Testing Two Tools for Multimodal Navigation
The latest smartphones, equipped with GPS, electronic compasses, directional audio, touch screens, and similar features, hold potential for location-based services that are easier to use and that let users focus on their activities and on the environment around them. Rather than interpreting maps, users can search for information by pointing in a direction, and database queries can be created from GPS location and compass data. Users can also be guided to locations through point-and-sweep gestures, spatial sound, and simple graphics. This paper describes two studies testing two applications with multimodal user interfaces for navigation and information retrieval. The applications allow users to search for information and get navigation support using combinations of point-and-sweep gestures, nonspeech audio, graphics, and text. The tests show that users appreciated both applications for their ease of use and for letting them interact directly with the surrounding environment.
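The abstract's idea of building a search query from a GPS fix plus a compass heading can be illustrated as a simple geometric filter: keep only the points of interest that lie inside the sector the user is pointing at. This is a minimal sketch under assumed parameters (sector width, search range, and the example POI list are all invented for illustration), not the implementation described in the paper.

```python
import math

def initial_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees 0-360."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres (mean Earth radius 6371 km)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlon = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 6371000 * 2 * math.asin(math.sqrt(a))

def pois_in_pointed_sector(lat, lon, heading, pois, half_angle=15.0, max_range=500.0):
    """Return names of POIs within +/- half_angle degrees of the compass
    heading and within max_range metres of the user's GPS position."""
    hits = []
    for name, plat, plon in pois:
        bearing = initial_bearing(lat, lon, plat, plon)
        # Smallest signed angular difference between bearing and heading.
        off = abs((bearing - heading + 180) % 360 - 180)
        if off <= half_angle and haversine_m(lat, lon, plat, plon) <= max_range:
            hits.append(name)
    return hits

# Hypothetical example: user at the origin, pointing due north (heading 0).
pois = [("cafe_north", 0.001, 0.0), ("shop_east", 0.0, 0.001)]
print(pois_in_pointed_sector(0.0, 0.0, 0.0, pois))  # only the northern POI matches
```

In a real service the filtered result would become a database query (e.g. a spatial index lookup); the sector test above is just the geometric core that GPS and compass data make possible.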
| Main Authors: | Mats Liljedahl, Stefan Lindberg, Katarina Delsing, Mikko Polojärvi, Timo Saloranta, Ismo Alakärppä |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Wiley, 2012-01-01 |
| Series: | Advances in Human-Computer Interaction |
| Online Access: | http://dx.doi.org/10.1155/2012/251384 |
| author | Mats Liljedahl, Stefan Lindberg, Katarina Delsing, Mikko Polojärvi, Timo Saloranta, Ismo Alakärppä |
|---|---|
| collection | DOAJ |
| description | The latest smartphones, equipped with GPS, electronic compasses, directional audio, touch screens, and similar features, hold potential for location-based services that are easier to use and that let users focus on their activities and on the environment around them. Rather than interpreting maps, users can search for information by pointing in a direction, and database queries can be created from GPS location and compass data. Users can also be guided to locations through point-and-sweep gestures, spatial sound, and simple graphics. This paper describes two studies testing two applications with multimodal user interfaces for navigation and information retrieval. The applications allow users to search for information and get navigation support using combinations of point-and-sweep gestures, nonspeech audio, graphics, and text. The tests show that users appreciated both applications for their ease of use and for letting them interact directly with the surrounding environment. |
| format | Article |
| id | doaj-art-2d050f185e6d45418f8e5a2265bcb741 |
| institution | Kabale University |
| issn | 1687-5893, 1687-5907 |
| language | English |
| publishDate | 2012-01-01 |
| publisher | Wiley |
| record_format | Article |
| series | Advances in Human-Computer Interaction |
| spelling | Mats Liljedahl (The Interactive Institute, Acusticum 4, 941 28 Piteå, Sweden); Stefan Lindberg (The Interactive Institute, Acusticum 4, 941 28 Piteå, Sweden); Katarina Delsing (The Interactive Institute, Acusticum 4, 941 28 Piteå, Sweden); Mikko Polojärvi (University of Oulu, PL 8000, Oulun Yliopisto, 90014 Oulu, Finland); Timo Saloranta (University of Oulu, PL 8000, Oulun Yliopisto, 90014 Oulu, Finland); Ismo Alakärppä (University of Lapland, P.O. Box 122, 96101 Rovaniemi, Finland). Advances in Human-Computer Interaction, Wiley, 2012-01-01, doi:10.1155/2012/251384 |
| title | Testing Two Tools for Multimodal Navigation |
| url | http://dx.doi.org/10.1155/2012/251384 |