DOI: 10.1145/1753326.1753671

Touch projector: mobile interaction through video

Published: 10 April 2010

Abstract

In 1992, Tani et al. proposed remotely operating machines in a factory by manipulating a live video image on a computer screen. In this paper we revisit this metaphor and investigate its suitability for mobile use. We present Touch Projector, a system that enables users to interact with remote screens through a live video image on their mobile device. The handheld device tracks itself with respect to the surrounding displays. Touch on the video image is "projected" onto the target display in view, as if it had occurred there. This literal adaptation of Tani's idea, however, fails because handheld video does not offer enough stability and control to enable precise manipulation. We address this with a series of improvements, including zooming and freezing the video image. In a user study, participants selected targets and dragged targets between displays using the literal and three improved versions. We found that participants achieved highest performance with automatic zooming and temporary image freezing.
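The core mapping the abstract describes, "projecting" a touch on the phone's video image onto the display in view, can be sketched as a planar homography transform. This is a hypothetical illustration, not the authors' implementation: the function name `project_touch`, the hand-made matrix `H`, and the assumption that tracking yields a 3x3 camera-to-display homography are ours; the paper's system obtains its pose via marker tracking [14].

```python
# Hypothetical sketch of the "touch projection" step: a touch on the phone's
# live video image is mapped into the remote display's coordinate system.
# Assumes the tracker has already estimated a 3x3 homography H from
# camera-image pixels to display pixels; H below is a made-up example.

import numpy as np

def project_touch(H: np.ndarray, touch_xy: tuple[float, float]) -> tuple[float, float]:
    """Map a touch point from the video image onto the target display."""
    x, y = touch_xy
    p = H @ np.array([x, y, 1.0])      # homogeneous transform
    return (p[0] / p[2], p[1] / p[2])  # perspective divide

# Example: a pure 2x-scale homography; a touch at (100, 50) in the video
# image lands at (200, 100) on the display.
H = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])
print(project_touch(H, (100.0, 50.0)))  # (200.0, 100.0)
```

Zooming the video image, as the improved versions do, effectively refines this mapping: each camera pixel then covers fewer display pixels, so the same touch precision on the phone yields finer positioning on the target display.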

Supplementary Material

JPG File (p2287.jpg)
WMV File (p2287.wmv)

References

[1]
Balakrishnan, R., and Baudisch, P. (2009). Special Issue on Ubiquitous Multi-Display Environments. HCI Journal 24, 1 & 2.
[2]
Ballagas, R., Rohs, M., and Sheridan, J.G. (2005). Sweep and point & shoot: phonecam-based interactions for large public displays. Ext. Abstracts CHI 2005, 1200--1203.
[3]
Ballagas, R., Borchers, J., Rohs, M., and Sheridan, J.G. (2006). The Smart Phone: a ubiquitous input device. IEEE Pervasive Computing 5, 1, 70--77.
[4]
Baudisch, P., Cutrell, E., Robbins, D., Czerwinski, M., Tandler, P., Bederson, B., and Zierlinger, A. (2003). Drag-and-pop and drag-and-pick: Techniques for accessing remote screen content on touch- and pen-operated systems. Proc. Interact 2003, 57--64.
[5]
Bier, E.A., Stone, M.C., Pier, K., Buxton, W., and DeRose, T.D. (1993). Toolglass and magic lenses: the see-through interface. Proc. SIGGRAPH 1993, 73--80.
[6]
Boring, S., Altendorfer, M., Broll, G., Hilliges, O., and Butz, A. (2007). Shoot & copy: phonecam-based information transfer from public displays onto mobile phones. Proc. Mobility 2007, 24--31.
[7]
Buxton, W., and Myers, B. (1986). A study in two-handed input. Proc. CHI 1986, 321--326.
[8]
Fitzmaurice, G.W. (1993). Situated information spaces and spatially aware palmtop computers. Communications of the ACM 36, 7, 39--49.
[9]
Forlines, C., Vogel, D., and Balakrishnan, R. (2006). HybridPointing: fluid switching between absolute and relative pointing with a direct input device. Proc. UIST 2006, 211--220.
[10]
Guimbretière, F., Martin, A., and Winograd, T. (2005). Benefits of merging command selection and direct manipulation. ACM Transactions on Computer-Human Interaction 12, 3, 460--476.
[11]
Hinckley, K., Ramos, G., Guimbretière, F., Baudisch, P., and Smith, M. (2004). Stitching: pen gestures that span multiple displays. Proc. AVI 2004, 23--31.
[12]
Johanson, B., Hutchins, G., Winograd, T., and Stone, M. (2002). PointRight: experience with flexible input redirection in interactive workspaces. Proc. UIST 2002, 227--234.
[13]
Kabbash, P., Buxton, W., and Sellen, A. (1994). Two-handed input in a compound task. Proc. CHI 1994, 417--423.
[14]
Kato, H., and Billinghurst, M. (1999). Marker Tracking and HMD Calibration for a Video-Based Augmented Reality Conferencing System. Proc. IEEE and ACM International Workshop on Augmented Reality, 85--94.
[15]
Latulipe, C., Kaplan, C.S., and Clarke, C.L.A. (2005). Bimanual and unimanual image alignment: an evaluation of mouse-based techniques. Proc. UIST 2005, 123--131.
[16]
Liao, C., Liu, Q., Kimber, D., Chiu, P., Foote, J., and Wilcox, L. (2003). Shared interactive video for teleconferencing. Proc. ACM Multimedia 2003, 546--554.
[17]
Myers, B., Bhatnagar, R., Nichols, J., Peck, C.H., Kong, D., Miller, R., and Long, A.C. (2002). Interacting at a distance: measuring the performance of laser pointers and other devices. Proc. CHI 2002, 33--40.
[18]
Nacenta, M.A., Sallam, S., Champoux, B., Subramanian, S., and Gutwin, C. (2006). Perspective cursor: perspective-based interaction for multi-display environments. Proc. CHI 2006, 289--298.
[19]
Nacenta, M.A., Gutwin, C., Aliakseyeu, D., and Subramanian, S. (2009). There and back again: cross-display object movement in multi-display environments. HCI Journal 24, 1, 170--229.
[20]
Pears, N., Jackson, D., and Olivier, P. (2009). Smart phone interaction with registered displays. IEEE Computer 8, 2, 14--21.
[21]
Pierce, J.S., Forsberg, A.S., Conway, M.J., Hong, S., Zeleznik, R.C., and Mine, M.R. (1997). Image plane interaction techniques in 3D immersive environments. Proc. Symposium on Interactive 3D Graphics, 39--43.
[22]
Poupyrev, I., Billinghurst, M., Weghorst, S., and Ichikawa, T. (1996). The go-go interaction technique: non-linear mapping for direct manipulation in VR. Proc. UIST 1996, 79--80.
[23]
Rekimoto, J. (1997). Pick-and-drop: a direct manipulation technique for multiple computer environments. Proc. UIST 1997, 31--39.
[24]
Robertson, G., Czerwinski, M., Baudisch, P., Meyers, B., Robbins, D., Smith, G., and Tan, D. (2005). The large-display user experience. IEEE Computer Graphics and Applications, 25, 4, 44--51.
[25]
Rohs, M., Schöning, J., Raubal, M., Essl, G., and Krüger, A. (2007). Map navigation with mobile devices: virtual versus physical movement with and without visual context. Proc. ICMI 2007, 146--153.
[26]
Sakamoto, D., Honda, K., Inami, M., and Igarashi, T. (2009). Sketch and run: a stroke-based interface for home robots. Proc. CHI 2009, 197--200.
[27]
Sears, A., and Shneiderman, B. (1991). High precision touchscreens: design strategies and comparisons with a mouse. International Journal of Man-Machine Studies 34, 4, 593--613.
[28]
Seifried, T., Haller, M., Scott, S.D., Perteneder, C., Rendl, C., Sakamoto, D., Inami, M. (2009). CRISTAL: design and implementation of a remote control system based on a multi-touch display. Proc. ITS 2009, 33--40.
[29]
Shneiderman, B. (1983). Direct manipulation: a step beyond programming languages. IEEE Computer 16, 8, 57--69.
[30]
Shoemaker, G., Tang, A., and Booth, K.S. (2007). Shadow reaching: a new perspective on interaction for large displays. Proc. UIST 2007, 53--56.
[31]
Stoakley, R., Conway, M.J., and Pausch, R. (1995). Virtual reality on a WIM: interactive worlds in miniature. Proc. CHI 1995, 265--272.
[32]
Tan, D.S., Meyers, B., and Czerwinski, M. (2004). WinCuts: manipulating arbitrary window regions for more effective use of screen space. Ext. Abstracts CHI 2004, 1525--1528.
[33]
Tani, M., Yamaashi, K., Tanikoshi, K., Futakawa, M., and Tanifuji, S. (1992). Object-oriented video: interaction with real-world objects through live video. Proc. CHI 1992, 593--598.
[34]
Tsang, M., Fitzmaurice, G.W., Kurtenbach, G., Khan, A., and Buxton, B. (2002). Boom chameleon: simultaneous capture of 3D viewpoint, voice and gesture annotations on a spatially-aware display. Proc. UIST 2002, 111--120.
[35]
Vogel, D., and Baudisch, P. (2007). Shift: a technique for operating pen-based interfaces using touch. Proc. CHI 2007, 657--666.
[36]
Wilson, A., and Shafer, S. (2003). XWand: UI for intelligent spaces. Proc. CHI 2003, 545--552.
[37]
Yee, K.-P. (2003). Peephole displays: pen interaction on spatially aware handheld computers. Proc. CHI 2003, 1--8.


Published In

CHI '10: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
April 2010
2690 pages
ISBN:9781605589299
DOI:10.1145/1753326
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Sponsors

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. augmented reality
  2. input device
  3. interaction techniques
  4. mobile device
  5. multi-display environments
  6. multi-touch

Qualifiers

  • Research-article

Conference

CHI '10

Acceptance Rates

Overall Acceptance Rate 6,199 of 26,314 submissions, 24%


Cited By

  • (2024)Public Security User InterfacesProceedings of the New Security Paradigms Workshop10.1145/3703465.3703470(56-70)Online publication date: 16-Sep-2024
  • (2023)Selecting Real-World Objects via User-Perspective Phone OcclusionProceedings of the 2023 CHI Conference on Human Factors in Computing Systems10.1145/3544548.3580696(1-13)Online publication date: 19-Apr-2023
  • (2023)HandyCast: Phone-based Bimanual Input for Virtual Reality in Mobile and Space-Constrained Settings via Pose-and-Touch TransferProceedings of the 2023 CHI Conference on Human Factors in Computing Systems10.1145/3544548.3580677(1-15)Online publication date: 19-Apr-2023
  • (2023)Challenges and Opportunities for Multi-Device Management in ClassroomsACM Transactions on Computer-Human Interaction10.1145/351902529:6(1-27)Online publication date: 11-Jan-2023
  • (2023)Mixed Reality Interaction TechniquesSpringer Handbook of Augmented Reality10.1007/978-3-030-67822-7_5(109-129)Online publication date: 1-Jan-2023
  • (2022)Understanding and Creating Spatial Interactions with Distant Displays Enabled by Unmodified Off-The-Shelf SmartphonesMultimodal Technologies and Interaction10.3390/mti61000946:10(94)Online publication date: 19-Oct-2022
  • (2022)AR Digital Workspace Using a Mobile DeviceProceedings of the 2022 ACM Symposium on Spatial User Interaction10.1145/3565970.3567690(1-2)Online publication date: 1-Dec-2022
  • (2022)Dually Noted: Layout-Aware Annotations with Smartphone Augmented RealityProceedings of the 2022 CHI Conference on Human Factors in Computing Systems10.1145/3491102.3502026(1-15)Online publication date: 29-Apr-2022
  • (2022)CleAR Sight: Exploring the Potential of Interacting with Transparent Tablets in Augmented Reality2022 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)10.1109/ISMAR55827.2022.00034(196-205)Online publication date: Oct-2022
  • (2022)Hierarchical Pointing on Distant Displays with Smart DevicesInternational Journal of Human–Computer Interaction10.1080/10447318.2022.210855939:19(3859-3874)Online publication date: 6-Sep-2022
