Research Article | Open Access

User-centered design of an attitude-aware controller for ground reconnaissance robots

Published: 22 July 2015
Abstract

    Warfighter safety can be significantly increased by offloading critical reconnaissance and surveillance missions to robotic assets. The subtleties of these tasks require significant operator involvement, usually carried out close to the robot's deployment site. Human soldiers use gestures to communicate movements and commands when engaged in this type of task. While considerable work has been done on robots visually observing humans to interpret their gestures, we propose a simpler, more field-appropriate system that lets robot operators use their natural movements and gestures, captured by inertial measurement units (IMUs), to teleoperate a robot while reducing both the physical and the cognitive load on the soldier.
    This paper describes an operator control interface implemented on a smartphone, in contrast to the proprietary robot controllers typically used. The controller uses the device's IMUs, or attitude sensors, to bypass the touchscreen and accept user input via gestures; this addresses a primary concern for gloved users in dirty environments, where touchscreens are unreliable. We also propose that it provides a less visually intense alternative for control, freeing the soldier's attention for other tasks.
    We present details of the attitude-based control software, as well as the design heuristics that emerged from its iterative build-test-rebuild development. Additionally, we present results from a set of user studies showing that, as a controller, this technique performs as well as, or better than, other screen-based control systems, even when its advantages for gloved users are set aside. Twenty-five users were recruited to assess the usability of these attitude-aware controls, testing their suitability for both driving and camera manipulation tasks. Participants drove a small tracked robot on an indoor course using the attitude-aware controller and a virtual (touchscreen) joystick, while metrics on performance, mental workload, and user satisfaction were collected. Results indicate that the tilt controller was preferred by 64% of users and performed as well as, if not better than, the alternative on most performance metrics. These results support the development of a smartphone-based control option for military robotics, with a focus on more physical, attitude-based input methods that overcome deficiencies of current touch-based systems, namely lack of physical feedback, high attention demands, and unreliability in field environments.
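    As a sketch of how such an attitude-aware mapping might work (this is hypothetical and not the authors' implementation; the function name, dead-zone threshold, and gains are illustrative), the snippet below maps device pitch and roll from an IMU to differential-drive track speeds, with a dead zone so that a hand at rest does not drive the robot:

```python
import math

def tilt_to_drive(pitch_rad, roll_rad,
                  dead_zone_rad=math.radians(5),
                  max_tilt_rad=math.radians(45),
                  max_speed=1.0):
    """Map device pitch (forward/back tilt) and roll (left/right tilt)
    to left/right track speeds in [-max_speed, max_speed]."""
    def normalize(angle):
        # Ignore small tilts so a hand at rest does not command motion.
        if abs(angle) < dead_zone_rad:
            return 0.0
        span = max_tilt_rad - dead_zone_rad
        scaled = (abs(angle) - dead_zone_rad) / span
        return math.copysign(min(scaled, 1.0), angle)

    forward = normalize(pitch_rad)   # tilt forward -> drive forward
    turn = normalize(roll_rad)       # tilt right -> turn right
    left = max(-1.0, min(1.0, forward + turn)) * max_speed
    right = max(-1.0, min(1.0, forward - turn)) * max_speed
    return left, right
```

    A dead zone and a saturating maximum tilt are common choices in tilt interfaces because they tolerate sensor noise and resting hand tremor while keeping the full control range within a comfortable wrist rotation.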


    Cited By

    • (2024) Charting User Experience in Physical Human–Robot Interaction. ACM Transactions on Human-Robot Interaction, 13(2), 1-29. https://doi.org/10.1145/3659058
    • (2021) Tactile Perception for Teleoperated Robotic Exploration within Granular Media. ACM Transactions on Human-Robot Interaction, 10(4), 1-27. https://doi.org/10.1145/3459996
    • (2021) Mode Awareness Interfaces in Automated Vehicles, Robotics, and Aviation: A Literature Review. 13th International Conference on Automotive User Interfaces and Interactive Vehicular Applications, 147-158. https://doi.org/10.1145/3409118.3475125


    Published In

    Journal of Human-Robot Interaction, Volume 4, Issue 1
    Special Issue on Haptics
    July 2015
    113 pages

    Publisher

    Journal of Human-Robot Interaction Steering Committee


    Author Tags

    1. field robotics
    2. handheld control
    3. human-robot interaction
    4. interface modality
    5. user-centered design

