In this paper we have presented an overview of the collaboration capabilities of QuickSet, a handheld multimodal, agent-based architecture for map interaction, the lessons we have learned while building it, and plans for future handheld and groupware capabilities.
Commanders and their staff, when geographically dispersed, continue to choose what they find to be the most effective remote collaboration tools: email and radio messages. Even command and control systems that provide a unified view of the battlefield among remote collaborators are often supplanted by more traditional tools, such as a large paper map (McGee, Cohen & Wu, 2000). Indeed, today's computer interfaces often impose too high a barrier for the capture and delivery of situational assessment (McGee, Cohen, Wesson & Horman, 2002). To compensate, commanders traditionally meet face-to-face at least once daily to debrief each other on the outcome of the day's fight and to coordinate a strategy for the next day's engagement. Each of these meetings presents a risk to commanders, in addition to time lost during travel and various other concerns. At higher echelons, this coordination activity may be ongoing continuously. At lower echelons, coordination is typically medi...
This paper presents an emerging application of multimodal interface research to distributed applications. We have developed the QuickSet prototype, a pen/voice system running on a hand-held PC, communicating via wireless LAN through an agent architecture to a number of systems, including NRaD's LeatherNet system, a distributed interactive training simulator built for the US Marine Corps. The paper describes the overall system architecture, a novel multimodal integration strategy offering mutual compensation among modalities, and provides examples of multimodal simulation setup. Finally, we discuss our applications experience and evaluation.
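The "mutual compensation" idea above can be illustrated with a minimal sketch: each recognizer produces an n-best list of scored hypotheses, and fusion selects the semantically compatible pair with the highest joint score, so a strong hypothesis in one modality can rescue a weaker one in the other. The hypothesis lists, scores, and compatibility table below are invented for illustration; they do not reproduce QuickSet's actual integration algorithm.

```python
from itertools import product

# Hypothetical n-best lists: (interpretation, recognizer probability).
speech_nbest = [("create platoon", 0.55), ("create point", 0.30), ("delete platoon", 0.15)]
gesture_nbest = [("point", 0.75), ("area", 0.25)]

# Hypothetical semantic compatibility: which spoken commands can unify
# with which gesture types.
compatible = {
    ("create platoon", "area"): True,
    ("create point", "point"): True,
    ("delete platoon", "area"): True,
}

def fuse(speech, gesture):
    """Return the jointly most probable, semantically compatible pair."""
    best = None
    for (s, ps), (g, pg) in product(speech, gesture):
        if compatible.get((s, g)):
            score = ps * pg
            if best is None or score > best[2]:
                best = (s, g, score)
    return best

print(fuse(speech_nbest, gesture_nbest))
```

Note the compensation effect: "create platoon" tops the speech list, but it is only compatible with an "area" gesture; since the gesture recognizer strongly favors "point", the second-best speech hypothesis "create point" wins the joint score (0.30 × 0.75 = 0.225 versus 0.55 × 0.25 = 0.1375).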
In this paper we describe how we have enhanced our multimodal paper-based system, Rasa, with visual perceptual input. We briefly explain how Rasa improves upon current decision-support tools by augmenting, rather than replacing, the paper-based tools that people in command and control centers have come to rely upon. We note shortcomings in our initial approach, discuss how we have added computer vision as another input modality in our multimodal fusion system, and characterize the advantages that it has to offer. We conclude by discussing our current limitations and the work we intend to pursue to overcome them in the future.
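Adding vision as a third input modality means the fusion component must unify events arriving from independent sources. A common approach, sketched here purely as an assumption rather than Rasa's actual algorithm, is to group events from different modalities that fall within a short temporal window of one another; the event contents and the two-second window are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class InputEvent:
    modality: str       # "speech", "pen", or "vision"
    content: str
    timestamp: float    # seconds since session start

# Illustrative fusion window (an assumption): events closer together
# than this are treated as parts of one multimodal command.
WINDOW = 2.0  # seconds

def fuse_events(events):
    """Group time-sorted events; start a new group when an event falls
    outside the WINDOW measured from the first event of the group."""
    events = sorted(events, key=lambda e: e.timestamp)
    groups, current = [], []
    for e in events:
        if current and e.timestamp - current[0].timestamp > WINDOW:
            groups.append(current)
            current = []
        current.append(e)
    if current:
        groups.append(current)
    return groups

events = [
    InputEvent("vision", "unit symbol detected on map", 10.1),
    InputEvent("speech", "this is first platoon", 10.8),
    InputEvent("speech", "move here", 25.0),
]
print(fuse_events(events))
```

Here the vision detection at 10.1 s and the utterance at 10.8 s are grouped into one candidate command, while the utterance at 25.0 s starts a new group.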
Philip R. Cohen, Michael Johnston, David McGee, Sharon Oviatt, Jay Pittman, Ira Smith, Liang Chen ... and can be visualized in a wall-sized virtual reality CAVE environment [Cruz-Neira et al. ...]. Cheyer and Julia [1995] sketch a system based on Oviatt's [1996] results and the Open ...
3. QUICKSET To address these simulation interface problems, we have developed QuickSet (see Figure 2), a collaborative, handheld, multimodal system for configuring military simulations based on LeatherNet [Clarkson and Yi, 1996], a system used in training platoon leaders ...
ABSTRACT This paper presents a novel multimodal system applied to the setup and control of distributed interactive simulations.
Proceedings of the 35th Annual Meeting of the Association For Computational Linguistics and Eighth Conference of the European Chapter of the Association For Computational Linguistics, Jul 7, 1997
Papers by David McGee