Award Abstract # 1327657
NRI: Large: Collaborative Research: Complementary Situational Awareness for Human-Robot Partnerships

NSF Org: IIS
Div Of Information & Intelligent Systems
Recipient: THE JOHNS HOPKINS UNIVERSITY
Initial Amendment Date: September 9, 2013
Latest Amendment Date: April 17, 2015
Award Number: 1327657
Award Instrument: Continuing Grant
Program Manager: Wendy Nilsen
wnilsen@nsf.gov
 (703)292-2568
IIS
 Div Of Information & Intelligent Systems
CSE
 Direct For Computer & Info Scie & Enginr
Start Date: October 1, 2013
End Date: September 30, 2020 (Estimated)
Total Intended Award Amount: $1,175,809.00
Total Awarded Amount to Date: $1,228,609.00
Funds Obligated to Date: FY 2013 = $561,787.00
FY 2014 = $614,022.00
FY 2015 = $52,800.00
History of Investigator:
  • Russell Taylor (Principal Investigator)
    rht@cs.jhu.edu
Recipient Sponsored Research Office: Johns Hopkins University
3400 N CHARLES ST
BALTIMORE
MD  US  21218-2608
(443)997-1898
Sponsor Congressional District: 07
Primary Place of Performance: Johns Hopkins University
3400 N. Charles St.
Baltimore
MD  US  21218-2683
Primary Place of Performance Congressional District: 07
Unique Entity Identifier (UEI): FTMTDMBR29C7
Parent UEI:
NSF Program(s): International Research Collab, IIS Special Projects, NRI-National Robotics Initiative
Primary Program Source: 01001314DB NSF RESEARCH & RELATED ACTIVIT
01001415DB NSF RESEARCH & RELATED ACTIVIT
01001516DB NSF RESEARCH & RELATED ACTIVIT
Program Reference Code(s): 5946, 7925, 8086, 9251
Program Element Code(s): 729800, 748400, 801300
Award Agency Code: 4900
Fund Agency Code: 4900
Assistance Listing Number(s): 47.070

ABSTRACT

This work will advance human-robot partnerships by establishing a new concept called complementary situational awareness (CSA): the simultaneous perception and use of the environment and its operational constraints during task execution. CSA is transformative because it ushers in a new era of human-robot partnerships in which robots act as our partners not only in manipulation but also in perception and control. This research will establish the foundations for CSA to enable multifaceted human-robot partnerships. Three main research objectives guide this effort: 1) Real-time Sensing during Task Execution: design low-level control algorithms that provide wire-actuated or flexible continuum robots with sensory awareness by supporting force sensing, exploration, and modulated force interaction in flexible, unstructured environments; 2) Situational Awareness Modeling: prescribe information fusion and simultaneous localization and mapping (SLAM) algorithms suitable for surgical planning and in-vivo surgical plan adaptation; 3) Telemanipulation Based on CSA: design, construct, and integrate robotic testbeds with telemanipulation algorithms that use SLAM and exploration data for online adaptation of assistive telemanipulation virtual fixtures. This research also investigates previously unaddressed questions on how sensory exploration and palpation data can enable online adaptation of assistive virtual fixtures based on force and stiffness data while also accounting for preoperative data and intraoperative correction of registration parameters.

The proposed work will restore to minimally invasive surgery the situational awareness readily available in open surgery. This will benefit patients by enabling core technologies for effective and safe natural orifice surgery or single port access surgery. The societal impact of the proposed work on these two surgical paradigms includes reduced pain for patients, shorter hospital stays, improved cosmesis and patients' self-image, and lower costs. We also believe that CSA will impact manufacturing, whose future will require people and robots working together in a shared space on collaborative tasks. The same concepts of CSA apply to telemanipulation in constrained and unstructured environments, so the proposed research has direct relevance to robot-human partnerships for space exploration. To ensure this broader impact is achieved, an advisory board of experts from medicine, manufacturing, and aerospace has been assembled. Finally, the PIs will facilitate collaboration in the medical robotics research community by making their software and hardware designs available online and by using commercial-grade hardware available at multiple institutions.

PUBLICATIONS PRODUCED AS A RESULT OF THIS RESEARCH

(Showing a subset of the 64 publications)
Andrea Bajo, Nabil Simaan "Hybrid Motion/Force Control of Multi-Backbone Continuum Robots" International Journal of Robotics Research , v.35 , 2016 , p.422 10.1177/0278364915584806
P. Chalasani, L. Wang, R. Roy, N. Simaan, R. H. Taylor, and M. Kobilarov "Concurrent Nonparametric Estimation of Organ Geometry and Tissue Stiffness Using Continuous Adaptive Palpation" IEEE International Conference on Robotics and Automation , 2016 , p.4164-4171
E. Ayvali, A. Srivatsan, L. Wang, R. Roy, N. Simaan, and H. Choset "Using Bayesian Optimization to Guide Probing of a Flexible Environment for Simultaneous Registration and Stiffness Mapping" IEEE International Conference on Robotics and Automation (ICRA 2016) , v.931 , 2016
Rangaprasad Arun Srivatsan, Gillian T. Rosen, D. Feroze Naina Mohamed, and Howie Choset "Estimating SE(3) elements using a dual quaternion based linear Kalman filter" Robotics: Science and Systems XII , 2016
R. Roy, L. Wang, and N. Simaan "Investigation of effects of dynamics on intrinsic wrench sensing in continuum robots" IEEE International Conference on Robotics and Automation (ICRA 2015) , 2015 , p.2052
L. Wang, Z. Chen, P. Chalasani, J. Pile, P. Kazanzides, R. H. Taylor, and N. Simaan "Updating Virtual Fixtures from Exploration Data in Force-Controlled Model-Based Telemanipulation" ASME 2016 International Design Engineering Technical Conferences & Computers and Information in Engineering Conference , 2016
Z. Chen, A. Malpani, P. Chalasani, A. Deguet, P. Kazanzides, and R. H. Taylor "Virtual Fixture Assistance for Needle Passing and Knot Tying" IEEE/RSJ International Conference on Intelligent Robots and Systems , 2016 , p.2343-50 10.1109/IROS.2016.7759365
Arun Srivatsan Rangaprasad and Howie Choset "Multiple Start Branch and Prune Filtering Algorithm for Nonconvex Optimization" The 12th International Workshop on The Algorithmic Foundations of Robotics , 2016
Arun Srivatsan Rangaprasad, Mengyun Xu, Nicolas Zevallos, and Howie Choset "Bingham Distribution-Based Linear Filter for Online Pose Estimation" Robotics: Science and Systems , 2017

PROJECT OUTCOMES REPORT

Disclaimer

This Project Outcomes Report for the General Public is displayed verbatim as submitted by the Principal Investigator (PI) for this award. Any opinions, findings, and conclusions or recommendations expressed in this Report are those of the PI and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content.

When operators telemanipulate robotic devices, they are hampered by perception barriers that limit their situational awareness. A surgeon telemanipulating a surgical robot has a limited understanding of the surgical scene and of the robot's interaction with the anatomy. The robotics research community has dealt with these challenges by focusing on ways of providing force feedback to surgeons and by providing assistive control laws (called virtual fixtures) that superimpose a safety barrier or help the surgeon follow a desired path while avoiding critical anatomy. These solutions are limited by their reliance on medical image registration (the process by which pre-operative anatomy images are related to the intraoperative scene). Moreover, they have relied predominantly on pre-operative geometric definitions of a surgical plan to construct the assistive virtual fixtures.
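
To make the idea of a virtual fixture concrete, the following sketch (illustrative only; the function name, compliance gain, and numbers are assumptions rather than this project's implementation) shows a simple guidance fixture in Python that preserves motion along a desired path while attenuating motion that strays from it:

# Illustrative guidance virtual fixture (not the project's code).
import numpy as np

def guidance_virtual_fixture(v_cmd, path_tangent, compliance=0.2):
    """Pass the commanded velocity along the path tangent; scale the
    off-path component by `compliance` (0 = rigid fixture, 1 = no assistance)."""
    t = path_tangent / np.linalg.norm(path_tangent)  # unit tangent of desired path
    v_along = np.dot(v_cmd, t) * t                   # allowed (guided) component
    v_off = v_cmd - v_along                          # penalized component
    return v_along + compliance * v_off

# Example: the operator pushes diagonally; most of the stray y motion is damped.
print(guidance_virtual_fixture(np.array([1.0, 0.5, 0.0]), np.array([1.0, 0.0, 0.0])))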

Figure 1 shows the approach followed in this collaborative research along with some scenarios where surgical perception is lacking. In our approach, the robot is used both for manipulation augmentation and for perception augmentation by fusing intraoperative sensory data and imaging (e.g., tissue stiffness, computer vision) with preoperative models and images of the anatomy. Figures 1b and 1c show scenarios where such perception and situational awareness augmentation would be critical for safe operation. In both scenarios, the robot can have multiple contacts outside the surgeon's visual field of view; safe operation requires a robot that can discern such contacts and a high-level controller that uses this information to adapt the telemanipulation behaviors.
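
As a minimal illustration of the geometric part of this fusion (assumed for exposition; this is the classical SVD-based rigid alignment, not necessarily the registration method used in the project), the sketch below aligns intraoperatively probed surface points with corresponding points on a preoperative model. Point correspondences are assumed known here, whereas in practice they must be estimated:

# Illustrative rigid registration of probed points to a preoperative model.
import numpy as np

def register_rigid(probed, model):
    """Return R, t minimizing sum ||R @ p_i + t - m_i||^2 over corresponding points."""
    p_mean, m_mean = probed.mean(axis=0), model.mean(axis=0)
    H = (probed - p_mean).T @ (model - m_mean)       # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = m_mean - R @ p_mean
    return R, t

# Example: recover a known 30-degree rotation and a small translation.
rng = np.random.default_rng(1)
probed = rng.standard_normal((20, 3))
c, s = np.cos(np.deg2rad(30)), np.sin(np.deg2rad(30))
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
model = probed @ R_true.T + np.array([0.01, -0.02, 0.03])
R_est, t_est = register_rigid(probed, model)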

The concept of robot situational awareness was investigated as part of a paradigm in which intraoperative sensory information informs the update of a surgical plan and its corresponding virtual fixtures. In addition to using geometry, force-controlled palpation and exploration of the anatomy were explored and demonstrated to allow adaptive surgical plans. Figure 2 shows a robot using a force-controlled scan of a mock organ to account for organ deformation relative to a pre-operative model of the organ. Using such intraoperative information, advanced statistical methods were used to fuse intraoperative sensing with preoperative information to improve the computer's model of the patient's anatomy and the surgical plan. Figure 3 shows a result of force-controlled exploration in which an organ stiffness map is generated and used together with geometry to inform the process of updating the model of the anatomy. Figure 4 shows steps in an efficient method for real-time stiffness mapping during telemanipulation of the robot. The method allows interactive-rate annotation of the anatomy model with stiffness information, which can be used to identify possible tumor locations, as in Figure 3, or a hidden artery, as in Figure 4.
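
For intuition only (the project's estimators were more sophisticated, e.g., continuous adaptive palpation with nonparametric models; the numbers below are made up), a stiffness map can be thought of as a least-squares force-versus-indentation slope attached to each probed surface point:

# Illustrative local stiffness estimate from palpation data (synthetic numbers).
import numpy as np

def local_stiffness(depths_m, forces_n):
    """Fit f = k * d through the origin; return stiffness k in N/m."""
    d = np.asarray(depths_m, dtype=float)
    f = np.asarray(forces_n, dtype=float)
    return float(d @ f / (d @ d))

# One palpation site probed at three indentation depths.
k = local_stiffness([0.001, 0.002, 0.003], [0.12, 0.25, 0.35])
stiffness_map = {(0.010, 0.020, 0.000): k}  # surface point (m) -> stiffness (N/m)
print(stiffness_map)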

These tools enabled a rigorous exploration of new hybrid assistive telemanipulation frameworks that allow the robot's high-level controller to specify behaviors in which the robot controls motion or regulates force while allowing the user to telemanipulate the robot tip for remote palpation. Figure 5 shows a subset of these conditions. Automated and semi-automated telemanipulation with superimposed end-effector excitations have been developed to allow the high-level controller to discern information comparable to that obtained during palpation. Ways of relaying this information to users have also been explored through a user study aimed at determining the potential benefits of these approaches.
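
The general structure of such a hybrid scheme can be sketched as a textbook selection-matrix force/position law (the gains, axes, and function below are illustrative assumptions, not the project's controller):

# Illustrative selection-matrix hybrid force/motion command (not the project's code).
import numpy as np

def hybrid_command(v_teleop, f_meas, f_des, force_axes, kf=0.005):
    """Follow the operator's motion in some Cartesian directions while
    regulating a desired contact force in the remaining directions."""
    S_f = np.diag(force_axes)          # 1 on force-regulated axes
    S_m = np.eye(3) - S_f              # complementary motion-controlled axes
    v_force = kf * (f_des - f_meas)    # simple proportional force regulator
    return S_m @ v_teleop + S_f @ v_force

# Example: regulate 1 N along z while the operator scans the surface in x-y.
print(hybrid_command(np.array([0.01, 0.0, 0.0]),
                     np.array([0.0, 0.0, 0.4]),
                     np.array([0.0, 0.0, 1.0]),
                     force_axes=[0, 0, 1]))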

We also focused on sensing using continuum robots to assist with surgical perception. To achieve this, a new approach to modeling force and motion transmission losses was developed. It was shown that the high-level controller can use these modeling techniques to allow assistive palpation behaviors that support force-controlled updates of the virtual fixture model and to regulate force while carrying out ablation and knot-tying tasks. Figure 6 shows one of our robotic platforms used to test the new sensing and control approaches. The figure shows a new approach for hybrid force/motion control using estimation of tip forces and two sample use scenarios (force-regulated knot tying and ablation).
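
The idea behind such intrinsic (sensor-less) wrench estimation can be sketched generically as follows (synthetic numbers; the loss model here is just a placeholder vector, not the transmission-loss model developed in the project): after subtracting the modeled actuation effort, the residual actuator forces are mapped to an external tip wrench through the transpose of a kinematic Jacobian and solved in a least-squares sense:

# Illustrative intrinsic tip-wrench estimation from actuation-force residuals.
import numpy as np

def estimate_tip_wrench(J_task, tau_measured, tau_model):
    """Solve J_task.T @ w = tau_measured - tau_model for the 6-DOF wrench w."""
    residual = tau_measured - tau_model
    w, *_ = np.linalg.lstsq(J_task.T, residual, rcond=None)
    return w

# Example with a synthetic 6 x 7 task-space Jacobian and a known wrench.
rng = np.random.default_rng(0)
J_task = rng.standard_normal((6, 7))                     # maps actuator rates to tip twist
w_true = np.array([0.5, -0.2, 1.0, 0.0, 0.01, 0.0])      # forces (N) and moments (Nm)
tau = J_task.T @ w_true + 0.02 * rng.standard_normal(7)  # noisy actuator forces
print(estimate_tip_wrench(J_task, tau, np.zeros(7)))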

These contributions will facilitate future development of human-robot cooperative systems with applications in surgery, space robotics, search and rescue, and potentially robot-worker collaboration in manufacturing.

Other broader impacts of this award include support for the training of two post-docs, eight Ph.D. students, and six undergraduate students. Thirty-nine archival publications were published in national and international conference proceedings and journals. Four Ph.D. dissertations were published. In addition, 37 high school female students received STEM and robotics training in three winter classes, each spanning three weeks. Seven Ph.D.s and two postdocs trained on this program have joined industry research groups working on medical robotics and human-robot collaboration. One of the Ph.D. students trained on this award started a tenure-track faculty position in the U.S.

The project's public page is http://nri-csa.vuse.vanderbilt.edu/joomla/, where public data sets and computer code related to this project are also available.


Last Modified: 12/22/2020
Modified by: Russell H Taylor
