
MINA: A Multitasking Intelligent Nurse Aid Robot

Harish Ram Nambiappan, The University of Texas at Arlington, United States, harishram.nambiappan@gmail.com
Krishna Chaitanya Kodur, The University of Texas at Arlington, United States, kck8298@mavs.uta.edu
Maria Kyrarini, The University of Texas at Arlington, United States, maria.kyrarini@uta.edu
Fillia Makedon, The University of Texas at Arlington, United States, makedon@uta.edu
Nicholas Gans, The University of Texas at Arlington, United States, nick.gans@uta.edu

Nurses have to carry out a myriad of tasks, which is one of the main reasons for burnout. In this paper, a robotic Multitasking Intelligent Nurse Aid (MINA) is proposed to assist nurses with everyday tasks and help tackle nurse burnout. MINA uses Simultaneous Localization and Mapping (SLAM) with an RGB-Depth camera to build a pointcloud map of the shelves in a storage room. The barcode of an item is first detected using the zbar library, and the corresponding pointcloud is then generated in ROS and saved. Preliminary experiments on creating a pointcloud map with barcode detection are presented. The preliminary results show a 100% barcode detection rate, and for each barcode detection a corresponding pointcloud map is generated and saved automatically without any intervention.

CCS Concepts: Computer systems organization → Robotic autonomy;

Keywords: nursing robots, service robots, hospital, intelligent systems

ACM Reference Format:
Harish Ram Nambiappan, Krishna Chaitanya Kodur, Maria Kyrarini, Fillia Makedon, and Nicholas Gans. 2021. MINA: A Multitasking Intelligent Nurse Aid Robot. In The 14th PErvasive Technologies Related to Assistive Environments Conference (PETRA 2021), June 29-July 2, 2021, Corfu, Greece. ACM, New York, NY, USA, 2 pages. https://doi.org/10.1145/3453892.3461010

1 INTRODUCTION

Nurses play vital roles in healthcare and make up the largest portion of health professionals. Nurses provide and coordinate patient care and educate patients and the public about health conditions. The US Bureau of Labor Statistics projects that the employment of nurses will grow 12% from 2018 to 2028 [1]. Growth will occur for a number of reasons, including an increased emphasis on primary and preventive care, growing rates of chronic conditions, and demand for healthcare services from an aging population. Safe, high-quality healthcare is connected to the physical and mental wellbeing of healthcare personnel, including nurses [3]. A study conducted on 53,846 nurses from six countries (including the U.S.) showed a relationship between nurse burnout and ratings of care quality [8]. Nurses are overburdened and stressed; they work long hours and often suffer from lack of sleep, a poor diet, and a feeling that they have to do everything themselves [2]. In the current COVID-19 pandemic in particular, nurse workload has increased, and nurses are under stress and pressure due to the need to provide intensive care to COVID-19 patients and longer working hours [6]. At the same time, hospital management and other healthcare facilities are obligated and expected to ensure the safety of their patients and minimize medical errors, which are the third leading cause of death in the US [7]. Enhancing the future work of nurses is vital to ensuring the quality of healthcare and the nation's economic and social well-being. Pervasive technologies offer the potential to improve the work of tomorrow's nurses in multiple ways, including improving the productivity, efficacy, occupational safety, and quality of nurse work life [5].

2 PROPOSED SYSTEM

We envision the future of work for nurses to be enhanced by a Multitasking Intelligent Nurse Aid (MINA) robotic system. In this work, the focus is on using the MINA robotic system in a medical supply scenario, where the robot fetches and delivers medical items. The robot's operation in this scenario can be divided into the following steps: (1) navigating to the supply shelf once the robot receives a request, (2) fetching the required items from the supply shelf, and (3) delivering the fetched items to the required destination. The functions of the robotic system are implemented with the Robot Operating System (ROS). For the navigation of MINA, we focus especially on the navigation of the mobile base of the integrated robot. With respect to fetching items, the focus is on creating a pointcloud map of an experimental supply shelf annotated with barcode and object locations, which is then used for picking the required items from the shelf. We use Simultaneous Localization and Mapping (SLAM) [4] to create a pointcloud map of the supply shelf with the barcode information and object locations mapped into it. The pointcloud map is created using an RGB-Depth (RGB-D) camera. SLAM is mostly used to map the surrounding environment for the purpose of navigation; in this work, however, we develop a novel approach by creating a pointcloud map of a static shelf together with object locations and barcode information. Figure 1a illustrates the first version of the MINA system, which consists of the following parts: (I) the omni-directional mobile base Ridgeback, developed by Clearpath, to enable navigation through complex hospital environments, (II) the robotic arm Panda, developed by Franka Emika, to enable manipulation of several objects, and (III) cameras and laser scanners to enable the robotic platform to “see” and track supplies.
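To make the idea of a barcode-annotated shelf map concrete, the following Python sketch shows one possible way to associate each detected barcode with a location derived from the pointcloud patch saved around it. The ShelfMap and ShelfItem classes and their fields are illustrative assumptions for this concept, not the actual MINA implementation.

```python
# Illustrative sketch only: associating detected barcodes with locations in the
# shelf pointcloud map. Class and field names are hypothetical.
from dataclasses import dataclass
from typing import Dict, Tuple

import numpy as np


@dataclass
class ShelfItem:
    barcode: str                            # e.g., an EAN-13 code printed below the item
    centroid: Tuple[float, float, float]    # item location in the map frame (meters)


class ShelfMap:
    """Lookup table from barcode to item location, built during mapping."""

    def __init__(self):
        self.items: Dict[str, ShelfItem] = {}

    def add_detection(self, barcode: str, local_cloud: np.ndarray) -> None:
        # local_cloud: Nx3 points saved around the detected barcode.
        centroid = tuple(local_cloud.mean(axis=0))
        self.items[barcode] = ShelfItem(barcode, centroid)

    def lookup(self, barcode: str) -> ShelfItem:
        # A later fetch step could use this to drive the arm toward the item.
        return self.items[barcode]
```

In this sketch the item location is approximated by the centroid of the saved pointcloud patch; any other representation (e.g., a bounding box or grasp pose) could be stored in the same lookup structure.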

3 EXPERIMENTAL RESULTS

Experimental Setup: For the preliminary experiment, the Intel RealSense D435i RGB-D camera was used for the detection and recognition of barcodes and the creation of the pointcloud map. EAN-13 barcodes were used in this experiment to replicate the type of barcodes widely used on medical supply shelves. The camera is integrated with ROS to create the pointcloud map. A supply shelf with barcodes that replicates a medical supply shelf was set up, and the camera was placed in front of the shelf to create the pointcloud map, as shown in Figure 1b. The barcodes are placed right below the items. The camera is moved towards an item, and the pointcloud is generated and saved once the barcode is detected using the zbar library. The experiment was repeated 5 times for each of the 15 different EAN-13 barcodes.
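For reference, a minimal ROS 1 node along these lines might look as follows. It assumes the default realsense2_camera topic names and uses pyzbar, a Python binding to the zbar library; the node name, topic names, and output file naming are illustrative assumptions rather than the exact MINA setup.

```python
#!/usr/bin/env python
# Minimal sketch (ROS 1, Python): decode EAN-13 barcodes in the RGB stream and,
# on detection, dump the latest depth pointcloud to disk.
import numpy as np
import rospy
import sensor_msgs.point_cloud2 as pc2
from cv_bridge import CvBridge
from pyzbar import pyzbar                      # Python wrapper around the zbar library
from sensor_msgs.msg import Image, PointCloud2

bridge = CvBridge()
latest_cloud = None                            # most recent PointCloud2 message


def cloud_cb(msg):
    global latest_cloud
    latest_cloud = msg


def image_cb(msg):
    frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
    for barcode in pyzbar.decode(frame):       # detection + recognition in one call
        code = barcode.data.decode("utf-8")
        rospy.loginfo("Detected barcode %s", code)
        if latest_cloud is not None:
            # Keep only XYZ points and save one .npy file per detected barcode.
            pts = np.array(list(pc2.read_points(
                latest_cloud, field_names=("x", "y", "z"), skip_nans=True)))
            np.save("cloud_%s.npy" % code, pts)


if __name__ == "__main__":
    rospy.init_node("barcode_pointcloud_saver")
    # Default topics published by realsense2_camera with the pointcloud enabled.
    rospy.Subscriber("/camera/color/image_raw", Image, image_cb, queue_size=1)
    rospy.Subscriber("/camera/depth/color/points", PointCloud2, cloud_cb, queue_size=1)
    rospy.spin()
```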

Evaluation Metrics: The Barcode Detection Rate (BDR) is based on how many times the system detected the barcode when the camera was positioned in front of it. The Barcode Recognition Rate (BRR) is based on whether the system identified the barcode number correctly after detection. The Pointcloud Generation Rate (PGR) is based on whether the system generated and saved the pointcloud once the barcode was detected. The rates are calculated as follows:

\begin{equation} BDR = \frac{N_b}{N_c} \times 100\%; \quad BRR = \frac{N_r}{N_b} \times 100\%; \quad PGR = \frac{N_p}{N_b} \times 100\% \tag{1} \end{equation}

where $N_b$ is the number of times the system detected a barcode, $N_c$ is the number of times the camera was positioned in front of a barcode, $N_r$ is the number of times the system recognized the correct barcode number, and $N_p$ is the number of times a pointcloud was generated.
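As a quick illustration of Eq. (1), the rates can be computed directly from raw trial counts. The counts below are hypothetical placeholders (apart from the 15 barcodes × 5 repetitions trial total), not the experiment's actual tallies.

```python
# Minimal sketch: computing BDR, BRR, and PGR from raw counts as in Eq. (1).
# The counts are illustrative placeholders, not the experiment's tallies.
def rate(numerator, denominator):
    """Percentage ratio used for all three metrics."""
    return 100.0 * numerator / denominator

N_c = 75   # camera placements in front of a barcode (15 barcodes x 5 trials)
N_b = 75   # barcode detections
N_r = 70   # correct barcode-number recognitions (hypothetical)
N_p = 75   # pointclouds generated and saved

print("BDR = %.1f%%" % rate(N_b, N_c))
print("BRR = %.1f%%" % rate(N_r, N_b))
print("PGR = %.1f%%" % rate(N_p, N_b))
```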

Preliminary Experiment Results: Experiments conducted in the basic supply shelf scenario revealed that the system was able to detect barcodes and simultaneously produce the pointcloud map surrounding the detected area. The system achieved a 100% BDR, 90% BRR, and 100% PGR. The reason for the 90% barcode recognition rate is that, although the system detected the barcode every time, it occasionally recognized it as a different barcode number. The direction and angle at which the camera approached the barcode also affected the recognition of the barcode number. Every time the system detected a barcode, the corresponding pointcloud map was built and saved. Figure 1c shows a resulting pointcloud map produced once a barcode is detected. In the current version of the system, when a barcode is detected, the area surrounding that barcode is saved in the pointcloud map. The pointcloud map shown in Figure 1c was produced and saved when 2 barcodes were detected (the top 2 barcodes shown in Figure 1b). The system is currently set up so that the pointcloud map is only saved once a barcode is detected. The results show that the system automatically builds and saves a pointcloud map as soon as a barcode is detected, without any user intervention.

Figure 1: (a) The MINA robot. (b) A first version of the supply shelf with the RealSense camera. (c) A pointcloud map produced during barcode detection. (Blue lines: reference lines showing the direction in which the camera is placed.)

4 CONCLUSION

In this paper, MINA is proposed as an intelligent assistant to nurses. The main focus is on using MINA as a command-and-fetch robot for retrieving medical supplies. The robot uses SLAM for navigation and builds a pointcloud of the surroundings with an RGB-D camera to locate items by their barcodes. This paper presents work in progress. The preliminary results show that the system achieves a 90% barcode recognition rate and a 100% barcode detection rate. For each barcode detection, the system produces the corresponding pointcloud map and saves it automatically without any user intervention. In the future, the system will be developed further by combining the current process with object-fetching functionality and navigation based on user requests.

REFERENCES

  • U.S. Bureau of Labor Statistics. 2019. Occupational Employment and Wages. https://www.bls.gov/oes/2018/may/oes291141.htm
  • Claire C. Caruso. 2014. Negative impacts of shiftwork and long work hours. Rehabilitation Nursing 39, 1 (2014), 16–25.
  • Louise H. Hall, Judith Johnson, Ian Watt, Anastasia Tsipa, and Daryl B. O'Connor. 2016. Healthcare staff wellbeing, burnout, and patient safety: a systematic review. PLoS ONE 11, 7 (2016), e0159015.
  • IntelRealSense. [n.d.]. SLAM with D435i. https://github.com/IntelRealSense/realsense-ros/wiki/SLAM-with-D435i
  • Maria Kyrarini, Fotios Lygerakis, Akilesh Rajavenkatanarayanan, Christos Sevastopoulos, Harish Ram Nambiappan, Kodur Krishna Chaitanya, Ashwin Ramesh Babu, Joanne Mathew, and Fillia Makedon. 2021. A Survey of Robots in Healthcare. Technologies 9, 1 (2021), 8.
  • Alberto Lucchini, Pasquale Iozzo, and Stefano Bambi. 2020. Nursing workload in the COVID-19 ERA. Intensive & Critical Care Nursing (2020).
  • Martin A. Makary and Michael Daniel. 2016. Medical error—the third leading cause of death in the US. BMJ 353 (2016).
  • Lusine Poghosyan, Sean P. Clarke, Mary Finlayson, and Linda H. Aiken. 2010. Nurse burnout and quality of care: Cross-national investigation in six countries. Research in Nursing & Health 33, 4 (2010), 288–298.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

PETRA 2021, June 29–July 02, 2021, Corfu, Greece

© 2021 Association for Computing Machinery.
ACM ISBN 978-1-4503-8792-7/21/07…$15.00.
DOI: https://doi.org/10.1145/3453892.3461010