A multimodal framework for the evaluation of patients’ weaknesses, supporting the design of customised AAL solutions

Published: 15 September 2022

Abstract

    The recovery of motion abilities after a physical or neurological trauma is a long and winding road that is usually supported by medical staff. In particular, occupational therapists (OTs) play a fundamental role in assessing the performance of patients in daily-life tasks and in suggesting better practices and aids. The goal of OTs is to promote the patients’ abilities, fostering their independence with the minimum necessary supervision. To this end, the possibility of operating remotely while monitoring the subject with a sufficient level of detail is highly desirable. The development of advanced tools for patient monitoring and supervision with limited intrusiveness is a scientific challenge that spans several technological areas: sensing, localisation, tracking, measurement, and business intelligence. The final goal is to provide medical doctors with a set of meaningful metrics related to the patient’s activity. Among them, the occupation of living spaces, motion patterns and posture, as well as fine-grained motion-related parameters (e.g., grasping objects, performing elementary tasks of variable difficulty), can greatly help the medical staff assess the degree of independence of the user and recommend suitable assistive devices, a more appropriate organisation of living spaces, and effective rehabilitation procedures. In this work, we describe a sophisticated setup, developed in close cooperation with medical experts and researchers in both engineering and medicine, to create an augmented-reality physical environment that supports occupational therapists and rehabilitation staff in evaluating their patients’ performance and therapy activities. The proposed system is fully automatic, and user detection and tracking are completely markerless and vision-based. It comprises a set of realistic, highly infrastructured living environments, including sensing and actuation devices of different types, which are thoroughly described in Tables 1 and 2. The collected data are analysed by an automated expert system to extract the requested information, which is then presented in the user interface. The underlying technological complexity is hidden from both patients and medical staff, to guarantee minimum intrusiveness for the former and maximum ease of operation for the latter. During and after the patients’ stay inside the living spaces, the medical personnel can access a large variety of data through a unified interface and is provided with a pre-filled version of the standard patient card, a document that is otherwise filled in by hand. This both reduces the time required for a medical diagnosis and produces a more objective and measurable assessment. A fully operating setup of our system is currently deployed within the living lab AUSILIA (Pisoni et al. 2016).
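
    As an illustration of the markerless, vision-based detection stage mentioned in the abstract, the minimal Python sketch below collects per-frame person detections from a single room camera using OpenCV's off-the-shelf HOG pedestrian detector. It is a simplified stand-in under stated assumptions, not the system's actual pipeline, which relies on a calibrated multi-camera setup and deep models such as YOLOv4 and OpenPose (see the references); the function and variable names are purely illustrative.

    ```python
    # Minimal sketch: markerless person detection on a camera stream using
    # OpenCV's built-in HOG pedestrian detector. This is a generic stand-in,
    # NOT the deployed system's detector (which builds on deep models).
    import cv2

    def collect_person_detections(video_source=0):
        """Return a list of (frame_index, bounding_boxes) for one camera."""
        hog = cv2.HOGDescriptor()
        hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

        cap = cv2.VideoCapture(video_source)
        detections = []
        frame_idx = 0
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            # Each box is (x, y, w, h) in pixels; weights are detector scores.
            boxes, _weights = hog.detectMultiScale(frame, winStride=(8, 8))
            detections.append((frame_idx, boxes))
            frame_idx += 1
        cap.release()
        return detections
    ```

    Per-frame boxes like these would then feed the tracking and higher-level analysis stages (space occupancy, motion patterns) described above.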

    Highlights

    Foster the patients’ independence during and after rehabilitation.
    Promote the patients’ abilities without any direct supervision.
    Operate remotely while monitoring the subject with a sufficient level of detail.
    Provide the medical staff with meaningful metrics related to the patient’s activity.
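
    The pre-filled patient card mentioned in the abstract, and the last highlight above, essentially amount to mapping the automatically measured quantities (occupation of living spaces, posture, completed tasks) into a structured record that the occupational therapist only reviews and annotates. The sketch below shows one possible shape for such a record; every field name is a hypothetical assumption for illustration and does not reflect the actual card format used by the system.

    ```python
    # Hypothetical sketch of a pre-filled patient card built from the
    # automatically extracted metrics. All field names are illustrative
    # assumptions, not the structure of the card used in AUSILIA.
    from dataclasses import dataclass
    from typing import Dict, List

    @dataclass
    class ActivityMetrics:
        room_occupancy_minutes: Dict[str, float]   # e.g. {"kitchen": 12.5}
        posture_distribution: Dict[str, float]     # fraction of time per posture
        completed_tasks: List[str]                 # elementary tasks performed

    @dataclass
    class PatientCard:
        patient_id: str
        session_date: str
        metrics: ActivityMetrics
        therapist_notes: str = ""                  # left for the OT to complete

    def prefill_card(patient_id: str, session_date: str,
                     metrics: ActivityMetrics) -> PatientCard:
        """Build a card pre-filled with measured data; the OT only adds notes."""
        return PatientCard(patient_id, session_date, metrics)
    ```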

    References

    [1]
    Acampora G., Cook D.J., Rashidi P., Vasilakos A.V., A survey on ambient intelligence in healthcare, Proceedings of the IEEE 101 (12) (2013) 2470–2494.
    [2]
    Alemdar H., Ertan H., Incel O.D., Ersoy C., ARAS human activity datasets in multiple homes with multiple residents, in: 2013 7th International Conference on Pervasive Computing Technologies for Healthcare and Workshops, IEEE, 2013, pp. 232–235.
    [3]
    Anastasopoulos M., Niebuhr D., Bartelt C., Koch J., Rausch A., Towards a reference middleware architecture for ambient intelligence systems, in: ACM Conference on Object-Oriented Programming, Systems, Languages, and Applications, San Diego, USA, 2005.
    [4]
    Bochkovskiy A., Wang C.-Y., Liao H.-Y.M., YOLOv4: Optimal speed and accuracy of object detection, 2020, arXiv preprint arXiv:2004.10934.
    [5]
    Bonato P., Wearable sensors and systems, IEEE Engineering in Medicine and Biology Magazine 29 (3) (2010) 25–36.
    [6]
    Butaslac I.I., Luchetti A., Parolin E., Fujimoto Y., Kanbara M., De Cecco M., et al., The feasibility of augmented reality as a support tool for motor rehabilitation, in: International Conference on Augmented Reality, Virtual Reality and Computer Graphics, Springer, 2020, pp. 165–173.
    [7]
    Cao Z., Simon T., Wei S., Sheikh Y., Realtime multi-person 2D pose estimation using part affinity fields, in: 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2017, pp. 1302–1310.
    [8]
    Chen J., Cremer J., Zarei K., Segre A., Polgreen P., Using computer vision and depth sensing to measure healthcare worker-patient contacts and personal protective equipment adherence within hospital rooms, Open Forum Infectious Diseases 3 (2015) ofv200.
    [9]
    Chen S.L., Lee H.-Y., Chen C.-A., Huang H.-Y., Luo C.-H., Wireless body sensor network with adaptive low-power design for biometrics and healthcare applications, IEEE Systems Journal 3 (2010) 398–409.
    [10]
    Chen J., Zhang J., Kam A., Shue L., An automatic acoustic bathroom monitoring system, 2005, pp. 1750–1753.
    [11]
    Clarke P., Marshall V., Black S.E., Colantonio A., Well-being after stroke in Canadian seniors: findings from the Canadian study of health and aging, Stroke 33 (4) (2002) 1016–1021.
    [12]
    Comas-Herrera A., Wittenberg R., Costa-Font J., Gori C., Di Maio A., Patxot C., et al., Future long-term care expenditure in Germany, Spain, Italy and the United Kingdom, Ageing & Society 26 (2) (2006) 285–302.
    [13]
    Davoudi A., Malhotra K.R., Shickel B., Siegel S., Williams S., Ruppert M., et al., Intelligent ICU for autonomous patient monitoring using pervasive sensing and deep learning, Scientific Reports 9 (1) (2019) 8020.
    [14]
    Di Rienzo M., Rizzo F., Parati G., Brambilla G., Ferratini M., Castiglioni P., MagIC system: a new textile-based wearable device for biological signal monitoring. Applicability in daily life and clinical setting, in: Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2005, pp. 7167–7169.
    [15]
    Faber G., Kingma I., Chang C.-C., Dennerlein J., Van Dieen J., Validation of a wearable system for 3D ambulatory L5/S1 moment assessment during manual lifting using instrumented shoes and an inertial sensor suit, Journal of Biomechanics 102 (2020).
    [16]
    Fuglsang L., Hansen A.V., Mergel I., Røhnebæk M.T., Living labs for public sector innovation: An integrative literature review, Administrative Sciences 11 (2) (2021) 58.
    [17]
    Garau N., De Natale F.G.B., Conci N., Fast automatic camera network calibration through human mesh recovery, Journal of Real-Time Image Processing (2020).
    [18]
    Harmo P., Taipalus T., Knuuttila J., Vallet J., Halme A., Needs and solutions - home automation and service robots for the elderly and disabled, 2005, pp. 3201–3206.
    [19]
    Hassan M., Choutas V., Tzionas D., Black M.J., Resolving 3D human pose ambiguities with 3D scene constraints, in: Proceedings of the IEEE International Conference on Computer Vision, 2019, pp. 2282–2292.
    [20]
    Kocabas M., Athanasiou N., Black M.J., VIBE: Video inference for human body pose and shape estimation, in: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, 2020, pp. 5253–5263.
    [21]
    Koch L.v., Wottrich A.W., Holmqvist L.W., Rehabilitation in the home versus the hospital: the importance of context, Disability and Rehabilitation 20 (10) (1998) 367–372.
    [22]
    Lewnard J.A., Lo N.C., Scientific and ethical basis for social-distancing interventions against COVID-19, The Lancet. Infectious Diseases 20 (6) (2020) 631.
    [23]
    López N., Ponce S., Piccinini D., Berenguer E., Roberti M., From hospital to home care: Creating a domotic environment for elderly and disabled people, IEEE Pulse 7 (2016) 38–41.
    [24]
    Ma A.J., Rawat N., Reiter A., Shrock C., Zhan A., Stone A., et al., Measuring patient mobility in the ICU using a novel noninvasive sensor, Critical Care Medicine 45 (4) (2017) 630–636.
    [25]
    Maetzler W., Domingos J., Srulijes K., Ferreira J., Bloem B., Quantitative wearable sensors for objective assessment of Parkinson’s disease, Movement Disorders 28 (2013).
    [26]
    Ortenzi D., Benetazzo F., Ferracuti F., Freddi A., Giantomassi A., Iarlori S., et al., AAL technologies for independent life of elderly people, 11, 2015.
    [27]
    Pacelli M., Loriga G., Taccini N., Paradiso R., Sensing fabrics for monitoring physiological and biomechanical variables: E-textile solutions, in: 2006 3rd IEEE/EMBS International Summer School on Medical Devices and Biosensors, 2006, pp. 1–4.
    [28]
    Pisoni T., Conci N., De Natale F.G., De Cecco M., Nollo G., Frattari A., et al., AUSILIA: Assisted unit for simulating independent living activities, in: 2016 IEEE International Smart Cities Conference (ISC2), IEEE, 2016, pp. 1–4.
    [29]
    Sebastiani, M., Garau, N., De Natale, F., & Conci, N. (2019). Joint Trajectory and Fatigue Analysis in Wheelchair Users. In Proceedings of the IEEE International Conference on Computer Vision Workshops.
    [30]
    Spasova V., Iliev I., A survey on automatic fall detection in the context of ambient assisted living systems, International Journal of Advanced Computer Research 4 (2014) 94.
    [31]
    Stocco M., Luchetti A., Tomasin P., Fornaser A., Ianes P., Guandalini G., et al., Augmented reality to enhance the clinical eye: The improvement of ADL evaluation by mean of a sensors based observation, in: International Conference on Virtual Reality and Augmented Reality, Springer, 2019, pp. 291–296.

            Published In

            Expert Systems with Applications: An International Journal, Volume 202, Issue C
            Sep 2022
            1548 pages

            Publisher

            Pergamon Press, Inc.

            United States

            Author Tags

            1. Expert systems
            2. Computer vision
            3. Real-time
            4. Ambient-assisted living

            Qualifiers

            • Research-article
