Search Results (42)

Search Parameters:
Keywords = child–robot interaction

15 pages, 286 KiB  
Article
Intelligent Agents at School—Child–Robot Interactions as an Educational Path
by Margherita Di Stasio and Beatrice Miotti
Educ. Sci. 2024, 14(7), 774; https://doi.org/10.3390/educsci14070774 - 16 Jul 2024
Viewed by 421
Abstract
The pervasiveness of technologies leads us to talk about a code society. From an educational point of view, coding, computational thinking, and educational robotics are an open possibility. Nevertheless, new elements such as artificial intelligence are rapidly changing educational technology perspectives. In this work, we analyze school policies and theoretical bases in order to understand whether, and under what conditions, coding, computational thinking, and educational robotics still represent the qualifying elements of a framework for digital literacy and digital citizenship. Full article

25 pages, 112880 KiB  
Article
Anthropomorphic Robotic Hand Prosthesis Developed for Children
by Pablo Medina-Coello, Blas Salvador-Domínguez, Francisco J. Badesa, José María Rodríguez Corral, Henrik Plastrotmann and Arturo Morgado-Estévez
Biomimetics 2024, 9(7), 401; https://doi.org/10.3390/biomimetics9070401 - 2 Jul 2024
Viewed by 836
Abstract
The use of both hands is a common practice in everyday life. The capacity to interact with the environment is largely dependent on the ability to use both hands. A thorough review of the current state of the art reveals that commercially available prosthetic hands designed for children are very different in functionality from those developed for adults, primarily due to prosthetic hands for adults featuring a greater number of actuated joints. Many times, patients stop using their prosthetic device because they feel that it does not fit well in terms of shape and size. With the idea of solving these problems, the design of HandBot-Kid has been developed with the anthropomorphic qualities of a child between the ages of eight and twelve in mind. Fitting the features of this age range, the robotic hand has a length of 16 cm, width of 7 cm, thickness of 3.6 cm, and weight of 328 g. The prosthesis is equipped with a total of fifteen degrees of freedom (DOF), with three DOFs allocated to each finger. The concept of design for manufacturing and assembly (DFMA) has been integrated into the development process, enabling the number of parts to be optimized in order to reduce the production time and cost. The utilization of 3D printing technology in conjunction with aluminum machining enabled the manufacturing process of the robotic hand prototype to be streamlined. The flexion–extension movement of each finger exhibits a trajectory that is highly similar to that of a real human finger. The four-bar mechanism integrated into the finger design achieves a mechanical advantage (MA) of 40.33% and a fingertip pressure force of 10.23 N. Finally, HandBot-Kid was subjected to a series of studies and taxonomical tests, including Cutkosky (16 points) and Kapandji (4 points) score tests, and the functional results were compared with some commercial solutions for children mentioned in the state of the art. Full article
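
A quick check on how the two reported figures relate, assuming the mechanical advantage is defined as the ratio of fingertip output force to actuator input force (the listing does not state the definition used): an MA of 40.33% delivering 10.23 N at the fingertip would imply an input force of roughly

$$ F_{\mathrm{in}} \approx \frac{F_{\mathrm{tip}}}{\mathrm{MA}} = \frac{10.23\ \mathrm{N}}{0.4033} \approx 25.4\ \mathrm{N}. $$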

17 pages, 280 KiB  
Article
Reluctance to Authenticity-Imbued Social Robots as Child-Interaction Partners
by Andreja Istenič, Liliya Latypova, Violeta Rosanda, Žiga Turk, Roza Valeeva and Xuesong Zhai
Educ. Sci. 2024, 14(4), 390; https://doi.org/10.3390/educsci14040390 - 9 Apr 2024
Viewed by 1234
Abstract
We are facing the rapid development of educational technology and social robots tested in classrooms. Research has identified teachers’ caution and concerns about these robots’ social skills. Pre-service education is critical for forming beliefs and preparing teachers for the future classroom and innovations in educational technology. In the present study, exploratory factor analysis is applied to examine pre-service teachers’ concerns about social robots’ instructional integration in the role of social agents interacting with children. We apply a concerns scale encompassing the instructional and socio-emotional concerns regarding robots’ instructional integration in the classroom environment. In this study, the scale, which was developed in Slovenia, is examined in the Russian cultural context. Based on the concerns scale, exploratory factor analysis identifies a one-factor solution with five statements (of a six-item factor) shared with the Slovene sample, adding three statements focusing on the importance of the teacher’s role. Russian pre-service teachers share concerns with Slovene pre-service teachers and further highlight the authenticity of unique human relationships and interactions. Slovenian pre-service teachers are more focused on children’s social skills and well-being, while Russian participants give special attention to the teacher’s role and value and believe that it would be wrong to place the robot in a classroom for such a purpose. They do not consider the robot’s human-like interaction skills sufficient for it to be assigned the role of a social agent and interaction partner in the classroom. The inappropriateness of the robot for pedagogical interactions and relationships is the basis of all their concerns. The Kruskal–Wallis test identified the moderate magnitude of the difference between the groups (ε2 = 0.07–0.12), with Russian pre-service teachers presenting the strongest reluctance towards authenticity-imbued social robots in pedagogical roles. The authors emphasize the need to clearly state stakeholders (roboticists, teachers, children, parents) in the research design and their roles in the evaluation of robot implementation. Full article
(This article belongs to the Section Technology Enhanced Education)
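
The listing above reports a Kruskal–Wallis comparison with a rank-based effect size of ε² = 0.07–0.12. As a minimal illustration of how such a statistic is typically computed, the sketch below uses invented concern ratings (not the study's data) and the common definition ε² = H / (n − 1):

```python
# Illustrative only: Kruskal-Wallis H test with an epsilon-squared effect size,
# as reported in the abstract above. The ratings are invented placeholder values.
from scipy.stats import kruskal

slovene_ratings = [3.1, 2.8, 3.5, 2.9, 3.0, 3.3]   # hypothetical concern scores
russian_ratings = [4.2, 4.5, 3.9, 4.4, 4.1, 4.6]

h_stat, p_value = kruskal(slovene_ratings, russian_ratings)
n = len(slovene_ratings) + len(russian_ratings)
epsilon_squared = h_stat / (n - 1)  # one common rank-based effect-size definition

print(f"H = {h_stat:.2f}, p = {p_value:.4f}, epsilon^2 = {epsilon_squared:.2f}")
```
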
21 pages, 946 KiB  
Systematic Review
Roles and Effect of Digital Technology on Young Children’s STEM Education: A Scoping Review of Empirical Studies
by Xinyun Hu, Yuan Fang and Yutong Liang
Educ. Sci. 2024, 14(4), 357; https://doi.org/10.3390/educsci14040357 - 28 Mar 2024
Viewed by 2033
Abstract
Digital technology is increasingly used in STEM education for young children aged 0–8 years. An extensive literature search was conducted using seven databases to systematically investigate the effect of digital technology on young children’s STEM education. Twenty-two eligible articles published from 2010 to 2021 were identified. Results showed that robotics, programming, and multimedia were used to support young children’s STEM education. Digital technology plays different roles in the process of STEM education. Outcomes also showed that digital technology positively affected young children’s STEM education in terms of STEM knowledge or skill acquisition and learning engagement. This was regardless of gender but relevant to age and the learning condition. Participating children and teachers reported high acceptance and satisfaction with the included programs. However, many difficulties, challenges and criticisms were revealed by the extracted data, including how digital technology is used in young children’s STEM education, the nature of young children, the requirements placed upon educators, and different types of adult–child interactions. We also look at the limitations of the study design within included studies and provide recommendations accordingly. Full article

20 pages, 629 KiB  
Article
Lessons in Developing a Behavioral Coding Protocol to Analyze In-the-Wild Child–Robot Interaction Events and Experiments
by Xela Indurkhya and Gentiane Venture
Electronics 2024, 13(7), 1175; https://doi.org/10.3390/electronics13071175 - 22 Mar 2024
Viewed by 935
Abstract
Behavioral analyses of in-the-wild HRI studies generally rely on interviews or visual information from videos. This can be very limiting in settings where video recordings are not allowed or limited. We designed and tested a vocalization-based protocol to analyze in-the-wild child–robot interactions based upon a behavioral coding scheme utilized in wildlife biology, specifically in studies of wild dolphin populations. The audio of a video or audio recording is converted into a transcript, which is then analyzed using a behavioral coding protocol consisting of 5–6 categories (one indicating non-robot-related behavior, and 4–5 categories of robot-related behavior). Refining the code categories and training coders resulted in increased agreement between coders, but only to a level of moderate reliability, leading to our recommendation that it be used with three coders to assess where there is majority consensus, and thereby correct for subjectivity. We discuss lessons learned in the design and implementation of this protocol and the potential for future child–robot experiments analyzed through vocalization behavior. We also perform a few observational behavior analyses from vocalizations alone to demonstrate the potential of this field. Full article
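
The protocol above recommends three coders and majority consensus to correct for subjectivity. A minimal sketch of that consensus step is given below; the category labels are placeholders, not the published coding scheme:

```python
# Illustrative majority-consensus step for three coders, as recommended above.
# Category labels are placeholders, not the published coding scheme.
from collections import Counter

def majority_code(codes):
    """Return the category at least two of the three coders agree on, else None."""
    label, count = Counter(codes).most_common(1)[0]
    return label if count >= 2 else None

# One code per coder for each transcribed vocalization segment.
coder_a = ["robot_directed", "non_robot", "robot_directed"]
coder_b = ["robot_directed", "robot_directed", "non_robot"]
coder_c = ["robot_directed", "non_robot", "non_robot"]

consensus = [majority_code(segment) for segment in zip(coder_a, coder_b, coder_c)]
print(consensus)  # ['robot_directed', 'non_robot', 'non_robot']
```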

13 pages, 4644 KiB  
Article
Voice-Controlled Robotics in Early Education: Implementing and Validating Child-Directed Interactions Using a Collaborative Robot and Artificial Intelligence
by Cristhian A. Aguilera, Angela Castro, Cristhian Aguilera and Bogdan Raducanu
Appl. Sci. 2024, 14(6), 2408; https://doi.org/10.3390/app14062408 - 13 Mar 2024
Viewed by 1141
Abstract
This article introduces a voice-controlled robotic system for early education, enabling children as young as four to interact with robots using natural voice commands. Recognizing the challenges posed by programming languages and robot theory for young learners, this study leverages recent advancements in artificial intelligence, such as large language models, to make robots more intelligent and easier to use. This innovative approach fosters a natural and intuitive interaction between the child and the robot, effectively removing barriers to access and expanding the educational possibilities of robotics in the classroom. In this context, a software pipeline is proposed that translates voice commands into robot actions. Each component is tested using different deep learning models and cloud services to determine their suitability, with the best ones being selected. Finally, the chosen setup is validated through an integration test involving children aged 4 to 6 years. Preliminary results demonstrate the system’s capability to accurately recognize and execute voice commands, highlighting its potential as a valuable educational tool for early education. Full article
(This article belongs to the Section Robotics and Automation)
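
The abstract above describes a pipeline that turns children's voice commands into robot actions with the help of a large language model. The sketch below shows one way such a mapping step could look; the action set, prompt, and model choice are assumptions for illustration, not the authors' implementation:

```python
# Illustrative sketch of a voice-command-to-robot-action mapping step, in the spirit
# of the pipeline described above. The action set, prompt, and model choice are
# assumptions; the paper's actual models and cloud services may differ.
from openai import OpenAI

ALLOWED_ACTIONS = ["move_forward", "move_backward", "turn_left", "turn_right", "wave", "stop"]

client = OpenAI()  # assumes an API key is configured in the environment

def command_to_action(transcript: str) -> str:
    """Map a transcribed voice command from a child to one allowed robot action."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "Reply with exactly one of: " + ", ".join(ALLOWED_ACTIONS)},
            {"role": "user", "content": transcript},
        ],
    )
    action = response.choices[0].message.content.strip()
    return action if action in ALLOWED_ACTIONS else "stop"  # fail safe for unclear requests

# A speech-to-text stage would normally produce the transcript:
print(command_to_action("Can you wave hello to me?"))  # expected: "wave"
```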

13 pages, 261 KiB  
Article
Perspectives of Healthcare Providers to Inform the Design of an AI-Enhanced Social Robot in the Pediatric Emergency Department
by Summer Hudson, Fareha Nishat, Jennifer Stinson, Sasha Litwin, Frauke Zeller, Brittany Wiles, Mary Ellen Foster and Samina Ali
Children 2023, 10(9), 1511; https://doi.org/10.3390/children10091511 - 6 Sep 2023
Cited by 5 | Viewed by 1848
Abstract
Children commonly experience pain and distress in healthcare settings related to medical procedures such as blood tests and intravenous insertions (IVIs). Inadequately addressed pain and distress can result in both short- and long-term negative consequences. The use of socially assistive robotics (SARs) to reduce procedure-related distress and pain in children’s healthcare settings has shown promise; however, the current options lack autonomous adaptability. This study presents a descriptive qualitative needs assessment of healthcare providers (HCPs) in two Canadian pediatric emergency departments (ED) to inform the design of an artificial intelligence (AI)-enhanced social robot to be used as a distraction tool in the ED to facilitate IVIs. Semi-structured virtual individual and focus group interviews were conducted with eleven HCPs. Four main themes were identified: (1) common challenges during IVIs (i.e., child distress and resource limitations), (2) available tools for pain and distress management during IVIs (i.e., pharmacological and non-pharmacological), (3) response to SAR appearance and functionality (i.e., personalized emotional support, adaptive distraction based on child’s preferences, and positive reinforcement), and (4) anticipated benefits and challenges of SAR in the ED (i.e., ensuring developmentally appropriate interactions and space limitations). HCPs perceive AI-enhanced social robots as a promising tool for distraction during IVIs in the ED. Full article
(This article belongs to the Section Pediatric Emergency Medicine & Intensive Care Medicine)
14 pages, 9716 KiB  
Article
Can You Dance? A Study of Child–Robot Interaction and Emotional Response Using the NAO Robot
by Vid Podpečan
Multimodal Technol. Interact. 2023, 7(9), 85; https://doi.org/10.3390/mti7090085 - 30 Aug 2023
Cited by 3 | Viewed by 1888
Abstract
This retrospective study presents and summarizes our long-term efforts in the popularization of robotics, engineering, and artificial intelligence (STEM) using the NAO humanoid robot. By a conservative estimate, over a span of 8 years, we engaged at least a couple of thousand participants: approximately 70% were preschool children, 15% were elementary school students, and 15% were teenagers and adults. We describe several robot applications that were developed specifically for this task and assess their qualitative performance outside a controlled research setting, catering to various demographics, including those with special needs (ASD, ADHD). Five groups of applications are presented: (1) motor development activities and games, (2) children’s games, (3) theatrical performances, (4) artificial intelligence applications, and (5) data harvesting applications. Different cases of human–robot interactions are considered and evaluated according to our experience, and we discuss their weak points and potential improvements. We examine the response of the audience when confronted with a humanoid robot featuring intelligent behavior, such as conversational intelligence and emotion recognition. We consider the importance of the robot’s physical appearance, the emotional dynamics of human–robot engagement across age groups, the relevance of non-verbal cues, and analyze drawings crafted by preschool children both before and after their interaction with the NAO robot. Full article

16 pages, 4203 KiB  
Article
Measuring Engagement in Robot-Assisted Therapy for Autistic Children
by Abeer Al-Nafjan, Noura Alhakbani and Amal Alabdulkareem
Behav. Sci. 2023, 13(8), 618; https://doi.org/10.3390/bs13080618 - 25 Jul 2023
Cited by 4 | Viewed by 1926
Abstract
Children with autism face a range of challenges when it comes to verbal and nonverbal communication. It is essential that children participate in a variety of social, educational, and therapeutic activities to acquire knowledge that is essential for cognitive and social development. Recent studies have shown that children with autism may be interested in playing with an interactive robot. The robot can engage these children in ways that demonstrate and train essential aspects of human interaction, guiding them in therapeutic sessions to practice more complex forms of interaction found in social human-to-human interactions. This study sets out to investigate Robot-Assisted Autism Therapy (RAAT) and the use of artificial intelligence (AI) approaches for measuring the engagement of children during therapy sessions. The study population consisted of five native Arabic-speaking autistic children aged between 4 and 11 years old. The child–robot interaction was recorded by the robot camera and later used for analysis to detect engagement. The results show that the proposed system offers some accuracy in measuring the engagement of children with ASD. Our findings revealed that robot-assisted therapy is a promising field of application for intelligent social robots, especially to support autistic children in achieving their therapeutic and educational objectives. Full article
(This article belongs to the Special Issue Training and Education in Children with Autism)

16 pages, 2118 KiB  
Article
In-the-Wild Affect Analysis of Children with ASD Using Heart Rate
by Kamran Ali, Sachin Shah and Charles E. Hughes
Sensors 2023, 23(14), 6572; https://doi.org/10.3390/s23146572 - 21 Jul 2023
Cited by 1 | Viewed by 1413
Abstract
Recognizing the affective state of children with autism spectrum disorder (ASD) in real-world settings poses challenges due to the varying head poses, illumination levels, occlusion and a lack of datasets annotated with emotions in in-the-wild scenarios. Understanding the emotional state of children with ASD is crucial for providing personalized interventions and support. Existing methods often rely on controlled lab environments, limiting their applicability to real-world scenarios. Hence, a framework that enables the recognition of affective states in children with ASD in uncontrolled settings is needed. This paper presents a framework for recognizing the affective state of children with ASD in an in-the-wild setting using heart rate (HR) information. More specifically, an algorithm is developed that can classify a participant’s emotion as positive, negative, or neutral by analyzing the heart rate signal acquired from a smartwatch. The heart rate data are obtained in real time using a smartwatch application while the child learns to code a robot and interacts with an avatar. The avatar assists the child in developing communication skills and programming the robot. In this paper, we also present a semi-automated annotation technique based on facial expression recognition for the heart rate data. The HR signal is analyzed to extract features that capture the emotional state of the child. Additionally, in this paper, the performance of a raw HR-signal-based emotion classification algorithm is compared with a classification approach based on features extracted from HR signals using discrete wavelet transform (DWT). The experimental results demonstrate that the proposed method achieves comparable performance to state-of-the-art HR-based emotion recognition techniques, despite being conducted in an uncontrolled setting rather than a controlled lab environment. The framework presented in this paper contributes to the real-world affect analysis of children with ASD using HR information. By enabling emotion recognition in uncontrolled settings, this approach has the potential to improve the monitoring and understanding of the emotional well-being of children with ASD in their daily lives. Full article
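
As an illustration of the feature-extraction approach compared in the abstract above, the sketch below computes discrete-wavelet-transform subband statistics from a synthetic heart-rate window; the wavelet, decomposition level, and window length are assumptions, not the paper's settings:

```python
# Illustrative DWT feature extraction from a heart-rate window (synthetic values).
import numpy as np
import pywt

rng = np.random.default_rng(0)
hr_window = 90 + 10 * np.sin(np.linspace(0, 3 * np.pi, 32)) + rng.normal(0, 2, 32)  # bpm

coeffs = pywt.wavedec(hr_window, wavelet="db4", level=2)  # [approx, detail-2, detail-1]

features = []
for band in coeffs:
    features.extend([band.mean(), band.std(), np.abs(band).max()])

# These per-subband statistics would then feed a positive/negative/neutral classifier.
print(len(features), "features:", np.round(features, 2))
```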

13 pages, 4197 KiB  
Article
Heart Rate as a Predictor of Challenging Behaviours among Children with Autism from Wearable Sensors in Social Robot Interactions
by Ahmad Qadeib Alban, Ahmad Yaser Alhaddad, Abdulaziz Al-Ali, Wing-Chee So, Olcay Connor, Malek Ayesh, Uvais Ahmed Qidwai and John-John Cabibihan
Robotics 2023, 12(2), 55; https://doi.org/10.3390/robotics12020055 - 1 Apr 2023
Cited by 8 | Viewed by 2780
Abstract
Children with autism face challenges with various skills (e.g., communication and social skills) and they exhibit challenging behaviours. These behaviours pose difficulties for their families, therapists, and caregivers, especially during therapy sessions. In this study, we investigated the potential of several machine learning techniques and data modalities, acquired using wearable sensors from children with autism during their interactions with social robots and toys, to detect challenging behaviours. Each child wore a wearable device that collected data. Video annotations of the sessions were used to identify the occurrence of challenging behaviours. Extracted time features (mean, standard deviation, min, and max) in conjunction with four machine learning techniques were considered to detect challenging behaviours. Heart rate variability (HRV) changes were also investigated in this study. The XGBoost algorithm achieved the best performance (an accuracy of 99%). Additionally, physiological features outperformed the kinetic ones, with heart rate being the main contributing feature in the prediction performance. One HRV parameter (RMSSD) was found to correlate with the occurrence of challenging behaviours. This work highlights the importance of developing tools and methods to detect challenging behaviours among children with autism during aided sessions with social robots. Full article
(This article belongs to the Section Humanoid and Human Robotics)
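
The abstract above reports time-domain features (mean, standard deviation, min, max) from wearable-sensor windows classified with XGBoost. The sketch below illustrates that pipeline on synthetic data; the signal values, window length, and hyperparameters are placeholders, not the study's configuration:

```python
# Illustrative windowed feature extraction plus XGBoost classification on synthetic data.
import numpy as np
from xgboost import XGBClassifier

rng = np.random.default_rng(0)

def window_features(window: np.ndarray) -> list:
    """Time-domain features per sensor window: mean, standard deviation, min, max."""
    return [window.mean(), window.std(), window.min(), window.max()]

# 200 synthetic heart-rate windows; label 1 marks a simulated challenging-behaviour episode.
labels = rng.integers(0, 2, size=200)
windows = [rng.normal(100 + 15 * y, 5, size=50) for y in labels]
X = np.array([window_features(w) for w in windows])

clf = XGBClassifier(n_estimators=50, max_depth=3, eval_metric="logloss")
clf.fit(X[:150], labels[:150])
print("held-out accuracy:", (clf.predict(X[150:]) == labels[150:]).mean())
```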

29 pages, 10303 KiB  
Article
Child–Robot Interactions Using Educational Robots: An Ethical and Inclusive Perspective
by Marta I. Tarrés-Puertas, Vicent Costa, Montserrat Pedreira Alvarez, Gabriel Lemkow-Tovias, Josep M. Rossell and Antonio D. Dorado
Sensors 2023, 23(3), 1675; https://doi.org/10.3390/s23031675 - 3 Feb 2023
Cited by 3 | Viewed by 3168
Abstract
The Qui-Bot H2O project involves developing four educational sustainable robots and their associated software. Robots are equipped with HRI features such as voice recognition and color sensing, and they possess a humanoid appearance. The project highlights the social and ethical aspects of robotics applied to chemistry and industry 4.0 at an early age. Here, we report the results of an interactive study that involved 212 students aged within the range of 3–18. Our educational robots were used to measure the backgrounds, impact, and interest of students, as well as their satisfaction after interacting with them. Additionally, we provide an ethical study of the use of these robots in the classroom and a comparison of the interactions of humanoid versus non-humanoid educational robots observed in early childhood learning. Our findings demonstrate that these robots are useful in teaching technical and scientific concepts in a playful and intuitive manner, as well as in increasing the number of girls who are interested in science and engineering careers. In addition, major impact measures generated by the project within a year of its implementation were analyzed. Several public administrations in the area of gender equality endorsed and participated in the Qui-Bot H2O project in addition to educational and business entities. Full article

18 pages, 4295 KiB  
Article
Deep Learning-Based Cost-Effective and Responsive Robot for Autism Treatment
by Aditya Singh, Kislay Raj, Teerath Kumar, Swapnil Verma and Arunabha M. Roy
Drones 2023, 7(2), 81; https://doi.org/10.3390/drones7020081 - 23 Jan 2023
Cited by 65 | Viewed by 5567
Abstract
Recent studies state that, for a person with autism spectrum disorder, learning and improvement are often seen in environments where technological tools are involved. A robot is an excellent tool to be used in therapy and teaching. It can transform teaching methods, not just in classrooms but also in in-house clinical practice. With the rapid advancement of deep learning techniques, robots have become more capable of handling human behaviour. In this paper, we present a cost-efficient, socially designed robot called ‘Tinku’, developed to assist in teaching children with special needs. ‘Tinku’ is low cost but full of features and is able to produce human-like expressions. Its design is inspired by the widely accepted animated character ‘WALL-E’. Its capabilities include offline speech processing and computer vision (using light object detection models such as Yolo v3-tiny and the single-shot detector (SSD)) for obstacle avoidance, non-verbal communication, expressing emotions in an anthropomorphic way, and more. It uses an onboard deep learning technique to localize objects in the scene and uses this information for semantic perception. We have developed several lessons for training using these features; a sample lesson about brushing is discussed to show the robot’s capabilities. Tinku has an appealing appearance, is loaded with features, and manages all of these processes smoothly. It was developed under the supervision of clinical experts, and its conditions for application were taken into account. A small survey on its appearance is also discussed. More importantly, it was tested with young children to assess acceptance of the technology and compatibility in terms of voice interaction, and it helps children with autism using state-of-the-art deep learning models. Autism spectrum disorders are increasingly being identified in today’s world, and studies show that children tend to interact with technology more comfortably than with a human instructor. To meet this demand, we present a cost-effective solution in the form of a robot with a set of common lessons for the training of a child with autism. Full article
(This article belongs to the Topic Artificial Intelligence in Sensors)
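
The listing above mentions lightweight offline object detection (Yolo v3-tiny, SSD) feeding obstacle avoidance and semantic perception. The sketch below shows one common way to run such a model with OpenCV's DNN module; the file names, frame source, and confidence threshold are assumptions, and the model files must be obtained separately:

```python
# Illustrative offline detection with a YOLOv3-tiny model via OpenCV's DNN module.
# Paths and threshold are placeholder assumptions.
import cv2
import numpy as np

net = cv2.dnn.readNetFromDarknet("yolov3-tiny.cfg", "yolov3-tiny.weights")
output_layers = net.getUnconnectedOutLayersNames()

frame = cv2.imread("camera_frame.jpg")  # a robot camera frame in a real deployment
blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
net.setInput(blob)

detections = []
for output in net.forward(output_layers):
    for row in output:                      # [cx, cy, w, h, objectness, class scores...]
        scores = row[5:]
        class_id = int(np.argmax(scores))
        confidence = float(scores[class_id])
        if confidence > 0.5:
            detections.append((class_id, confidence))

print(len(detections), "objects above threshold")  # fed to obstacle avoidance / perception
```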

18 pages, 6857 KiB  
Article
Drawing Interpretation Using Neural Networks and Accessibility Implementation in Mobile Application
by Aura-Loredana Popescu and Nirvana Popescu
Computation 2022, 10(11), 202; https://doi.org/10.3390/computation10110202 - 17 Nov 2022
Cited by 1 | Viewed by 2235
Abstract
This paper continues our previous work on the PandaSays mobile application, whose main purpose is to detect a child’s affective state from their drawings using the MobileNet neural network. Children diagnosed with autism spectrum disorder have difficulties in expressing their feelings and communicating with others. The purpose of the PandaSays mobile application is to help parents and tutors of children diagnosed with autism communicate better with them and understand their feelings. The main goal was to improve the accuracy of the model trained with the MobileNet neural network, which reached 84.583%. The model was trained using the Python programming language. The study further focuses on accessibility and its importance for children diagnosed with autism. Relevant screenshots of the mobile application are presented to show that the application follows accessibility guidelines and rules. Finally, the interaction with the Marty robot and the efficiency of the mobile application’s drawing prediction are presented. Full article
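
The abstract above describes classifying children's drawings with a MobileNet model. A minimal transfer-learning sketch in that spirit is shown below; the class names, image size, and training setup are illustrative assumptions rather than the PandaSays configuration:

```python
# Illustrative MobileNet-based drawing classifier (transfer learning).
# Class count and affective-state labels are placeholders.
import tensorflow as tf

NUM_CLASSES = 4  # e.g. happy / sad / angry / calm (hypothetical affective states)

base = tf.keras.applications.MobileNet(include_top=False, weights="imagenet",
                                       input_shape=(224, 224, 3))
base.trainable = False  # reuse ImageNet features; train only the new classification head

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])

# model.fit(drawing_dataset, epochs=10)  # drawing_dataset: labelled drawing images (not shown)
model.summary()
```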

16 pages, 1630 KiB  
Article
The Use of Social Robots in the Diagnosis of Autism in Preschool Children
by Krzysztof Arent, David J. Brown, Joanna Kruk-Lasocka, Tomasz Lukasz Niemiec, Aleksandra Helena Pasieczna, Penny J. Standen and Remigiusz Szczepanowski
Appl. Sci. 2022, 12(17), 8399; https://doi.org/10.3390/app12178399 - 23 Aug 2022
Cited by 8 | Viewed by 2220
Abstract
The present study contributes to the research problem of applying social robots in autism diagnosis. There is a common belief that existing diagnostic methods for autistic spectrum disorder are not effective. Advances in Human–Robot Interactions (HRI) provide potential new diagnostic methods based on interactive robots. We investigated deficits in turn-taking in preschool children by observing their interactions with the NAO robot during two games: (Dance with me vs. Touch me). We compared children’s interaction profiles with the robot (five autistic vs. five typically developing young children). Then, to investigate turn-taking deficits, we adopted a rating procedure to indicate differences between both groups of children based on an observational scale. A statistical analysis based on ratings of the children’s interactions with the NAO robot indicated that autistic children presented a deficient level of turn-taking behaviors. Our study provides evidence for the potential of designing and implementing an interactive dyadic game between a child and a social robot that can be used to detect turn-taking deficits based on objective measures. We also discuss our results in the context of existing studies and propose guidelines for a robotic-enabled autism diagnosis system. Full article
(This article belongs to the Special Issue Automation Control and Robotics in Human-Machine Cooperation)
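
The abstract above reports a rating-based statistical comparison of five autistic and five typically developing children but does not name the test used. As one plausible illustration, a nonparametric two-group comparison of observational turn-taking ratings could look like the sketch below (the test choice and ratings are assumptions):

```python
# Illustrative nonparametric comparison of turn-taking ratings for two small groups.
# The test choice and rating values are assumptions, not taken from the study.
from scipy.stats import mannwhitneyu

asd_ratings = [1, 2, 1, 2, 1]  # hypothetical turn-taking scores, autistic group
td_ratings = [4, 5, 4, 3, 5]   # hypothetical scores, typically developing group

u_stat, p_value = mannwhitneyu(asd_ratings, td_ratings, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.3f}")
```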
