Advancing Human Detection Technology in Industrial Manufacturing for Enhanced
Human-Robot Collaboration: A Comprehensive Analysis within the Industry 4.0 Framework

Usman Ahmad Usmani (a), Suliana Sulaiman (a), Abdullah Yousuf Usmani (b),
Muhamad Hariz (a), Junzo Watada (c)

(a) Faculty of Computing and Meta Technology, Universiti Pendidikan Sultan Idris, 35900 Tanjong Malim, Perak
(b) Mechanical Engineering, Aligarh Muslim University, Aligarh, Uttar Pradesh 202001, India
(c) Computer Science, Waseda University, 1-chōme-104 Totsukamachi, Shinjuku City, Tokyo 169-8050, Japan

Abstract
Human detection technology in industrial manufacturing for Human Robot Collaboration
(HRC) is pivotal for safety by preventing accidents, optimizing efficiency, and ensuring
compliance with regulations. It fosters worker confidence, cost savings, and technological
progress, making it a cornerstone of modern industrial practices. Current shortcomings
include accuracy and reliability issues, false alarms, adaptability challenges, and cost
barriers, which must be addressed to enhance safety and efficiency. This paper
provides a detailed examination of the integration of human operators into the framework
of Industry 4.0, where advanced automated manufacturing systems redefine their roles. It
conducts a comprehensive analysis of HRC and Interaction HRI within the Industry 4.0
context, highlighting the pivotal significance of 3D spatial awareness for fostering efficient
and secure collaboration between humans and robots in shared workspaces. The paper
extensively explores various technologies and methodologies for human detection, encom-
passing both established approaches and emerging techniques. It unveils the multifaceted
challenges and exciting opportunities entailed in integrating humans into industrial settings,
offering insights for future research and application. In particular, it focuses on the poten-
tial of emerging technologies to reshape the landscape of human integration in advanced
robotics, ultimately contributing to the creation of safer and more efficient manufacturing
environments. This review aims to support the robotics community in advancing human
detection technology in industrial manufacturing for enhanced human-robot collaboration,
discuss identified literature gaps, and suggest future research directions in this area.
Keywords: Human Robot Collaboration (HRC), Industry 4.0 Integration, 3D Spatial
Awareness, Human Detection Technologies, Safety in Manufacturing

Email addresses: usman@meta.upsi.edu.my (Usman Ahmad Usmani), suliana@meta.upsi.edu.my
(Suliana Sulaiman), usmani.fts@gmail.com (Abdullah Yousuf Usmani), mhariz@meta.upsi.edu.my
(Muhamad Hariz), junzo@waseda.jp (Junzo Watada)
May 3, 2024
1. Introduction
The transformation of manufacturing over the past two centuries is a fascinating journey.
Initially, manufacturing was characterized by manual craftsmanship, where skilled artisans
meticulously crafted products by hand. However, as the industrial age dawned, automa-
tion gradually took center stage, introducing mechanical systems and processes that could
perform tasks with speed and precision. This shift laid the foundation for the subsequent
industrial revolutions. Today, we find ourselves immersed in the fourth industrial revolution,
aptly termed Industry 4.0 or smart manufacturing. At the heart of this revolution are Cyber
Physical Systems (CPSs). These are intricate networks of interconnected computational en-
tities that seamlessly bridge the gap between the physical and digital realms. Often referred
to as Cyber Physical Production Systems (CPPSs), these entities have become the backbone
of modern manufacturing [1-4]. Smart manufacturing is underpinned by several key enabling
technologies, each of which plays a pivotal role in driving its evolution. Digital twin technol-
ogy allows for the creation of digital replicas of physical assets, enhancing monitoring and
control [5]. The Internet of Things (IoT) connects devices and systems, enabling data-driven
decision-making. Advanced robotics and artificial intelligence (AI) bring automation and
intelligent decision-making to the forefront [5]. The adoption of these technologies within
Industry 4.0 holds the promise of transformative improvements in manufacturing efficiency
and the creation of innovative business models. By leveraging the power of digital twins,
IoT connectivity, advanced robotics, and AI, manufacturers aim to optimize their operations
and embrace new horizons in production and distribution [6]. This revolution is not only
about efficiency but also about fostering greater connectivity, both horizontally and verti-
cally, through comprehensive systems integration. Manufacturers are on a quest to create
more agile and responsive production ecosystems, enabling them to adapt to rapidly chang-
ing market demands [6]. However, as we journey further into the realm of Industry 4.0 with
advanced automated manufacturing systems, a complex challenge emerges — redefining the
role of human operators within this cyber-physical manufacturing landscape. In the previous
era, Industry 3.0, roles were relatively straightforward. Automated systems, often embodied
by robots, were assigned laborious, repetitive, and potentially hazardous tasks.
In contrast, human operators were tasked with managing more intricate responsibilities
that demanded human cognition and dexterity [7] [8]. Industrial robotic systems, while
instrumental in automating various tasks, introduced inherent risks to human workers. Their
rapid movements, high speeds, and substantial inertia posed serious dangers in terms of
collisions, crush injuries, and pinching accidents. Consequently, safeguards such as physical
barriers and sensing systems (e.g., light curtains, 2D Lidar scanners) became imperative to
ensure the safety of human workers [9]. These safety measures, although essential, added
costs and often reduced manufacturing flexibility. The advent of Industry 4.0 has ushered
in a revolution in the way robots are utilized in manufacturing. Fields like HRI and HRC
have gained prominence and are currently the focus of significant attention [10], [11]. For
humans and robots to collaborate seamlessly within the same workspace, it is imperative that
robots possess the ability to locate humans accurately in three-dimensional space [12]. This
spatial awareness is pivotal to ensure the safety and efficiency of human-robot interactions.

Leveraging smart manufacturing sensing technology, humans and industrial robots can now
safely share the same workspace, working alongside each other [10]. To enable efficient and
safe HRC systems, there is a pressing need for these systems to locally detect the presence of
humans within their vicinity. These human detection systems for HRC can range from simple
object detection using 2D Lidar to more advanced techniques encompassing human detection
and pose estimation using RGB cameras [11]. Emerging technologies, including RGB-D sensors and
wearables, have also played pivotal roles in enhancing human detection capabilities [13].
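As a minimal illustration of the camera-based end of this spectrum, the sketch below runs
OpenCV's pretrained HOG pedestrian detector on an RGB stream. The camera index and the 0.5
confidence threshold are arbitrary illustrative choices, and a production HRC system would
add calibration, tracking, and redundancy on top of such a detector.

    # Sketch: frame-by-frame human detection on an RGB camera feed using
    # OpenCV's pretrained HOG + linear-SVM pedestrian detector.
    import cv2

    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    cap = cv2.VideoCapture(0)  # RGB camera watching the shared workspace (index assumed)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # winStride trades detection density against per-frame latency.
        boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))
        for (x, y, w, h), score in zip(boxes, weights):
            if float(score) > 0.5:  # drop weak detections to limit false alarms
                cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("workspace", frame)
        if cv2.waitKey(1) == 27:  # Esc exits the monitoring loop
            break
    cap.release()
    cv2.destroyAllWindows()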
As robotic systems continue to advance, and with the introduction of mobile robots
(mobots), a growing need arises to understand how humans can be seamlessly integrated
into the manufacturing process on a larger, factory-wide scale. The review contributes to
Industry 4.0 by addressing the critical challenge of redefining human roles in a cyber-physical
manufacturing landscape, emphasizing the importance of seamless HRC for efficiency and
safety. It explores the role of advanced human detection systems, such as RGB-D sensors
and wearables, in enabling HRC, facilitating the integration of humans into manufacturing
processes on a larger, factory-wide scale. This review is essential as it guides the industry
in leveraging emerging technologies to optimize operations, foster connectivity, and adapt
to changing market demands within the fourth industrial revolution. The paper makes
three significant contributions within the context of the manufacturing evolution and the
challenges posed by Industry 4.0:
1. Comprehensive Exploration of Industry 4.0 Integration: The paper exten-
sively explores human integration in Industry 4.0, providing a detailed overview of the core
principles and technologies in smart manufacturing. It dissects the associated challenges and
opportunities, offering valuable insights for researchers, policymakers, and industry profes-
sionals navigating this transformative era.
2. In-Depth Analysis of HRC: The paper’s central contribution lies in its comprehen-
sive analysis of HRC and HRI within Industry 4.0, emphasizing the vital role of 3D spatial
awareness for safe human-robot collaboration. It explores diverse human detection technolo-
gies and safety measures, offering a practical roadmap for researchers and practitioners in
advanced manufacturing, enhancing the efficiency and safety of human-robot partnerships.
3. Insights for Future Research and Application: The paper goes beyond summa-
rizing existing knowledge by highlighting future research and application areas. It explores
emerging technologies like RGB-D sensors, wearables, and mobile robots, providing forward-
looking insights that guide scholars and industry leaders to remain innovative in the evolving
field of human integration in advanced robotics.
This comprehensive review article aims to delve into the intricate facets of integrating hu-
mans into industrial manufacturing environments, particularly in the context of Industry 4.0.
In Section 2, we delve into the details of the search methodology employed in this study.
Section 3 is dedicated to an in-depth exploration of the fundamental principles that underlie
human-robot collaboration. Section 4 critically examines the significance and multifaceted
challenges of involving humans in the evolving landscape of Industry 4.0. Section 5 takes
a deep dive into the technology and methodologies employed for human detection within
the context of smart manufacturing. Section 6 focuses on the concept of Human-centric
Manufacturing, examining its implications, applications, and its role in shaping the future
of industry. Section 7 engages in an extensive discussion of existing literature, highlighting
both the formidable challenges and exciting opportunities presented by the inclusion of hu-
mans in industrial settings. Finally, Section 8 presents the discussion and remarks, and Section 9
summarizes the critical insights gained from this dynamic and ever-evolving field of research.

2. Search Methodology
In conducting this review, we employed a rigorous search methodology to gather relevant
literature and insights. The review aimed to comprehensively assess the state of human
detection technology in the context of industrial manufacturing and its role in facilitating
Human Robot Collaboration (HRC). To commence, we conducted a thorough search on
prominent web-based databases, such as Google Scholar, Scopus, and Web of Science. This
search spanned the timeframe from January 2009 to September 2020. We meticulously
selected search terms that align with the scope of our review, emphasizing the crucial
elements of our investigation. These keywords included "Human Detection Technology,"
"Industrial Manufacturing," "Human Robot Collaboration," and "Human-Robot Interaction
in Manufacturing."
We maintained a focus on relevance, excluding literature that did not align with the
scope of our review. This involved excluding papers related to HRI in industrial processes,
economic evaluations or feasibility studies of disassembly lines, process optimization of disas-
sembly line designs, and the design of tools for HRC applications. Additionally, we excluded
papers that did not meet the criteria for relevance or did not contribute substantially to
the understanding of current human detection technology and its applications in industrial
manufacturing for HRC. Our meticulous screening process resulted in identifying nine publi-
cations that were deemed pertinent to the field of human-robot collaboration in disassembly.
By employing this robust search methodology, our review aims to provide a comprehensive
overview and analysis of autonomous robotic disassembly systems and robotic solutions
used in industrial disassembly tasks. We seek to shed light on the pivotal role of human
detection technology in facilitating HRC within the context of Industry 4.0 and address
the challenges and advancements in this critical area of manufacturing. Figure 1 outlines
the literature review selection process. It follows the PRISMA methodology, systematically
gathering relevant literature from databases like Google Scholar, Scopus, and Web of Science.
Search terms focused on "Human Detection Technology," "Industrial Manufacturing," and
"Human-Robot Collaboration." A rigorous screening process ensured the inclusion of perti-
nent articles, forming the basis for subsequent analysis. This process provides a transparent
and structured approach to understanding human detection technology’s role in industrial
manufacturing for Human-Robot Collaboration. Table 1 presents a summary of the review
methodology and findings focused on human detection technology in industrial manufac-
turing for human-robot collaboration. It outlines the systematic approach employed in the
review process, including the selection of databases, search terms, and inclusion criteria.
Figure 1: Literature Review Selection Process

Additionally, the table provides key findings and insights gleaned from the review, shedding
light on the state of human detection technology, its role in facilitating human-robot collabo-
ration, and the challenges and advancements observed in this critical area of manufacturing.

[Table 1 about here.]

3. Principles of Human–Robot Collaboration


The principles of HRC outline the fundamental guidelines and concepts that govern the
safe and effective interaction between humans and robots in various contexts, including
industrial manufacturing and beyond. Here are some key principles:
1. Safety First: Ensuring the safety of human operators is paramount in the context of
HRC. In industrial manufacturing environments, where humans and robots work together,
it becomes crucial to minimize the risk of collisions and accidents [14][15]. To achieve this,
robots must be equipped with advanced sensors and intelligent systems that can detect the
presence of humans in their proximity. These sensors enable real-time monitoring of the
surroundings, allowing robots to respond swiftly to any human presence, thereby preventing
potential hazards and ensuring a safe working environment for all stakeholders involved.
This critical aspect of safety underscores the significance of human detection technology in
the realm of HRC.
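As a concrete illustration of this "respond swiftly" behaviour, one common pattern is to
scale the robot's commanded speed with the distance to the nearest detected human. The zone
radii and the linear ramp in the sketch below are illustrative assumptions, not values
taken from any safety standard.

    # Sketch: velocity scaling from the distance to the nearest detected human.
    def speed_scale(min_human_distance_m: float) -> float:
        """Return a velocity scaling factor in [0, 1]."""
        STOP_ZONE = 0.5  # m: protective stop inside this radius (assumed)
        SLOW_ZONE = 1.5  # m: reduced speed inside this radius (assumed)
        if min_human_distance_m < STOP_ZONE:
            return 0.0  # halt: human too close
        if min_human_distance_m < SLOW_ZONE:
            # linear ramp between the stop and slow zones
            return (min_human_distance_m - STOP_ZONE) / (SLOW_ZONE - STOP_ZONE)
        return 1.0  # full programmed speed

    # e.g. with a human 1.0 m away the robot runs at half speed:
    assert speed_scale(1.0) == 0.5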
2. Risk Assessment: In the realm of HRC in industrial manufacturing, safeguarding
the well-being of human operators is of utmost importance. Rigorous risk assessments
are indispensable, as they unveil potential hazards and risks in the interaction between
humans and robots. To mitigate these risks effectively, safety measures must be put in
place. These measures range from implementing advanced sensors and automation to create
safety zones, barriers, and emergency stop systems [16][17]. Robust training, adherence to
standard operating procedures, and compliance with regulations further ensure a secure and
productive working environment. The conscientious implementation of these safety measures
not only minimizes risks but also optimizes the efficiency and success of HRC in industrial
settings.
3. Clear Communication: Establishing clear communication channels between hu-
mans and robots is paramount for effective collaboration in industrial settings. These chan-
nels encompass a range of communication modalities, including visual cues, audible warnings,
and feedback mechanisms. Visual cues can include status indicators or LED lights on robots
to signal their operational state or intention. Audible warnings, such as alarms or spoken
instructions, can alert human operators to potential risks or changes in robot behavior.
Feedback mechanisms, like haptic feedback or on-screen displays, enable humans to receive
real-time information about the robot’s actions and status. These communication tools not
only enhance safety by keeping humans informed but also promote efficient coordination and
synergy between humans and robots, ultimately leading to more productive and harmonious
collaboration in the industrial manufacturing environment.
4. Task Allocation: Effective task allocation is a key strategy in optimizing HRC
within industrial manufacturing environments. By leveraging the strengths of both humans
and robots, organizations can maximize efficiency and productivity. Robots excel at repet-
itive and physically demanding tasks, where precision and consistency are critical. These
tasks can include assembly, welding, and material handling. On the other hand, humans
possess cognitive abilities that make them well-suited for complex decision-making, problem-
solving, and adaptability [18][19]. By assigning tasks based on these strengths, HRC not only
streamlines operations but also improves job satisfaction for human workers, who can fo-
cus on tasks that require creativity and critical thinking, while robots handle the routine
and physically demanding aspects of production. This strategic allocation of tasks fosters
a symbiotic relationship between humans and robots, resulting in a more efficient and agile
manufacturing process.
5. Adaptability: In the realm of HRC within industrial manufacturing, adaptability
and flexibility are paramount. HRC systems should be equipped with advanced sensors and
intelligent algorithms that enable robots to dynamically adjust their behavior in response
to the presence and actions of humans. These systems excel at detecting human presence,
tracking movements, and employing predictive algorithms to anticipate and accommodate
human actions. Clear communication channels and the capacity to learn and adapt fur-
ther enhance collaboration, fostering an environment where robots seamlessly complement
human capabilities. This adaptability not only ensures safety by preventing accidents but
also optimizes efficiency, making HRC systems agile and responsive to the ever-evolving
demands of the manufacturing landscape. Table 2 outlines the principles of human-robot
collaboration, encompassing descriptions, challenges, and mitigation strategies. It provides
a comprehensive overview of the fundamental aspects guiding interactions between humans
and robots in various contexts. Additionally, the table highlights common challenges en-
countered in human-robot collaboration and proposes mitigation strategies to address them
effectively. Through this structured presentation, Table 2 serves as a valuable resource for
understanding the dynamics and complexities of human-robot collaboration and fostering
successful partnerships in diverse environments.
[Table 2 about here.]
6. Transparency: Transparency in robot behavior and decision-making is a fundamental
aspect of safe and effective HRC in industrial environments. It entails providing human
operators with a clear insight into how robots process information, make decisions, and plan
actions. This transparency not only enhances trust but also enables human operators to
anticipate and adapt to robot behavior, reducing the risk of unexpected actions or errors.
By fostering an understanding of the robot’s cognitive processes and intentions, organizations
can create a collaborative ecosystem where humans and robots work together seamlessly,
improving overall efficiency, safety, and the quality of work in manufacturing settings.
7. Training and Education: Effective training and education for human operators
and maintenance personnel are pivotal for ensuring the safe and efficient integration of
robots in industrial settings. It is essential to provide comprehensive guidance on working
with robots, encompassing both safe practices and emergency procedures. This training
should cover topics such as understanding the capabilities and limitations of robots, oper-
ating and programming them, and recognizing potential hazards and risks associated with
HRC. Moreover, emergency response protocols should be thoroughly communicated, ensur-
ing that individuals know how to react in urgent situations, including how to stop robot
operations and summon assistance. By equipping personnel with the knowledge and skills
to navigate HRC scenarios safely, organizations can maximize the benefits of automation
while minimizing the associated risks, ultimately fostering a secure and productive working
environment.
8. Continuous Monitoring: The implementation of real-time monitoring in HRC sys-
tems stands as a pivotal pillar of safety and efficiency within industrial environments. This
continuous surveillance, driven by advanced sensor technology and data analytics, serves as a
proactive guardian against anomalies and safety concerns [20][21]. It enables swift responses
to deviations from expected behavior, preventing potential accidents, minimizing downtime,
and ensuring peak performance. Moreover, real-time monitoring supports ongoing improve-
ment efforts by harnessing data insights, fostering a secure and productive synergy between
humans and robots in the dynamic landscape of industrial manufacturing.
9. Collaborative Workspace Design: The design of workspaces in industrial settings
holds a critical role in promoting safe and effective HRC. Employing physical barriers, clearly
designated zones for humans and robots, and ergonomic considerations, these spaces facili-
tate an environment where humans and robots can work together seamlessly. By enhancing
visibility, reducing the risk of collisions, and ensuring ergonomic comfort, such thoughtfully
designed workspaces not only optimize productivity but also prioritize the safety and well-
being of all stakeholders, thereby harnessing the full potential of automation in industrial
manufacturing.
10. Emergency Stop: The installation of emergency stop buttons or mechanisms is a
vital safety provision within HRC environments. These readily accessible controls empower
human operators to instantly and decisively halt robot operations in response to unforeseen
emergencies or hazardous situations [22][23]. This redundancy in control not only underscores
the importance of human safety but also acts as a fail-safe mechanism, preventing potential
accidents, injuries, and damage to equipment. It ensures that human workers retain ultimate
authority and intervention capability, bolstering the overall security and confidence in HRC
systems, and ultimately fostering a safer and more efficient industrial work environment.
11. Feedback and Improvement: Collecting feedback from human operators is a
vital component of continuously improving HRC systems in industrial environments. This
feedback loop enables organizations to identify and address issues or challenges that may
arise during collaboration effectively [24][25]. By actively involving human operators in the
improvement process, it fosters a culture of collaboration and safety. Regular feedback can
reveal insights into system performance, usability, and safety concerns, allowing for timely
adjustments, refinements, and enhancements. This iterative approach not only optimizes
the HRC experience but also contributes to the ongoing evolution of systems that are more
efficient, safer, and better aligned with the needs and expertise of human operators.
12. Compliance with Standards: Ensuring compliance with relevant safety and in-
dustry standards is imperative to guarantee the high level of safety and performance required
in HRC systems within industrial settings. These standards, such as ISO 10218 and ISO
13482, provide essential guidelines and benchmarks for the design, operation, and main-
tenance of HRC systems. Adherence to these standards not only helps prevent accidents
and hazards but also promotes consistency and interoperability across the industry. By
rigorously complying with established safety and performance standards, organizations can
instill confidence in their HRC systems, safeguard human operators, and uphold the reliabil-
ity and effectiveness of these collaborative automation solutions in the demanding landscape
of industrial manufacturing.
13. Ethical Considerations: In non-industrial contexts like healthcare or service
robotics, the ethical implications of HRC systems loom large, demanding careful considera-
tion of privacy and data security. Safeguarding sensitive user information, ensuring confiden-
tiality, and respecting personal boundaries during human-robot interactions are paramount.
Robust data security measures, such as encryption and access controls, must be implemented
to protect against unauthorized access or data breaches. Transparent data handling prac-
tices and informed consent procedures are equally crucial to respect individual rights and
maintain trust, reinforcing the need for responsible and ethical deployment of HRC systems
that prioritize the privacy and dignity of users in these sensitive settings.
14. Human-Centric Design: Prioritizing user-friendly and intuitive design in robots
and interfaces is foundational for the success of HRC systems. Such design principles stream-
line interactions between humans and robots by minimizing complexity, reducing the learn-
ing curve, and aligning robot behavior with human expectations. This approach not only
enhances user acceptance but also empowers operators to work effectively and confidently
alongside robots, ultimately optimizing productivity and harnessing the benefits of automa-
tion across various industries.
15. Human Empowerment: HRC should be designed with the principle of
augmenting human capabilities rather than replacing them. Robots should serve as tools to
empower and enhance human workers, making them more productive and safe in their tasks.
By automating repetitive or physically demanding aspects of work, robots free up human
operators to focus on higher-level tasks that require creativity, problem-solving, and decision-
making. This collaborative approach not only maximizes efficiency but also leverages the
unique strengths of both humans and robots, resulting in a more harmonious and effective
workforce. These principles provide a foundation for the responsible and effective integration
of robots into human environments, ensuring that HRC systems contribute to increased
productivity, safety, and overall well-being. Figure 2 presents a comprehensive architecture
diagram, visually depicting the various modules and the flow of data within the system. The
architecture is designed to illustrate the interconnectedness and functionality of different
components within the system. Each module is represented as a distinct entity, with arrows
indicating the flow of data between them. The diagram provides a clear overview of how
data moves through the system, from input sources to processing modules and ultimately
to output or action. This visual representation helps stakeholders understand how the system operates as a whole.

4. Robot Safety
Different forms or kinds of collisions between robots and humans in industrial settings,
along with their corresponding critical contact force values, can vary based on the specific
circumstances and safety standards. However, here are some general categories of collisions:
1. Unconstrained Impacts: These are unintended collisions between a robot and a
human that occur without any prior safety measures. The critical contact force value for
such impacts may depend on factors like the robot’s speed and the specific application.

[Table 3 about here.]

2. Clamping in the Robot Structure: This involves a human getting caught or clamped
within the moving parts or structure of a robot. The critical contact force value depends on
the design and force-limiting features of the robot.
3. Constrained Impacts: These are collisions that occur when a robot and a human are
interacting within a predefined workspace. The critical contact force value may be defined
based on safety standards to ensure human safety during such interactions.
4. Partially Constrained Impacts: These are situations where there is limited con-
straint on the motion of the robot, but contact with a human may still occur [27][28]. The
critical contact force value would depend on the level of constraint and the potential risks
involved.
Figure 2: A visual representation of the comprehensive architecture, illustrating the distinct modules and
the flow of data. Derived from [26]

5. Secondary Impacts: These refer to additional impacts that may occur as a result
of the initial collision between a robot and a human. The critical contact force values for
secondary impacts would be considered in the context of overall safety.
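Since each of these collision categories is characterized by a critical contact force, the
simplest form of safety supervision is a per-body-region force check, in the spirit of the
force and pressure limits discussed for ISO/TS 15066 below. The limit values in this sketch
are placeholders for illustration only, not figures taken from the standard.

    # Sketch: checking a measured contact force against per-body-region limits.
    FORCE_LIMITS_N = {  # hypothetical quasi-static force limits (placeholders)
        "hand": 140.0,
        "arm": 150.0,
        "torso": 110.0,
        "head": 65.0,
    }

    def contact_is_safe(body_region: str, measured_force_n: float) -> bool:
        """True if the measured contact force is below the region's limit."""
        return measured_force_n <= FORCE_LIMITS_N[body_region]

    assert contact_is_safe("arm", 120.0)      # under the assumed 150 N limit
    assert not contact_is_safe("head", 80.0)  # exceeds the assumed 65 N limit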
These are various International Organization for Standardization (ISO) standards related
to robotics and the safety requirements associated with them. Each standard serves a specific
purpose in defining terminology, safety requirements, or guidelines for the design, integration,
and use of robots and robotic devices. Here’s a brief explanation of each:
1. ISO 8373: Vocabulary for robots and robotic devices: This standard provides
a comprehensive vocabulary and terminology related to robots and robotic devices, helping
to establish a common language for the industry.
2. ISO 10218-1: Safety requirements for industrial robots - Part 1: Robots:
ISO 10218-1 sets safety requirements and guidelines for the design and operation of individual
industrial robots, focusing on their safety aspects.
3. ISO 10218-2: Safety requirements for industrial robots - Part 2: Robot
systems and integration: This standard extends the safety requirements outlined in ISO
10218-1 to cover the broader context of robot systems and their integration into industrial
processes.
4. ISO 15066: Robots and robotic devices - Collaborative robots: ISO 15066
addresses safety considerations specifically for collaborative robots (cobots), which are de-
signed to work safely alongside humans. It provides guidance on risk assessment, force and
pressure limits, and other safety-related aspects of cobots.
5. ISO 13849-1: Safety of machinery: This standard deals with the design principles
for safety-related parts of control systems in machinery, including robotic systems [29][30]. It
provides guidance on how to design control systems that ensure the safety of operators and
other personnel.
6. ISO 13849-2: Safety of machinery - Safety-related parts of control systems - Part
2: Validation: ISO 13849-2 complements Part 1 by addressing the validation and verification
processes for safety-related control systems. It ensures that the designed safety measures
are effective and reliable.
7. ISO 10270: Manipulating industrial robots: Similar to ISO 8373, ISO 10270
focuses on providing a specific vocabulary and terminology related to manipulating industrial
robots. It helps standardize terminology in this context.
8. ISO/TR 20218-1: Robots and robotic devices - Safety requirements for
industrial robots - Part 1: Criteria for the design of robot safety: This is a Technical
Report (TR) that offers criteria and guidelines for designing safe industrial robots. While
not a mandatory standard, it provides valuable insights for safety-conscious designers.
9. ISO/TR 20218-2: Robots and robotic devices - Safety requirements for
industrial robots - Part 2: Robot integration and use: Similar to Part 1, this
Technical Report focuses on robot integration and use, offering criteria and guidance for
safe integration and operation.
10. ISO/TS 15066: Robots and robotic devices - Collaborative robots - Safety:
ISO/TS 15066 is a Technical Specification (TS) that provides additional safety guidance
specifically for collaborative robots. It outlines safety measures and requirements for cobots
in more detail.
These ISO standards and technical documents play a crucial role in ensuring the safety,
interoperability, and standardization of robotics technology across various industries and
applications. They help manufacturers, integrators, and users of robotic systems adhere
to best practices and reduce the risk of accidents and injuries associated with robotic op-
erations. Figure 3 provides a regulatory perspective on the interplay between machinery,
robots/collaborative robots, and artificial intelligence (AI). The diagram visually illustrates
the complex relationship and interactions among these key elements within the regulatory
framework.

Figure 3: The regulatory perspective on the interplay between machinery, robots/collaborative robots, and
AI. Derived from [31]

Each component, including machinery, robots (both traditional and collaborative), and AI
systems, is represented with distinct symbols or icons. Arrows and lines depict
the flow of regulatory requirements, standards, and guidelines between these components,
highlighting how regulations apply to each element individually and their interconnected-
ness. This visual representation aids in understanding the regulatory landscape governing
the integration of machinery, robots, and AI in various applications. It helps stakeholders,
including policymakers, manufacturers, and researchers, navigate the regulatory environ-
ment and ensure compliance with relevant standards and regulations in the development
and deployment of advanced technological systems. Table 3 offers a comprehensive overview
of robot safety, detailing various aspects, hazards, and precautionary measures associated
with robotics applications. It provides a structured framework for understanding the safety
considerations inherent in working with robots across different industries and contexts. The
table delineates potential hazards posed by robots and machinery, along with corresponding
precautionary measures to mitigate risks and ensure safe operations. By organizing this
information in a clear and systematic manner, Table 3 serves as a valuable reference for
practitioners, researchers, and policymakers involved in robotics safety and risk manage-
ment.

4.1. Enhancing Safety and Intuitive Interaction: Control Strategies and Human-Robot
Communication Techniques in Collaborative Workspaces
To ensure safe and intuitive interaction between human workers and robots in various
industrial and collaborative settings, it’s essential to implement effective control strategies
and human-robot communication techniques. Here are some key strategies and techniques
that are equally necessary for achieving this goal:
1. Risk Assessment and Safety Measures: Conduct a thorough risk assessment
to identify potential hazards and risks associated with robot-human interactions [31][32].
Implement safety measures such as physical barriers, safety sensors, emergency stop buttons,
and safety certifications to minimize risks.
[Table 4 about here.]
2. Collaborative Robotics (Cobots): Use collaborative robot designs that are inher-
ently safe for human interaction, featuring softer materials, rounded edges, and lightweight
construction. Implement force and torque sensors to enable robots to detect and respond to
unexpected contact with humans.
3. Programming and Control: Utilize intuitive programming interfaces and control
systems that enable non-expert users to teach robots tasks easily. Implement motion plan-
ning algorithms that consider human presence and safety constraints to ensure smooth and
safe motion.
4. Sensing and Perception: Equip robots with advanced sensors such as 3D cameras,
LIDAR, and tactile sensors to detect and recognize humans and objects in their environ-
ment. Implement computer vision and machine learning algorithms to enhance perception
capabilities, allowing robots to understand human gestures, poses, and intentions.
5. Collision Avoidance: Develop collision avoidance algorithms that enable robots
to dynamically adjust their paths and speed to avoid collisions with humans and obsta-
cles [32][33][34]. Use sensors and software to create safety zones around robots, triggering
slowdowns or stops when humans enter these zones.
6. Natural Language Interfaces: Implement natural language processing (NLP) and
speech recognition technologies to allow humans to give voice commands or interact with
robots through speech. Enable robots to respond to verbal cues, questions, and instructions
in a user-friendly manner; a minimal keyword-grammar sketch follows this list.
7. Haptic Feedback: Integrate haptic feedback mechanisms into robots to provide
tactile feedback to human operators, allowing them to sense forces, pressures, or vibrations
during interaction. This can improve teleoperation and enhance the operator’s sense of
touch.
8. Augmented Reality (AR) and Virtual Reality (VR): Use AR and VR interfaces
to provide real-time information to human workers, such as overlaying instructions, safety
alerts, and visualizations on their field of view. These technologies can enhance situational
awareness and task guidance.
9. Training and Education: Provide comprehensive training for human workers on
how to safely and effectively interact with robots. Include simulation-based training and
realistic scenarios to prepare workers for real-world robot collaborations.
10. Continuous Monitoring and Adaptation: Implement systems that continuously
monitor the robot’s environment, human behavior, and task progress. Enable robots to
adapt their behavior in response to changing conditions and unexpected events.
11. User Feedback and Iteration: Encourage feedback from human workers and
incorporate their input into the design and operation of robots. Continuously iterate and
improve both the hardware and software based on user experiences and suggestions.
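As flagged under item 6, the sketch below maps recognised speech to robot commands with a
simple keyword grammar. A production system would sit this behind a proper speech
recognition and NLP stack; the vocabulary and command names here are invented for
illustration.

    # Sketch: keyword-grammar mapping from utterances to robot commands.
    COMMANDS = {
        ("stop", "halt", "freeze"): "emergency_stop",
        ("slow", "slower", "careful"): "reduce_speed",
        ("resume", "continue", "go"): "resume_task",
        ("hand", "give", "pass"): "handover_part",
    }

    def parse_utterance(utterance: str):
        """Return the first matching robot command, or None if not understood."""
        words = utterance.lower().split()
        for keywords, command in COMMANDS.items():
            if any(k in words for k in keywords):
                return command
        return None

    assert parse_utterance("Please stop right now") == "emergency_stop"
    assert parse_utterance("hand me the bracket") == "handover_part"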

Figure 4: A comprehensive description of the levels of human-robot interaction, providing a detailed break-
down of the different stages or degrees of interaction between humans and robots, spanning the spectrum
from minimal or passive engagement to high levels of collaboration and cooperation, together with the
corresponding implications for technology, safety, and productivity. Derived from [35]

By combining these control strategies and human-robot communication techniques, orga-
nizations can create a safe, intuitive, and productive working environment where humans and
robots collaborate effectively. Additionally, staying up-to-date with the latest advancements
in robotics and human-robot interaction technologies is crucial for ongoing improvement
in this field. Figure 4 provides a comprehensive overview of the levels of human-robot in-
teraction, highlighting the implications for technology development, safety protocols, and
productivity enhancement in various human-robot collaboration scenarios. Table 4 presents
strategies for advancing safety and interaction in collaborative workspaces, focusing on con-
trol strategies and human-robot communication techniques. It outlines various approaches
aimed at enhancing the safety of human-robot collaboration and improving communication
between humans and robots in shared work environments. The table provides insights into
control methods, such as safety protocols and motion planning algorithms, as well as com-
munication techniques, including gesture recognition and natural language processing. By
synthesizing these strategies, Table 4 offers a comprehensive resource for designing and im-
plementing effective systems for collaborative workspaces, fostering productivity, safety, and
seamless interaction between humans and robots.

4.2. Control Strategies


Control schemes in the context of HRC can also be categorized into two types: pre-
collision control schemes and post-collision control schemes. These categories focus on how
control is managed before and after a potential collision or interaction between the human
and the robot. Let’s delve deeper into each type:
1. Pre-Collision Control Schemes: Pre-collision control schemes aim to prevent
or minimize the likelihood of collisions or undesirable interactions between humans and
robots before they occur. Robots using pre-collision control schemes proactively plan their
movements and actions to avoid areas or trajectories that could lead to collisions with
humans. They use predictive models, obstacle detection, and path planning algorithms to
ensure safe navigation. Establishing predefined safety zones around the robot where human
access is restricted or regulated is a common approach. These zones are typically defined
based on the robot’s workspace and are monitored to ensure that humans do not enter
dangerous areas while the robot is in operation. Limiting the robot’s speed and force output
in its movements is another preventive measure. Slower and less forceful movements reduce
the severity of potential collisions and enhance safety.

[Table 5 about here.]

2. Post-Collision Control Schemes: Post-collision control schemes come into play
after a collision or contact between the human
and the robot has occurred, with the aim of minimizing harm or damage. Robots equipped
with force sensors can detect a collision and immediately limit their force output. Com-
pliance control allows the robot to become more flexible or compliant when contact with a
human is detected, reducing the risk of injury. E-stop systems are designed for quick and im-
mediate response to emergencies. When a collision or unsafe situation is detected, an E-stop
button or system can be pressed or activated to halt all robot movements instantaneously
[35][36][37]. In some cases, after a collision, the robot can autonomously recover from the
event. This may involve retracting or adjusting its position to alleviate any potential harm
or damage.
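A minimal post-collision reaction might therefore watch the estimated external force and,
on a spike, clamp effort, soften the arm, and retract. The robot interface, the 50 N
threshold, and the retraction distance in this sketch are all hypothetical.

    # Sketch: post-collision reaction driven by a force/torque sensor reading.
    COLLISION_FORCE_N = 50.0  # assumed detection threshold, not a standard value

    def on_force_sample(robot, force_n: float) -> None:
        """Called at the control rate with the latest external force estimate."""
        if force_n > COLLISION_FORCE_N:
            robot.limit_force()              # clamp actuator effort immediately
            robot.enable_compliance()        # soften the arm around the contact
            robot.retract(distance_m=0.10)   # back away to relieve the contact
            robot.require_operator_ack()     # remain stopped until a human clears it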
Figure 5: The four collaboration modes defined in accordance with the robot safety standards ISO 10218-1/2
[22], [23], as well as the technical specification ISO/TS 15066, are as follows [24].

Figure 5 outlines the four collaboration modes defined by robot safety standards ISO
10218-1/2 and ISO/TS 15066. These modes dictate how humans and robots interact in
industrial settings to ensure safety and efficiency. These pre-collision and post-collision
control schemes are essential components of ensuring safe and effective human-robot collab-
oration. An effective control strategy often combines elements of both approaches to provide
a comprehensive safety framework. The specific choice of control scheme depends on the
application, the level of human-robot interaction required, and the safety standards and
regulations governing the system. Table 5 delves into the exploration of robotics, encom-
passing studies in robot models, sensor technologies, and human-robot interaction (HRI). It
offers a comprehensive overview of research endeavors aimed at advancing various aspects of
robotics technology. The table highlights developments in robot design, sensor integration,
and HRI methodologies, providing insights into emerging trends and innovations in the field.
By organizing this information in a structured format, Table 5 serves as a valuable reference
for researchers, engineers, and enthusiasts interested in the multifaceted domain of robotics,
fostering collaboration and knowledge exchange in the pursuit of technological advancement.

4.3. Workspace Restrictions


To prevent any issues when implementing control strategies, it becomes essential to im-
pose limitations on the workspace. Various techniques have been developed to enforce these
restrictions, each with its unique approach. Initially, Kimmel et al. [38] introduced a method
that combines a Cartesian constraint with an invariance control scheme and a discrete-time
Euler solver. This method effectively reduces oscillations when the manipulator encounters
constraints. Similarly, Rauscher et al. [39] successfully applied Cartesian workspace restric-
tions to a redundant robot by utilizing an impedance control strategy in conjunction with
control barrier functions and quadratic programming. Dimeas et al. [40] devised a tech-
nique to prevent operators from pushing the manipulator into configurations that hinder its
performance. This method relies on virtual constraints and a Cartesian admittance control
scheme, which adjusts according to the manipulator’s kinematic manipulability index [40].
Han et al. [41] developed an operational-space-control (OSC) framework capable of han-
dling joint limits and singularities gracefully. Building on the energy-aware control scheme
proposed by Raiola et al. [42], Hjorth et al. [43] extended it by incorporating the con-
cept of artificial potential fields. This extension effectively enforces workspace restrictions
for collaborative robots in a compliant state, as illustrated in Figure 5. Flacco et al. [44]
introduced a technique that saturates the manipulator’s nullspace by combining the Stack-
of-Tasks approach with quadratic programming. This approach is adaptable for restricting
the manipulator’s workspace by maintaining hard constraints on its positions, velocities,
and accelerations within its configuration space. Furthermore, Muñoz Osorio et al. [45]
expanded upon this method and transformed the algorithm into a torque-based approach.
They combined the Operation Space Control formulation with the stack-of-task technique to
create high-priority tasks in the task stack, enabling precise control over the manipulator’s
motion within both the Cartesian and configuration space [46].
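To make the potential-field idea concrete, the sketch below implements a classic repulsive
force term of the kind used to keep a manipulator away from a workspace boundary (cf. the
artificial potential fields in Hjorth et al. [43]); the gain eta and influence distance
rho_0 are illustrative assumptions.

    # Sketch: repulsive artificial-potential-field force near a workspace boundary.
    import numpy as np

    def repulsive_force(p, p_boundary, eta=1.0, rho_0=0.2):
        """Force pushing point p away from the nearest boundary point."""
        d = p - p_boundary
        rho = np.linalg.norm(d)  # distance to the workspace boundary
        if rho >= rho_0 or rho == 0.0:
            return np.zeros_like(p)  # outside the influence region
        # Magnitude grows as the boundary is approached and vanishes at rho_0.
        magnitude = eta * (1.0 / rho - 1.0 / rho_0) / rho**2
        return magnitude * (d / rho)

    f = repulsive_force(np.array([0.9, 0.0, 0.5]), np.array([1.0, 0.0, 0.5]))
    # f points in -x, i.e. away from the boundary plane at x = 1.0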

4.4. Human–Robot Communication


The field of human-robot communication is rapidly evolving, encompassing verbal and
non-verbal communication, as well as interactions in virtual, augmented, and mixed re-
alities. Verbal communication involves spoken language and text-based exchanges, facil-
itated by technologies like natural language processing and speech recognition, enabling
intuitive human-robot collaboration. Non-verbal cues, such as gestures and facial expres-
sions, enhance the emotional connection between humans and robots, particularly in fields
like healthcare. In virtual reality, immersive interactions occur within digital environments,
while augmented reality overlays digital information onto the physical world, offering con-
textual assistance. Mixed reality seamlessly blends physical and digital realms for dynamic
interactions, relevant in areas like remote maintenance and collaborative design. This multi-
faceted domain explores how humans and robots exchange information, promising innovation
across various sectors. Figure 6 provides an overview of various input modes considered in
human-robot collaboration, accompanied by an examination of their respective advantages
and disadvantages.
Verbal interaction is a collaborative process, where participants strive to establish a
mutual understanding of their shared knowledge base. This shared comprehension, often
referred to as common ground, can be categorized into two forms: communal common
ground, representing universally held knowledge, and personal common grounds, encom-
passing knowledge drawn from personal experiences. Personal common ground evolves as
individuals contribute new information, enabling participants in the conversation to arrive
at a shared conviction, a concept known as grounding. Grice has previously noted that
grounding is achieved when individuals efficiently convey information without undue effort
[47]. Past research has demonstrated the benefits of establishing grounding through verbal
communication, even when combined with other forms of feedback. For instance, Wang and
colleagues [48] found that the effectiveness of haptic communication improved after dyads
had an initial learning phase involving verbal communication to acquaint themselves with
the task. In more complex tasks, studies like that of Parikh and team [49] have suggested
that verbal feedback, when combined with haptic feedback, can significantly enhance team
performance compared to relying solely on haptic feedback.

Figure 6: An overview of various input modes considered in human-robot collaboration, including an exam-
ination of their respective advantages and disadvantages. Derived from [47]

[Table 6 about here.]

Verbalization, in general, offers greater flexibility compared to haptic feedback, as it enables
the communication of abstract and intricate concepts, fostering a shared understanding of
the task. However, it is important to acknowledge that verbal communication comes with its
own set of costs, in terms of time and cognitive resources. Formulating coherent utterances,
especially when discussing unfamiliar subjects or ideas, can be time and effort-intensive.
Additionally, message recipients bear the costs of receiving and comprehending a message,
particularly when contextual cues are absent, necessitating inferences. Consequently, after
teams have established a shared understanding of a task, it may be advantageous to transi-
tion to a less resource-intensive mode of communication, such as haptic feedback. Studies,
such as the work by Kucukyilmaz and his team [50], have highlighted that haptic feedback
can enhance the perceived sense of presence and collaboration, thereby facilitating interac-
tion. This mode of communication has proven especially effective in tasks involving deictic
referencing and the manipulation of physical objects. Table 6 provides insights into digital
twins, focusing on sector-specific applications, challenges, and simulation parameters. It of-
fers a detailed examination of how digital twin technology is utilized across various sectors,
including manufacturing, healthcare, and smart infrastructure. The table also highlights
common challenges encountered in the implementation of digital twins, such as data in-
tegration and model fidelity, along with simulation parameters used to enhance accuracy
and effectiveness. By organizing this information in a structured format, Table 6 serves as
a valuable resource for understanding the diverse applications and complexities of digital
twins, guiding researchers, practitioners, and policymakers in leveraging this transformative
technology effectively.
The impact of verbal communication in human-robot teams on collaboration and the
perception of robots has been well-documented in previous research. Typically, robot dialog
systems have primarily supported communication initiated either by humans or robots in
the form of requests. A significant challenge in this context has been the generation of
comprehensible verbal commands, often related to symbol grounding, which pertains to a
robot’s ability to connect symbols to physical objects in the real world. Recent research
by Tellex and colleagues [51] introduced a model for deducing plans from natural language
commands, and inverting this model allows a robot to recover from failures by using natural
language to communicate the need for assistance to a human partner. Furthermore, work by
Khan and Wang [52] has explored generating explanations regarding the robot’s beliefs and
decision-making processes. This research has been extended to various robot controllers, as
demonstrated by Hayes and Shah [53]. It’s important to note that in our specific context,
which involves human-robot collaboration in a physical setting, there are unique challenges.
Unlike autonomous driving, where the "how" and "why" of actions have distinct effects,
humans may have difficulty verifying the accuracy of a robot’s "why" actions. Additionally,
unlike the driving scenario, there may not be a definitively correct course of action for
the robot to take in a physical human-robot collaboration, leading to potential uncertainty
and disagreements. This context highlights the significance of examining the role of verbal
communication in guiding human-robot interactions.
Non-verbal communication in the context of human-robot interaction systems refers
to the exchange of information between humans and robots without using spoken or written
language. It encompasses a wide range of non-linguistic cues and signals that humans and
robots use to convey and interpret messages, emotions, intentions, and other aspects of
communication. Non-verbal communication plays a crucial role in enhancing the quality of
interaction between humans and robots and can include the following components:
1. Body Language: This involves the use of gestures, postures, and movements to
convey information. For instance, a robot may use arm movements to signal that it is ready
to assist, or a human may use hand gestures to indicate a specific action.
2. Facial Expressions: Robots equipped with screens or facial features can display a
range of facial expressions, such as smiles, frowns, or raised eyebrows, to convey emotions
and intentions. These expressions can make interactions with robots more relatable and
emotionally engaging.
3. Eye Contact: Eye contact is a powerful non-verbal cue that can signal attention,
interest, or intent. In human-robot interactions, robots can use their sensors or screens to
establish eye contact with users, making the interaction more personal.
4. Proximity and Personal Space: Understanding personal space is crucial for smooth
human-robot interactions. Robots need to respect the physical boundaries of humans and
maintain an appropriate distance, which can vary depending on cultural norms and the
specific context.
5. Touch and Tactile Feedback: Tactile feedback, such as a robot’s ability to respond
to touch or provide haptic feedback, can enhance the perception of physical presence and
facilitate collaborative tasks. For instance, a robot might use gentle vibrations to signal a
user or provide feedback during a task.
6. Emotional Expression: Robots can be designed to exhibit emotions or emotional
cues through their physical appearance, movements, and sounds. This allows them to convey
empathy, comfort, or concern in appropriate situations.
7. Voice Tone and Pitch: While not strictly non-verbal, the tone and pitch of a robot’s
voice can convey emotional content. For instance, a robot might use a soothing tone when
providing assistance or a more urgent tone in emergencies.
8. Non-Verbal Feedback Recognition: Robots can be programmed to recognize
and respond to human non-verbal cues. For example, they can detect a user’s frustration
through facial expressions or body language and adjust their responses accordingly.
Effective non-verbal communication in human-robot interaction systems is essential for
making interactions more natural, intuitive, and emotionally engaging. It can help users feel
more comfortable and enhance their understanding of the robot’s actions and intentions. The
development and integration of non-verbal communication capabilities in robots are impor-
tant for creating more user-friendly and efficient human-robot collaborative environments,
especially in areas like healthcare, customer service, and social robotics.
Beyond operator gestures, there is also research into robots generating their own gestures
to communicate intentions effectively. Sheikholeslami et al. investigated the efficiency of
robot hand configurations in cooperative industrial tasks, finding that robots can commu-
nicate intentions robustly [49]. Gleeson et al. developed a lexicon of communicative terms
and robot gestures for commonly used industrial tasks, proving its adequacy for intuitive
and efficient human-robot communication [50]. In addition to gesture and pose recognition,
several methods track operator gaze and attention to enhance human-robot communication
in industrial HRC. Eye gaze tracking has been explored as a means of interaction, and its su-
periority over head tracking has been demonstrated in some cases [51]. Eye gaze tracking has
been applied in various industrial contexts, improving the robustness of robot manipulation
tasks, assessing operator comfort levels, and optimizing human-robot handover tasks [52].
Figure 7 depicts the factors governing physical safety control in the context of human-robot
collaboration, as referenced in [47] and [48].
Figure 7: The factors governing physical safety control in the context of human-robot collaboration, as
referenced in [47] and [48].

Moreover, alongside gaze tracking, researchers have explored the use of tactile and haptic
feedback to enhance communication with industrial robots. Casalino et al. [54] devised a
method that provides tactile feedback directly to the operators’ fingertips to monitor their
operational awareness. They employed a Bayesian recursive classifier to estimate human in-
tention while using a wearable vibrotactile ring to convey information about different stages
of Human-Robot Collaboration (HRC) [55]. Salvietti et al. [56] investigated a bilateral
haptic interface, combining a soft gripper with a wearable remote ring interface to enhance
collaboration effectiveness. Bergner et al. [57] introduced an innovative interface featuring
distributed cells acting as a large-scale "skin" around robot manipulators. This interface
calculates joint torque in contact points, enabling more intuitive human-robot communica-
tion through touch alone [58]. Similarly, Tang et al. [59] developed a novel signaling system
using robot light skin, which significantly improved user reaction times and reduced operator
mental workload during the execution of simple industrial tasks, resulting in fewer errors.
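
As an illustration of the recursive Bayesian intention estimation used in the approach of Casalino et al. [55], the sketch below maintains a belief over a small set of candidate operator intentions and updates it as discrete cues are observed. The intention labels and observation model are illustrative assumptions, not the published model.

```python
import numpy as np

# Hypothetical setup: three candidate operator intentions and a discrete
# observation model linking them to observed cues (e.g., hand positions,
# contact events). The probabilities are illustrative only.
INTENTIONS = ["reach_part", "handover", "idle"]

# P(observation | intention); rows: intentions, columns: observation symbols.
OBS_MODEL = np.array([
    [0.7, 0.2, 0.1],   # reach_part
    [0.2, 0.7, 0.1],   # handover
    [0.1, 0.1, 0.8],   # idle
])

def bayes_update(belief: np.ndarray, obs_idx: int) -> np.ndarray:
    """One recursive Bayesian update of the intention belief."""
    posterior = OBS_MODEL[:, obs_idx] * belief
    return posterior / posterior.sum()

belief = np.full(len(INTENTIONS), 1.0 / len(INTENTIONS))  # uniform prior
for obs in [0, 0, 1]:  # a stream of observed cues
    belief = bayes_update(belief, obs)
    print(dict(zip(INTENTIONS, belief.round(3))))
# A wearable vibrotactile ring, as in [55], could then signal the most
# likely HRC stage back to the operator.
```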
Human-Robot Communication in Virtual, Augmented, and Mixed Realities:
Efficient collaboration between humans and robots necessitates that robots exhibit behav-
iors that are sensitive to human presence and engage in effective communication throughout
the entire process. This enables the establishment of a common ground where shared be-
liefs, desires, and intentions can be understood by all parties involved. Nonetheless, the
present state of natural language processing significantly constrains the breadth of these
interactions. The limitations arise from disparities in how humans and robots represent and
convey information, which, in turn, hinder the robot’s ability to interpret human intentions
and express their own intentions meaningfully. These communication barriers have been a
central focus of previous research in human-robot interaction.
The importance of addressing these barriers becomes especially evident in collaborative
scenarios where mixed human-robot teams work together for reasons related to safety, ef-
ficiency, and ease of use. This perspective is underscored in the U.S. Robotics Roadmap,
which highlights the need for humans to be able to discern robot activities and to interpret
the robot’s comprehension of the situation. Recent research in the field of human-robot
interaction has attempted to tackle this challenge through various means. These approaches
include generating motion that conveys intent, creating understandable task plans, using
natural language to articulate intentions, and employing explicit signaling mechanisms. Re-
cent advancements in mixed, augmented, and virtual reality offer an alternative avenue for
facilitating human-robot interactions by utilizing the shared physical space as a canvas for
visual cues to communicate intent. While preliminary work in this domain dates back sev-
eral years (e.g., Interactive Hand Pointer for robot control in a human workspace), recent
technological progress in mixed and virtual reality systems has significantly improved the
viability of this approach. Recent systems have been developed to visualize the paths of
mobile wheelchairs and robots, with results indicating that humans prefer to engage with
a robot when it directly presents its intentions through visual cues. However, most recent
research has primarily introduced passive, non-interactive systems with limited scopes, and
they have not embraced a genuine mixed reality perspective by accounting for the chang-
ing environmental context while projecting information. The recent strides in augmented
and virtual reality have expanded the horizons of possibilities within these environments,
enabling researchers to effectively incorporate augmented reality into human-robot interac-
tions, thus offering fresh avenues for exploration and development in this field.

5. Overview of Smart Manufacturing and Industry 4.0


Industry 4.0, initiated by the German government in 2011 and unveiled at Hanover Messe
[60], represents the fourth industrial revolution, evolving from its predecessors. The first in-
dustrial revolution introduced mechanized power through steam and water, while the second
brought mass production on assembly lines. In the late 1960s, the third industrial revolution
was marked by programmable logic controllers and automation. Now, Industry 4.0 signifies
the digitalization of manufacturing, aiming to enhance efficiency and productivity through
transformative technologies [15]. Key enablers include Cyber Physical Systems (CPS) and
Cyber Physical Production Systems (CPPS), Digital Twin technology, the Internet of Things
(IoT), advanced robotics, collaborative robotics (cobotics), mobile robotics (mobotics), as
well as artificial intelligence (AI) and machine learning.

5.1. Cyber Physical Systems


Cyber Physical Systems (CPSs) and Cyber Physical Production Systems (CPPSs) play
a pivotal role in the intelligent control of modern manufacturing environments. These sys-
tems are characterized by their intricate networks of interconnected computational entities,
seamlessly bridging the gap between the physical and digital realms [15]. CPSs, at the heart
of Industry 4.0, are instrumental in orchestrating the efficient functioning of CPPSs. They
form the backbone of smart manufacturing, providing the means to monitor, control, and
optimize manufacturing and production systems on both physical and cyber levels. CPSs
integrate real-time data from the physical world with digital information, enabling informed
decision-making and automated responses [15].

Figure 8: Production facility equipped with Cyber-Physical Systems (CPS). Derived from [15]

In the context of CPPSs, the intelligent control of these production systems is made
possible through embedded networked systems and the Internet of Things (IoT). These
components facilitate the seamless exchange of data and communication between various
elements of the manufacturing environment, from machines and sensors to software and hu-
mans. This interconnectedness empowers CPPSs to adapt to changing conditions, optimize
processes, and enhance overall manufacturing efficiency [15]. The synergy between CPSs
and CPPSs represents a significant step towards achieving the goals of Industry 4.0, where
digitalization and intelligent control converge to create agile and responsive manufacturing
ecosystems. This integration not only improves operational efficiency but also paves the
way for innovative business models, ultimately driving the evolution of modern manufactur-
ing. Figure 8 depicts a production facility enhanced with Cyber-Physical Systems (CPS),
integrating physical processes with computational capabilities. CPS enables real-time data
collection, analysis, and automation, optimizing manufacturing operations. Key features in-
clude sensor networks for data gathering, analytics for decision-making, and connectivity for
seamless integration. Automation streamlines workflows, while human-robot collaboration
ensures efficiency. Remote monitoring allows oversight from anywhere, supporting adaptive
manufacturing to respond to changing demands. This visualization underscores CPS’s role
in enhancing efficiency, flexibility, and competitiveness in modern manufacturing.

5.2. Digital Twin


Digital twin technology stands as a cornerstone of Industry 4.0, serving as a bridge
between the virtual and physical realms within manufacturing units. A digital twin is es-
sentially a virtual replica or representation of a physical system, be it a product, process, or
an entire manufacturing facility. It operates by mirroring real-world data and processes in a
digital environment, offering a comprehensive and real-time view of the physical counterpart
[3]. The functioning of a digital twin involves the continuous synchronization of data between
the physical system and its digital twin counterpart. Sensors and data acquisition systems
collect real-time information from the physical environment, transmitting this data to the
corresponding digital twin. Conversely, the digital twin processes and analyzes the data,
generating valuable insights and predictions. This synchronized exchange of information
enables manufacturing units to analyze data, monitor production processes, preemptively
manage risks, reduce downtime, and even conduct simulations for further development [3].
The value of digital twins in manufacturing cannot be overstated. They serve as powerful
tools for optimizing operations and enhancing decision-making. By leveraging manufac-
turing data from the production process, digital twins can significantly reduce downtime.
They achieve this by offering real-time monitoring and predictive maintenance capabilities,
allowing manufacturers to identify potential issues before they lead to costly disruptions.
Moreover, digital twins facilitate data-driven insights that enable continuous improvement,
leading to more efficient and responsive manufacturing processes [3]. In essence, digital twins
represent a transformative technology that empowers manufacturing units to enhance pro-
ductivity, reduce operational costs, and drive innovation. Their ability to provide a virtual
window into the physical world offers a competitive edge in the ever-evolving landscape of
Industry 4.0.
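
A minimal sketch of the synchronization loop described above, assuming a single monitored machine: sensor readings from the physical asset are mirrored into a twin object, which maintains state and raises a predictive-maintenance flag. The vibration feature, threshold, and trend check are illustrative stand-ins for a real predictive model.

```python
import random
import statistics

class DigitalTwin:
    """Toy digital twin of one machine; all thresholds are illustrative."""

    def __init__(self, vibration_limit: float = 4.0):
        self.history = []
        self.vibration_limit = vibration_limit

    def ingest(self, reading: dict) -> None:
        """Mirror a physical sensor reading into the twin's state."""
        self.history.append(reading)

    def predict_maintenance(self) -> bool:
        """Naive trend check standing in for a real predictive model."""
        recent = [r["vibration"] for r in self.history[-10:]]
        return len(recent) == 10 and statistics.mean(recent) > self.vibration_limit

twin = DigitalTwin()
for _ in range(50):  # stand-in for the real-time acquisition stream
    twin.ingest({"vibration": random.uniform(3.5, 5.0), "temp_c": 45.0})
    if twin.predict_maintenance():
        print("Twin flags rising vibration: schedule maintenance before failure")
        break
```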

5.3. Internet of Things


The Internet of Things (IoT) and its industrial counterpart, the Industrial Internet of
Things (IIoT), represent crucial components of Industry 4.0’s digital transformation. These
technologies have reshaped the manufacturing landscape by enabling the seamless connectiv-
ity of devices, systems, and processes. IoT refers to the interconnected network of physical
devices equipped with sensors, software, and connectivity capabilities, allowing them to col-
lect and exchange data with each other and centralized systems over the internet. In manu-
facturing, IoT is commonly referred to as IIoT, emphasizing its industrial applications. IIoT
encompasses a wide range of assets, from machines and equipment to inventory management
systems and production lines. Wireless sensing technologies play a pivotal role within the
IIoT framework. These sensors are deployed throughout the manufacturing environment
to capture real-time data on various parameters, such as temperature, pressure, humidity,
and machine performance. Wireless connectivity enables these sensors to transmit data
seamlessly to central monitoring and control systems, facilitating rapid decision-making and
enabling predictive maintenance. In the context of Industry 4.0, wireless sensing technolo-
gies are instrumental in ensuring the safety and efficiency of human-robot interactions within
manufacturing environments. By continuously monitoring the position and movements of
both humans and robots, these sensors contribute to the creation of safe workspaces where
human-robot collaboration can occur without risk to human workers.
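
The wireless-sensing pipeline described above can be sketched, in a dependency-free way, as position tags publishing readings onto a message bus (a plain queue here; in practice this might be a protocol such as MQTT or OPC UA) and a monitor checking human-robot separation. All identifiers and the 1.5 m threshold are illustrative assumptions.

```python
import json
import math
import queue

bus = queue.Queue()  # stand-in for a wireless message bus

def publish(topic: str, payload: dict) -> None:
    """Simulate a position tag publishing a reading onto the bus."""
    bus.put((topic, json.dumps(payload)))

publish("tags/worker_1", {"x": 2.0, "y": 1.0})
publish("tags/robot_1", {"x": 2.5, "y": 1.8})

positions = {}
while not bus.empty():
    topic, raw = bus.get()
    positions[topic] = json.loads(raw)

def separation(a: dict, b: dict) -> float:
    return math.hypot(a["x"] - b["x"], a["y"] - b["y"])

d = separation(positions["tags/worker_1"], positions["tags/robot_1"])
if d < 1.5:  # illustrative minimum protective distance in metres
    print(f"Separation {d:.2f} m below threshold: slow or stop the robot")
```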
As robotics plays an increasingly prominent role in modern manufacturing, both station-
ary and mobile robots rely on wireless sensing technology for navigation, object detection,
and spatial awareness. This technology ensures that robots can operate alongside humans
in shared workspaces without compromising safety. By integrating wireless sensing into the
manufacturing ecosystem, manufacturers gain real-time visibility into their operations, lead-
ing to improved efficiency, reduced downtime, and enhanced safety. This connectivity also
extends to human workers, who can benefit from wearable devices equipped with sensors to
monitor their health, location, and safety within the manufacturing environment.

Figure 9: The process delineates the instrumental role of IoT technology in revolutionizing Smart Manufacturing. It showcases how IoT facilitates data-driven processes, predictive maintenance, and overall operational excellence, ushering in a new era of efficient and interconnected manufacturing systems. Derived from [61]

Figure 9
delineates the instrumental role of IoT technology in revolutionizing Smart Manufacturing.
It showcases how IoT facilitates data-driven processes, predictive maintenance, and overall
operational excellence, ushering in a new era of efficient and interconnected manufacturing
systems. Through IoT integration, real-time data collection, and analytics, manufacturers
optimize operations, enhance product quality, and achieve greater efficiency. This visu-
alization underscores IoT’s transformative impact on manufacturing, driving productivity,
sustainability, and competitiveness in Industry 4.0.

5.4. Artificial Intelligence and Machine Learning


AI (Artificial Intelligence) and ML (Machine Learning) are pivotal components of Indus-
try 4.0, playing a central role in transforming manufacturing processes, enhancing efficiency,
and driving innovation. These technologies are instrumental in processing massive volumes
of data generated from sensors, machines, and various production processes in Industry 4.0.
They excel in discerning intricate patterns, anomalies, and trends within datasets, enabling
data-driven decision-making and quality enhancement. AI and ML are invaluable for qual-
ity control, defect detection, and process optimization through pattern recognition. [61][62]
Additionally, they enable computer vision systems to inspect and assess products, ensuring
adherence to quality standards. Predictive analytics, a significant advantage of AI and ML,
forecast equipment failures, maintenance requirements, and production bottlenecks, reduc-
ing downtime and operational disruptions. These technologies continuously adapt and op-
timize manufacturing processes, enhance customization, and promote safer human-machine
collaboration, thus driving the transformation of Industry 4.0 towards a more efficient and
innovative future of manufacturing.
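
As an illustration of the predictive-analytics role described above, the sketch below fits an anomaly detector (scikit-learn's IsolationForest) to synthetic "normal" telemetry and flags readings that drift outside the learned operating envelope, a common precursor check for predictive maintenance. The features and data are placeholders for real IIoT telemetry.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Learn the normal operating envelope from synthetic telemetry
# (temperature in deg C, vibration in mm/s), then flag outliers.
rng = np.random.default_rng(0)
normal = rng.normal(loc=[60.0, 2.0], scale=[2.0, 0.2], size=(500, 2))
model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

new_readings = np.array([[61.0, 2.1],    # typical operation
                         [75.0, 4.5]])   # running hot and vibrating
flags = model.predict(new_readings)      # +1 = normal, -1 = anomaly
for reading, flag in zip(new_readings, flags):
    if flag == -1:
        print(f"Reading {reading} is anomalous: schedule an inspection")
```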

5.5. Advanced industrial robotics and cobots


In Industry 4.0, advanced robotics play a pivotal role in automating tasks, enhancing
efficiency, and enabling flexible production processes. These robots are equipped with ad-
vanced sensors, actuators, and control systems that enable them to perform a wide range of
tasks with precision and adaptability. Collaborative robots, often referred to as cobots, are
a notable subset of advanced robotics designed to work in close collaboration with human
operators. What sets cobots apart from traditional industrial robots is their focus on safety
and human-robot interaction. Cobots are embedded with sensors that enable them to detect
the presence of humans in their vicinity, ensuring safe working conditions. These sensors
can detect impacts or force exerted on the cobot, triggering immediate braking to prevent
accidents [63]. This emphasis on safety and real-time interaction opens up new possibilities
for humans and robots to work side by side in shared workspaces within manufacturing
environments. Cobots are particularly well-suited for tasks that require human dexterity,
problem-solving, and adaptability, making them valuable assets in Industry 4.0’s quest for
efficient and collaborative manufacturing processes. Figure 10 outlines the foundational tech-
nologies and framework for Human-Centric Digital Twins in Industry 5.0. This depiction
highlights the integration of cutting-edge technologies such as AI, IoT, and digital modeling
to create digital replicas of physical systems and processes. Human-Centric Digital Twins
serve as virtual counterparts that incorporate human-centric design principles, enabling im-
mersive collaboration, personalized experiences, and enhanced decision-making. This visual-
ization underscores the evolution of digital twin technology towards a more human-centered
approach, shaping the future of Industry 5.0 with enhanced connectivity and symbiotic
human-machine interaction.

5.6. Autonomous Mobile Robot


In the landscape of Industry 4.0, mobile robots have emerged as a transformative force,
offering increased flexibility and efficiency in manufacturing environments. This evolution
in robotics can be traced from Automated Guided Vehicles (AGVs) to the more advanced
Intelligent Mobile Robots (IMRs) and Autonomous Mobile Robots (AMRs). AGVs were
among the earliest forms of mobile robots used in manufacturing. These vehicles followed
predetermined paths and were primarily used for material handling and logistics within
factories. However, their rigid navigation systems limited their adaptability and scalabil-
ity. Intelligent Mobile Robots (IMRs) represent the next step in the evolution of mobile
robotics. [64][65][66] Unlike AGVs, IMRs have the ability to navigate autonomously, adapt-
ing to dynamic environments without predefined paths. They employ technologies such as
Simultaneous Localization and Mapping (SLAM) to sense their surroundings, map their
environment, and make real-time decisions regarding navigation. IMRs are more versatile
and can perform a wider range of tasks, including inspection, inventory management, and
even collaborating with human workers.

Figure 10: The foundational technologies and framework for Human-Centric Digital Twins in Industry 5.0. Derived from [62]

Autonomous Mobile Robots (AMRs) take mobile
robotics a step further by integrating advanced AI and machine learning capabilities. These
robots can learn from their interactions with the environment and continuously improve
their performance. AMRs are highly adaptable and can be easily reconfigured for various
tasks, making them valuable assets in agile manufacturing setups.
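
To give a flavour of the mapping side of SLAM mentioned above, the following highly simplified sketch writes simulated range readings from a known robot pose into an occupancy grid. Real SLAM systems estimate the pose jointly with the map; here the pose is assumed known, and all values are illustrative.

```python
import numpy as np

GRID = np.zeros((20, 20), dtype=np.int8)  # 0 = free/unknown, 1 = occupied
CELL = 0.25  # metres per grid cell, illustrative

def mark_obstacle(pose_xy: tuple, bearing_rad: float, range_m: float) -> None:
    """Convert a single range reading into an occupied grid cell."""
    ox = pose_xy[0] + range_m * np.cos(bearing_rad)
    oy = pose_xy[1] + range_m * np.sin(bearing_rad)
    i, j = int(oy / CELL), int(ox / CELL)
    if 0 <= i < GRID.shape[0] and 0 <= j < GRID.shape[1]:
        GRID[i, j] = 1

robot_pose = (2.0, 2.0)  # assumed-known pose (a real system estimates this)
for bearing, rng_m in [(0.0, 1.0), (np.pi / 2, 0.75)]:  # simulated beams
    mark_obstacle(robot_pose, bearing, rng_m)
print(GRID.sum(), "cells marked occupied")
```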
However, with the rise of mobile robots in manufacturing, new challenges and issues arise
in the context of human-robot collaboration. Ensuring the safety of human workers sharing
the workspace with mobile robots is paramount. Effective communication between humans
and robots, as well as the development of standardized safety protocols, becomes crucial to
prevent accidents and ensure smooth collaboration. Additionally, addressing concerns about
privacy and data security in connected manufacturing environments is an ongoing challenge.
As the use of mobile robots continues to expand in Industry 4.0, addressing these challenges
and leveraging the full potential of IMRs and AMRs is essential for realizing the promise of
efficient and collaborative manufacturing processes.

5.7. Social challenges of Industry 4.0


As a whole, Industry 4.0 is primarily characterized by its emphasis on automation and
the integration of advanced technologies into manufacturing processes. The overarching
goal of Industry 4.0 is to enhance efficiency and productivity through the use of technolo-
gies like CPSs, Internet of Things (IoT), AI, and more. While Industry 4.0 may seem to
prioritize reducing human involvement in manufacturing, it’s important to recognize that
the technologies and applications developed within the framework of Industry 4.0 have sig-
nificant human-centric applications and implications. One example of these human-centric
applications is the development of smart assistance systems that leverage Augmented Re-
ality (AR) and Virtual Reality (VR) for training and upskilling of human workers. [67][68]
These systems enable immersive and interactive training experiences, allowing workers to
acquire new skills, troubleshoot complex machinery, or learn about advanced manufactur-
ing processes more effectively. By using AR and VR technologies, Industry 4.0 not only
improves manufacturing efficiency but also invests in the continuous development and em-
powerment of the human workforce. It’s worth noting that while Industry 4.0 is primarily
a technology-driven revolution aimed at optimizing manufacturing processes, Industry 5.0
represents a shift towards a more value-driven revolution. In Industry 5.0, the focus is not
just on automation and efficiency but on creating value through human-robot collaboration
and a deeper integration of human skills and capabilities into the manufacturing landscape.
[69][70][71] This transition from Industry 4.0 to Industry 5.0 reflects a growing recognition
of the importance of combining technology-driven advancements with human ingenuity and
creativity to achieve even greater outcomes in manufacturing and beyond.

6. Human-Centric Manufacturing
Human-centric manufacturing, as manifested in the context of Industry 5.0, is a paradigm
shift in industrial production that places humans at the center of manufacturing processes.
Unlike previous industrial revolutions, such as Industry 4.0, which emphasized automation
and the reduction of human involvement, Industry 5.0 acknowledges the unique capabilities
and creativity of human workers and seeks to leverage their skills in conjunction with ad-
vanced technologies. In the Industry 5.0 framework, the human worker is not replaced by
machines but is considered an integral part of the production process. This approach aims
to address several needs and challenges:
1. Social Impact: Industry 5.0 recognizes the social impact of automation and ad-
vanced technologies. It acknowledges the importance of providing meaningful employment
for humans and ensuring that technology benefits society as a whole. [72][73] This alignment
with societal values is crucial for maintaining a harmonious relationship between technology
and human well-being.
2. Sustainability: The sustainability of manufacturing processes is a primary concern
in Industry 5.0. By reintegrating human workers into production, it aims to create more
sustainable and environmentally friendly manufacturing practices. This includes reducing
waste, optimizing resource usage, and aligning production with eco-friendly principles.
3. Operator 4.0: The concept of Operator 4.0 is closely related to Industry 5.0. Opera-
tor 4.0 refers to a highly skilled, technologically empowered human operator who collaborates
with advanced machines and systems. It recognizes that the role of human workers is evolv-
ing from manual labor to skilled oversight and management of complex automated processes.
[74][75]
In essence, Industry 5.0 acknowledges that while automation and advanced technologies
are transformative and beneficial, they should not lead to the complete exclusion of humans
from the manufacturing process. Instead, it envisions a future where humans and machines
work in synergy, with each contributing their unique strengths. This approach not only
ensures the well-being of the workforce but also enhances overall manufacturing system
performance, making it more adaptable, sustainable, and aligned with societal values. It
represents a techno-social revolution where technology serves the needs of society while
recognizing the value of human creativity and skills in manufacturing.

6.1. Industry 5.0


While Industry 4.0 promises improved, more efficient cyber-physical manufacturing
systems, there is concern regarding the social impact that this may have. Additionally,
there are further concerns regarding its impact on sustainability, and thus Industry 4.0 is
somewhat misaligned with the current values of human society. Industry 5.0 co-exists with
Industry 4.0 [15]; however, Industry 5.0 aims to realign this mismatch, so that the relationship
between manufacturing and social needs is better managed [17]. Some authors see Industry
5.0 as the point where humans and machines work in symbiosis [18]. While many authors
refer to this as Industry 5.0, there are ties to similar concepts such as Society 5.0, Operator
4.0 and Operator 5.0 [17]. It has been highlighted that Industry 4.0 is a multi-disciplinary
field, and this is also the case for Industry 5.0, which includes disciplines such as the life
sciences, social sciences, environmental science and humanities [17].
Before the first industrial revolution, the role of the human in the manufacturing process
was clear: it was the human’s responsibility to manufacture the product. As the
manufacturing industry has evolved from mechanised production to mass production and
into automated production (Industry 3.0), the role of the human in the physical manufacturing
process has diminished. This phenomenon has been compounded in Industry 4.0, where the
human is further ostracised from the manufacturing process as CPPSs take precedence. This
presents an issue, as humans can add significant value to manufacturing processes; however,
in Industry 4.0 there is difficulty in combining humans with CPPSs. Industry 5.0
aims to address this issue.
The European Commission recently released a policy brief on Industry 5.0, identifying
it as an evolution of Industry 4.0, with a focus on three core values: resilience, sustain-
ability and human centricity [19]. It is believed that the EC wished to better integrate
environmental and social European policies with the technological developments of Industry
4.0 [15]. This fifth industrial revolution is deemed by some authors the human-centric
industrial revolution [16], which moves beyond Industry 4.0 towards a more sustain-
able manufacturing domain. According to Nahavandi et al. [76], Industry 5.0 will bring
back human workers to the factory floors, by pairing human and machine to leverage hu-
man brainpower and creativity to increase process efficiency by combining workflows and
intelligent systems. It will take the focus off automation, and instead use intelligent systems
to support the synergy between human and machine [3]. This will see a shift away from
CPPS, and more towards human-cyber-physical systems [16]. There are many definitions
of Industry 5.0, as outlined in Table 1 [17]. Bringing the human into the loop
provides more fault-tolerance capabilities than fully automated systems on their own [17].
Human-centric manufacturing systems also have the value of improving overall manufac-
turing system performance, while also improving human well-being [17]. Industry 5.0 is
co-existing with Industry 4.0, and can be considered a Techno-Social revolution, that is, a
technology-enabled revolution driven by societal needs [15].

6.1.1. Operator 4.0


The Operator 4.0 paradigm represents a significant shift in the role and capabilities
of human operators in industrial settings. It emphasizes the creation of smart operator
workspaces by harnessing advanced technologies, including sensors, wearables, and sophis-
ticated algorithms. Operator 4.0 relies on the integration of advanced sensor technologies
within the operator’s workspace. These sensors can include a wide range of devices, such
as proximity sensors, motion detectors, and environmental sensors. They enable real-time
monitoring of the operator’s surroundings, collecting data on factors like temperature, hu-
midity, air quality, and equipment conditions. This data is crucial for ensuring the operator’s
safety, optimizing working conditions, and identifying potential issues in the environment.
Wearable devices play a pivotal role in Operator 4.0. These wearables can include smart
helmets, glasses, gloves, exoskeletons, and other wearable technology. They are equipped
with sensors, cameras, and communication capabilities. Wearables provide operators with
augmented reality (AR) displays for enhanced visualization of tasks, hands-free access to
information, and real-time communication with colleagues and control systems. This tech-
nology empowers operators with actionable data and support, making their tasks more effi-
cient and safe. Advanced algorithms and artificial intelligence (AI) systems are at the core
of Operator 4.0. They process the vast amount of data generated by sensors and wearables,
making sense of it in real-time. These algorithms can detect anomalies, predict equipment
failures, optimize workflows, and provide decision support to operators. Machine learning
models can adapt to changing conditions, improving productivity and reducing errors. The
primary goal of Operator 4.0 is to enhance both safety and efficiency in industrial environ-
ments. By equipping operators with wearable technology and smart workspaces, the risk
of accidents and injuries can be minimized. Operators receive immediate warnings about
potential hazards and can take preventive actions. Additionally, the efficiency of tasks is im-
proved, as operators have access to real-time information, reducing downtime and optimizing
processes. [77][78]
Operator 4.0 also fosters collaboration among operators, as they can communicate seam-
lessly through wearables and share insights and expertise. This collaborative approach
enhances problem-solving capabilities and accelerates decision-making in complex manufacturing scenarios.

Figure 11: Leveraging Computer-Aided Design (CAD) models in the system development process, involving the utilization of precise 3D representations to design and plan intricate systems with a focus on accuracy and detail. Derived from [76]

Figure 11 illustrates the utilization of Computer-Aided Design (CAD)
models in the system development process. CAD models provide precise 3D representations
used to design and plan intricate systems with a focus on accuracy and detail. In this depic-
tion, CAD technology enables engineers and designers to create virtual prototypes, iterate
designs, and simulate real-world conditions before physical implementation. By leverag-
ing CAD models, stakeholders can optimize system performance, streamline manufacturing
processes, and minimize costly errors. [79] This visualization highlights the integral role of
CAD in modern engineering and design workflows, driving innovation and efficiency across
various industries.

6.1.2. Society 4.0


Society 4.0, often referred to as the "Fourth Industrial Revolution of Society," extends the
principles of Industry 4.0 into broader societal contexts, emphasizing the fusion of digital,
physical, and biological systems to transform various aspects of human life. This concept
reflects the profound impact of advanced technologies and digitalization on society as a
whole. Here are some key aspects and characteristics of Society 4.0:
1. Digital Transformation: Society 4.0 signifies the pervasive influence of digital
technologies in nearly every facet of human society. It involves the extensive use of data,
connectivity, and smart systems to enhance the quality of life, governance, and services.
2. Interconnectivity: Just as Industry 4.0 promotes the connectivity of machines and
systems in manufacturing, Society 4.0 emphasizes the interconnectedness of people, devices,
and infrastructure. The Internet of Things (IoT), 5G networks, and other technologies
enable seamless communication and data exchange. [80][81]
3. Smart Cities: One of the prominent applications of Society 4.0 is the development of
smart cities. These urban environments leverage technology to optimize resource manage-
ment, transportation, energy consumption, and public services. Sensors and data analytics
play a central role in creating more efficient and sustainable cities.
4. Data-Driven Decision-Making: Society 4.0 relies on data analytics and artificial
intelligence to inform decision-making processes. This data-driven approach extends to
various sectors, including healthcare, education, transportation, and governance, to enhance
efficiency and effectiveness.
5. Personalization: Just as Industry 4.0 enables mass customization in manufacturing,
Society 4.0 emphasizes personalization in various services and experiences. Algorithms and
AI systems tailor content, recommendations, and services to individual preferences and
needs.
6. Human Augmentation: Society 4.0 explores technologies that enhance human
capabilities, such as wearable devices, augmented reality (AR), virtual reality (VR), and
brain-computer interfaces. These technologies can improve healthcare, education, and en-
tertainment. [81]
7. Challenges: While Society 4.0 offers numerous opportunities, it also poses challenges
related to privacy, cybersecurity, ethics, and social inequality. The collection and use of vast
amounts of data raise concerns about data protection and surveillance.

8. Global Impact: Society 4.0 transcends national borders and has a global impact. It
promotes international collaboration and knowledge sharing, as solutions to complex global
challenges often require collective efforts.
9. Sustainability: Sustainability is a key consideration in Society 4.0, with an emphasis
on eco-friendly technologies and practices. Efforts are made to reduce environmental impact
and address climate change through digital solutions.
10. Inclusive Growth: A central goal of Society 4.0 is to achieve inclusive growth and
address societal disparities. Initiatives focus on bridging the digital divide, ensuring that
technology benefits all segments of the population.

Figure 12: Utilizing a digital twin within an HRC system, where a virtual replica of the physical environment and robotic components is created, allowing for real-time monitoring, analysis, and simulation to enhance human-robot interaction and system performance. Derived from [80]
Figure 12 demonstrates the utilization of a digital twin within a Human-Robot Col-
laboration (HRC) system. In this depiction, a virtual replica of the physical environment
and robotic components is created, enabling real-time monitoring, analysis, and simulation.
The digital twin facilitates enhanced human-robot interaction and system performance by
providing a virtual platform for testing and optimization. Through the digital twin, stake-
holders can simulate various scenarios, assess potential risks, and implement optimizations
before deployment in the physical environment. This visualization underscores the trans-
formative potential of digital twins in improving the efficiency, safety, and effectiveness of
HRC systems in diverse industrial applications.

6.2. Conception of the Human in Manufacturing


A "human" in the context of manufacturing refers to human workers, operators, and
professionals who are involved in various aspects of the manufacturing process. Humans
have traditionally played a central role in manufacturing, providing skills, expertise, decision-
making capabilities, and physical labor. Here’s how humans fit into current manufacturing
processes:
1. Skilled Labor: Humans bring a wide range of skills to manufacturing, including
craftsmanship, problem-solving abilities, creativity, and adaptability. Skilled human workers
are essential for tasks that require precision, attention to detail, and manual dexterity, such
as assembling complex products, performing quality inspections, and fine-tuning machinery.
[82][83]
2. Decision-Making: Humans are responsible for critical decision-making in manufac-
turing. They make decisions related to process optimization, production scheduling, quality
control, and resource allocation. Human expertise is crucial for addressing unforeseen chal-
lenges and adapting to changing production requirements.
3. Monitoring and Quality Control: Humans are involved in monitoring manu-
facturing processes and ensuring product quality. They use their sensory perception and
judgment to identify defects, inconsistencies, and deviations from quality standards. Hu-
man inspectors play a vital role in maintaining high product quality. [84]
4. Maintenance and Repairs: Manufacturing equipment and machinery require regu-
lar maintenance and occasional repairs. Skilled human technicians and maintenance person-
nel are responsible for diagnosing issues, conducting maintenance activities, and restoring
equipment to optimal operating conditions.


5. Adaptability: Humans are inherently adaptable and can quickly respond to unex-
pected events and changes in production requirements. They can shift roles, reconfigure
processes, and troubleshoot problems on the fly, making them valuable assets in dynamic
manufacturing environments. [85]
6. Collaboration: Humans collaborate with each other and with machines in man-
ufacturing settings. Human-machine collaboration (HMC) and Human-Robot Collabora-
tion (HRC) are emerging trends that leverage human capabilities alongside automation and
robotics. This collaborative approach aims to improve efficiency and safety in manufactur-
ing.
7. Problem Solving: Manufacturing often presents complex challenges that require
problem-solving skills. Humans analyze production issues, identify root causes, and imple-
ment solutions to enhance efficiency, reduce waste, and address bottlenecks.
8. Continuous Improvement: Humans are at the forefront of continuous improve-
ment efforts in manufacturing. They participate in lean manufacturing practices, Six Sigma
initiatives, and other methodologies to streamline processes, reduce defects, and optimize
resource utilization. [86]
9. Innovation: Human creativity and innovation drive the development of new man-
ufacturing technologies and processes. Engineers, designers, and researchers contribute to
the invention of advanced manufacturing techniques, materials, and products.
While automation and Industry 4.0 technologies have introduced greater levels of au-
tomation and digitalization into manufacturing, humans remain integral to the process.
The challenge lies in finding the right balance between automation and human involvement,
with a focus on tasks that leverage human strengths, while automating repetitive, haz-
ardous, or data-intensive tasks. As manufacturing continues to evolve, the role of humans
will adapt and expand, and there will be a growing emphasis on human-centric manufactur-
ing approaches in Industry 5.0, where humans and machines work in symbiosis to achieve
sustainable and efficient production systems. Using both economic and psychological con-
ceptions of humans, Bitsch defines seven different levels where humans can be integrated
into CPPS [20], summarised in Table 1. By considering these levels, it is expected that
human-centric manufacturing systems will better meet the requirements of human-centric
CPPSs. Figure 13 offers a comprehensive exploration of the forms and evolutionary phases
of a digital twin. It spans its inception in the design phase, role in manufacturing, engage-
ment in operations, and significance in maintenance and end-of-life considerations. This
visualization highlights the dynamic nature of digital twins throughout the product lifecy-
cle, underscoring their transformative impact on product development, manufacturing, and
lifecycle management.

Figure 13: The comprehensive exploration of the forms and evolutionary phases of a digital twin, encompassing its inception in the design phase, its role during manufacturing, its dynamic engagement in the operational phase, and its continued significance in maintenance and end-of-life considerations. Derived from [84]

6.3. Ergonomics
When discussing the role of humans in the manufacturing process, it is essential to
pay careful attention to the concept of ergonomics. Ergonomics is a multidisciplinary field
that focuses on designing and arranging the elements of a system to optimize human well-
being, performance, and overall interaction with their work environment. In the context
of manufacturing, it plays a critical role in ensuring that human workers can perform their
tasks efficiently, comfortably, and without risking their health. Economics has traditionally
been a driving force behind the design and organization of operator workstations within
the manufacturing industry. This is because optimizing the work environment for human
operators not only enhances their performance but also contributes to cost savings and
productivity improvements. Several key aspects of ergonomics come into play in this context:
1. Furniture Design: Elements such as the height of chairs, tables, and screens are
carefully considered to align with human physiology and anthropometric measurements. Er-
gonomically designed furniture ensures that workers can maintain a comfortable and healthy
posture while performing their tasks, reducing the risk of musculoskeletal disorders and dis-
comfort.
2. Efficiency and Productivity: Ergonomically designed workstations are not just
about comfort; they also lead to increased efficiency and productivity. When the workstation
layout and equipment are optimized for human use, workers can perform their tasks with
less fatigue and fewer errors. This, in turn, leads to higher quality output and reduced
downtime. [87][88]
3. Safety: Ensuring that the work environment aligns with ergonomic principles is
crucial for the safety of workers. Inappropriate workstation design can lead to accidents and
injuries. Proper ergonomics reduces the risk of accidents and improves the overall safety of
the manufacturing process.
4. Well-being: Ergonomics goes beyond physical comfort; it also considers the psy-
chological well-being of workers. A well-designed workstation can lead to increased job
satisfaction and reduced stress. This, in turn, can have a positive impact on the morale and
motivation of the workforce.
5. Adaptability: Ergonomics is not a one-size-fits-all concept. It recognizes the diver-
sity in human characteristics and ensures that workstations can be adapted to accommodate
workers with various needs, such as those with disabilities or differing physical characteris-
tics.
When integrating humans into the manufacturing process, especially in the context of
"human in the loop," ergonomics becomes an integral part of the design process. It ensures
that the interaction between humans and machines is optimized for both efficiency and the
well-being of the workers. [89] In an era of increasing automation and human-robot collab-
oration, paying attention to ergonomics is not just a matter of comfort; it is a fundamental
component of creating a safe, productive, and sustainable manufacturing environment.

6.4. Human Robot Collaboration


HRC represents a significant paradigm shift in the world of manufacturing, aiming to
maximize the synergies between humans and robots. It capitalizes on the strengths of both
parties, blending the precision and accuracy of robotics with the cognition, adaptability, and
flexibility of human workers [21]. The evolution of HRC research has seen distinct phases,
progressing from HRI to HRC, and is currently venturing into the realms of symbiotic and
proactive robotics [17], [22]. Shi et al. [90] have provided a framework for understanding
the levels of human-robot collaboration, categorizing it into low, medium, and high collabo-
ration. In scenarios characterized by low collaboration, humans do not interact directly with
the robot’s end effector, nor do they feed parts directly to the robot. Instead, they may load
parts into a fixture, which then moves them into the robot’s working zone for processing.
In medium interactions, humans can load parts directly onto the robot’s end effector, but
the robot remains de-energized, fully extended, and motionless within the human’s working
range. High collaboration signifies a state where one or more humans share the working range
of the robot. In this scenario, the robot’s motors are energized, allowing it to move while
humans are present within the range. Additionally, the robot’s movements may be adjusted
based on sensor inputs or real-time communication with the human operator. Central to
HRC is the concept of a shared, collaborative workspace, wherein humans and robots work
in tandem on simultaneous tasks, often sharing the same physical area while ensuring safety
through various mechanisms [91]. This collaborative approach has the potential to improve
manufacturing efficiency, enhance product quality, and create safer working environments.
One crucial aspect of ensuring safety in HRC is the concept of "safe zoning." Safe zon-
ing involves defining specific areas within the collaborative workspace where the robot and
human can operate without endangering each other. Advanced sensors and control systems
are employed to monitor these zones, ensuring that the robot’s movements are controlled to
prevent collisions and maintain the safety of human operators. Moreover, gesture control is
emerging as a promising method for facilitating human-robot collaboration. By using intu-
itive gestures and commands, humans can interact with robots and convey their intentions
effectively. This technology enables seamless communication and coordination between hu-
mans and robots in shared workspaces, further enhancing the collaborative nature of HRC.
As HRC continues to evolve, researchers and manufacturers are exploring innovative ways to
optimize the interaction between humans and robots, aiming to unlock the full potential of
this collaborative approach for advanced manufacturing processes. Figure 14 showcases the
process of recording and capturing data from a collaborative robot (cobot) operating within
a manufacturing environment. Sensors embedded in the cobot monitor its movements, inter-
actions, and performance metrics in real-time. This data logging process provides valuable
insights into the cobot’s behavior and performance, facilitating analysis, optimization, and
troubleshooting. By visualizing this process, stakeholders can enhance cobot functionality
and productivity in manufacturing operations.

Figure 14: Recording and capturing data from the collaborative robot (cobot) as it operates within the manufacturing environment. This data logging process includes collecting information related to the cobot’s movements, interactions, performance metrics, and any relevant operational parameters. Derived from [91]

6.5. Human Robot Safety


Ensuring the safety of humans working alongside robots in collaborative environments is
paramount, and several international standards have been established to address these safety
concerns. ISO 10218-1 [24] lays out safety requirements for robots and robotic devices, while
ISO 10218-2 [25] focuses on safety requirements for robot systems and integration; both were
published in 2011. These standards provide a comprehensive framework for safeguarding humans
during robot operations. In 2016, ISO/TS 15066:2016 [26] was introduced to offer guidance
specifically for human-robot collaboration within shared working spaces. ISO 10218 sets
forth four essential requirements for a robot to be considered safe for human collaboration:
1. Safety Rated Monitored Stop: The robot must be equipped with a system that
allows it to safely come to a halt when necessary, ensuring that it can stop immediately if a
human enters its workspace.
2. Hand Guiding: This feature permits direct manual control of the robot’s movements,
allowing a human operator to guide and manipulate the robot’s actions, ensuring precise
and safe collaboration.
3. Speed and Separation Monitoring: The robot should possess monitoring capa-
bilities that enable it to detect the presence of humans within its workspace. It can then
adjust its speed and maintain a safe separation distance accordingly.
4. Power and Force Limiting by Inherent Design or Control: The robot’s design
or control system should inherently limit its power and force output, preventing it from
exerting excessive force that could pose a danger to humans.
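
As a rough illustration of requirement 3 above (speed and separation monitoring), the following Python sketch computes a simplified protective separation distance in the spirit of ISO/TS 15066 and compares it against the measured human-robot distance. The timing and speed parameters are illustrative assumptions, not normative values from the standard.

```python
# Simplified protective separation distance: distance the human can cover
# while the system reacts and stops, plus the robot's motion before reacting
# and its braking distance, plus a measurement-uncertainty margin.
def protective_separation(v_human: float, v_robot: float, t_reaction: float,
                          t_stop: float, braking_dist: float,
                          uncertainty: float = 0.1) -> float:
    s_human = v_human * (t_reaction + t_stop)  # human motion during the stop
    s_robot = v_robot * t_reaction             # robot motion before reacting
    return s_human + s_robot + braking_dist + uncertainty

def command(measured_distance_m: float) -> str:
    s_p = protective_separation(v_human=1.6, v_robot=1.0,
                                t_reaction=0.1, t_stop=0.3, braking_dist=0.2)
    return "STOP" if measured_distance_m < s_p else "RUN"

print(command(0.5))  # -> STOP
print(command(2.0))  # -> RUN
```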
Human-robot safety for Industry 5.0 is a concept that emphasizes the importance of
safety in advanced manufacturing environments where humans work alongside robots in a
collaborative and interconnected manner. Industry 5.0 represents the latest phase of in-
dustrial evolution, building upon the foundation of Industry 4.0, which is characterized by
increased automation, data exchange, and digitalization. Human-centered manufacturing
based on digital twin refers to the integration of human workers and robots within a manu-
facturing environment, where the digital twin concept plays a significant role. A digital twin
is a virtual representation of a physical object or system, which can include machines, prod-
ucts, and even the entire manufacturing facility. The digital twin concept enables real-time
monitoring, simulation, and analysis of physical systems, providing insights for optimization
and decision-making. Key considerations in this context include:
1. Safety First: The primary focus of this concept is ensuring the safety of human work-
ers who collaborate with robots in manufacturing processes. As automation and digitaliza-
tion increase, humans are working in closer proximity to robots, making safety measures
paramount. [92]
2. Collaborative Robots (Cobots): Industry 5.0 emphasizes the use of collaborative
robots (cobots) that can work side by side with human workers. These robots are designed
to be safe, flexible, and easily programmable. They can take on tasks that are repetitive,
dangerous, or physically strenuous, allowing human workers to focus on more complex and
creative aspects of their jobs.
3. Risk Assessment: To achieve safety, a thorough risk assessment is essential. Man-
ufacturers need to evaluate the potential hazards associated with human-robot interactions
and implement measures to mitigate these risks.
4. Real-time Monitoring: The use of digital twins allows for real-time monitoring
of the manufacturing process. Safety can be enhanced by using sensors and analytics to
detect potential issues or deviations from the norm, which can trigger automatic shutdowns
or safety measures to protect human workers. [93]
5. Simulation and Training: Digital twins can also be used for simulating various
scenarios and training human workers and robots. This training can help them better
understand how to work together safely and effectively.
6. Continuous Improvement: Safety is not a one-time concern but a continuous pro-
cess. With the data and insights provided by digital twins, manufacturers can continuously
improve safety protocols and make adjustments to keep up with changing manufacturing
conditions.
7. Regulatory Compliance: Industry 5.0 and human-robot collaboration bring new
regulatory challenges. Compliance with safety standards and regulations is essential to
ensure that manufacturing practices are in line with legal and ethical requirements.
It’s important to note that while a robot is a crucial component of a collaborative robotic
system, safety also depends on other factors, such as the end effector or end-of-arm tooling.
Hazards posed by sharp components or extreme temperatures, for instance, must be carefully
considered during the design of a collaborative robotic system. Most safety concerns in
Human-Robot Collaboration (HRC) revolve around the shared workspace, emphasizing the
importance of proper risk assessment and mitigation measures [27]. In shared workspaces,
human presence is expected, and safety standards dictate the need to prevent or detect
humans entering safeguarded spaces beyond the collaborative area. Several key themes are
relevant in addressing safety and efficiency concerns in HRC:
1. Obstacle Detection, Avoidance, and Trajectory Planning: To ensure safety
and prevent collisions, robotic systems utilize obstacle detection techniques such as LiDAR,
RADAR, ultrasonic sensors, or infrared motion sensors to detect obstacles in the workspace.
Trajectory planning algorithms adjust the robot’s path in real-time to avoid obstacles and
maintain a safe distance from humans or other objects [28].
2. Safety vs. Downtime: Dynamic safety zones enable robots to adapt their speed
based on the proximity of humans, minimizing downtime. This approach enhances both
safety and overall robot efficiency, as it allows the robot to operate at higher speeds when
humans are at a safe distance and slow down when they approach, without compromising
productivity.
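
The dynamic safety zones described above can be sketched as a simple mapping from the measured distance of the nearest human to a speed override. The zone boundaries and scale factors below are illustrative assumptions; in practice they would follow from a risk assessment.

```python
# Zone boundaries (metres) and speed override factors, illustrative only.
ZONES = [
    (0.5, 0.0),   # closer than 0.5 m: protective stop
    (1.5, 0.25),  # closer than 1.5 m: creep speed
    (3.0, 0.6),   # closer than 3.0 m: reduced speed
]

def speed_override(nearest_human_m: float) -> float:
    """Map the distance to the nearest detected human to a speed scale."""
    for boundary, scale in ZONES:
        if nearest_human_m < boundary:
            return scale
    return 1.0  # workspace clear: full programmed speed

for d in (0.3, 1.0, 2.0, 5.0):
    print(f"{d:.1f} m -> {speed_override(d) * 100:.0f}% speed")
```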

7. Methods of detecting humans on the manufacturing floor


Detecting humans in manufacturing environments has been a long-standing challenge,
garnering significant attention from researchers, particularly within the computer vision com-
munities [12]. In such contexts, the robustness of human detection methods is paramount.
Industrial robots pose inherent risks to humans, necessitating the development of highly
reliable human detection systems for safety applications. For instance, in scenarios where
a robot’s workspace is being monitored, the detection system must demonstrate its abil-
ity to identify all humans present in the workspace promptly. This information should be
communicated to the robot controller without delay [12]. According to recommendations
by Munaro et al. [94], the controller’s update rate should ideally reach up to 30 Hz, with
a latency of less than 0.2 seconds. These stringent requirements ensure that the system
can swiftly respond to the presence of humans and initiate the necessary safety measures
to prevent accidents or collisions. The reliability and speed of human detection systems
are essential components of creating a safe and efficient collaborative environment between
humans and robots in manufacturing settings.
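
The timing requirement above can be expressed as a fixed-rate monitoring loop that checks its own latency budget. Below is a minimal sketch, assuming a hypothetical detect_humans() pipeline and a fail-safe policy of stopping on a missed deadline.

```python
import time

PERIOD = 1.0 / 30.0   # target update rate of 30 Hz
MAX_LATENCY = 0.2     # maximum tolerated latency in seconds [94]

def detect_humans() -> list:
    """Hypothetical placeholder for a real human-detection pipeline."""
    return []

for _ in range(300):  # roughly ten seconds of monitoring
    t0 = time.monotonic()
    detections = detect_humans()
    latency = time.monotonic() - t0
    if latency > MAX_LATENCY:
        print("Latency budget exceeded: trigger a protective stop")
        break
    time.sleep(max(0.0, PERIOD - latency))  # hold the 30 Hz cadence
```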

7.1. 2D Cameras (RGB Cameras)


2D cameras, commonly referred to as RGB cameras, capture conventional color images
of a scene without any direct depth information. They are inexpensive, widely available,
and supported by mature computer vision techniques, which makes them a common starting
point for human detection on the manufacturing floor. Classical approaches extract hand-
crafted features such as Histograms of Oriented Gradients (HOG) and classify them with
methods like support vector machines, while more recent approaches rely on convolutional
neural network detectors trained on large image datasets. The principal limitation of 2D
cameras in safety applications is the absence of distance measurements: without depth,
estimating the separation between a human and a robot requires additional assumptions or
complementary sensors. Their performance is also sensitive to illumination changes, occlusion,
and cluttered backgrounds, all of which are common in industrial environments [12].
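
As a concrete example of classical 2D human detection, the sketch below uses OpenCV's built-in HOG descriptor with its default pedestrian SVM. The frame source, stride, and confidence threshold are illustrative; industrial deployments would typically tune this detector or replace it with a CNN-based one.

```python
import cv2

# Classical pedestrian detection with OpenCV's default HOG + linear SVM.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

frame = cv2.imread("workcell_frame.jpg")  # stand-in for a live camera frame
boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8), scale=1.05)
for (x, y, w, h), score in zip(boxes, weights):
    if score > 0.5:  # illustrative confidence cut-off
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
print(f"{len(boxes)} candidate human detections")
```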

7.2. Stereovision Cameras


Stereovision cameras are a type of sensor equipped with two camera sensors strategically
spaced apart. The primary function of these sensors is to perform image comparison between
the left and right sensors, enabling them to calculate the depth of the scene. However, it’s
important to note that this depth calculation process is computationally intensive and can
pose challenges in terms of processing power and efficiency [12].
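
For a rectified stereo pair, the depth calculation reduces to triangulation from disparity: Z = f·B/d, with focal length f (in pixels), baseline B, and disparity d (in pixels). A minimal worked sketch with illustrative parameter values:

```python
# Depth from disparity for a rectified stereo pair: Z = f * B / d.
def stereo_depth(x_left: float, x_right: float,
                 focal_px: float = 700.0, baseline_m: float = 0.12) -> float:
    disparity = x_left - x_right  # pixel offset between the two views
    if disparity <= 0:
        raise ValueError("non-positive disparity: no valid correspondence")
    return focal_px * baseline_m / disparity

# A feature at x=412 px (left) and x=370 px (right) -> 42 px disparity -> ~2 m.
print(f"{stereo_depth(412.0, 370.0):.2f} m")
```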

7.3. 3D Time of Flight


3D Time-of-Flight (ToF) is a technology used for depth sensing and distance measure-
ment. It operates on the principle of measuring the time it takes for light or infrared signals
to travel from a source to an object and back to a sensor. Here’s a basic introduction to the
key concepts of 3D Time-of-Flight:
1. Principle of Operation: ToF sensors emit light pulses or infrared signals toward an
object. These signals bounce off the object’s surface and return to the sensor. By measuring
the time it takes for the signals to travel to the object and back, the sensor can calculate
the distance to the object. This distance information is used to create a 3D depth map of
the scene.
2. Laser or Infrared Light: ToF sensors typically use laser diodes or infrared LEDs
to generate the light signals. These sources emit light in the form of pulses, and the sensor
records the time it takes for each pulse to return.

3. Time Measurement: To accurately measure the time-of-flight, the sensor uses a
high-speed electronic shutter. This shutter opens and closes rapidly, allowing the sensor to
precisely record the time delay between the emitted signal and its return. [96]
4. Depth Calculation: By knowing the speed of light (approximately 299,792,458
meters per second) and the time it took for the signal to bounce back, the sensor calculates
the distance to the object. This process is repeated many times, creating a 3D point cloud
or depth map of the objects in the sensor’s field of view (a worked example follows this list).
5. Applications: 3D Time-of-Flight technology has a wide range of applications. It is
commonly used in robotics for obstacle avoidance and navigation, in automotive for driver
assistance systems, and in consumer electronics for features like facial recognition and gesture
control. It’s also valuable in industrial settings for quality control and object detection.
6. Advantages: ToF sensors offer several advantages, including fast and real-time depth
sensing, accuracy, and the ability to work in various lighting conditions. They are compact
and suitable for integration into various devices. [97]
7. Challenges: While ToF technology is powerful, it may have limitations in terms of
long-range sensing and accuracy in certain conditions, such as extremely bright sunlight.
Additionally, the resolution of ToF sensors may vary, impacting the level of detail in the
generated 3D maps.
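A minimal sketch of the depth calculation in item 4 above, assuming an idealized pulsed sensor with no noise or multipath effects:

```python
# Minimal sketch: converting a measured round-trip time to distance,
# assuming an ideal pulsed ToF sensor.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_time_s: float) -> float:
    """Light travels to the target and back, so the one-way
    distance is half the round-trip path."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# A 20 ns round trip corresponds to roughly 3 meters.
print(tof_distance(20e-9))  # ~2.998 m
```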

7.4. RGB-D Camera


RGB-D cameras, also known as depth cameras or 3D cameras, are a type of sensor that
combines traditional RGB (color) imaging with depth perception capabilities. They provide
both color information and depth information in a single device. Here’s an introduction to
what RGB-D cameras are and how they work:
1. RGB and Depth Information: RGB-D cameras capture two essential types of
information simultaneously. These cameras capture traditional color images, just like a
standard digital camera. RGB data consists of red, green, and blue color channels, and it
provides information about the visual appearance of objects in the scene. In addition to color
information, RGB-D cameras capture depth data. Depth data measures the distance from
the camera sensor to objects in the scene, providing a 3D representation of the environment.
[98]
2. Depth Sensing Technologies: RGB-D cameras use various technologies to measure
depth. The most common methods include:
- Time-of-Flight (ToF): Similar to 3D Time-of-Flight sensors, ToF RGB-D cameras emit
light signals (typically infrared) and measure the time it takes for the signals to bounce off
objects and return to the sensor. This time delay is used to calculate depth.
- Structured Light: Some RGB-D cameras project a pattern of structured light onto the
scene. By analyzing how this pattern deforms when it interacts with objects, the camera
calculates depth information.
- Stereo Vision: RGB-D cameras with stereo vision use two camera lenses to capture images
from slightly different viewpoints. By comparing these images, the camera can triangulate
depth information.

3. Applications: RGB-D cameras have a wide range of applications across industries,
including:
- Computer Vision: They are used in computer vision tasks like object recognition, track-
ing, and scene understanding.
- Robotics: RGB-D cameras help robots navigate and interact with their surroundings,
avoiding obstacles and grasping objects.
- Gaming: Many gaming consoles and devices use RGB-D cameras for motion sensing,
gesture control, and augmented reality experiences. [99]
- Augmented Reality (AR) and Virtual Reality (VR): AR and VR applications ben-
efit from RGB-D cameras by providing realistic 3D environments and interactions.
- Industrial Automation: RGB-D cameras are used for quality control, object detection,
and assembly line automation.
4. Advantages: RGB-D cameras offer several advantages, including real-time depth
sensing, versatility in various lighting conditions, and the ability to capture rich 3D data.
They are also relatively compact and can be integrated into a wide range of devices.
5. Challenges: While RGB-D cameras are powerful, they may have limitations in
terms of range and accuracy, especially in outdoor or bright environments. Additionally,
the quality of depth data can vary among different camera models.
In summary, RGB-D cameras are sensor devices that capture both color (RGB) and
depth information from the surrounding environment. They are valuable tools for a broad
spectrum of applications, from computer vision and robotics to gaming and augmented
reality, enabling enhanced perception and interaction with the physical world.
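To illustrate how a single RGB-D depth value becomes a 3D point, the sketch below back-projects one pixel through a pinhole camera model; the intrinsic parameters are illustrative assumptions rather than values from any particular camera.

```python
# Minimal sketch: back-projecting one depth pixel into a 3D point with a
# pinhole camera model, assuming known intrinsics (fx, fy, cx, cy) and a
# depth value in meters at pixel (u, v).
def deproject(u: int, v: int, depth_m: float,
              fx: float, fy: float, cx: float, cy: float):
    """Return the (X, Y, Z) camera-frame coordinates of the pixel."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

# Example with illustrative intrinsics for a 640x480 depth stream
print(deproject(320, 240, 1.2, fx=525.0, fy=525.0, cx=319.5, cy=239.5))
```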

7.5. LiDAR
LiDAR (Light Detection and Ranging) is an invaluable tool for object detection and
environmental mapping within the manufacturing context. LiDAR sensors operate by emit-
ting laser pulses and measuring the time it takes for these pulses to bounce off surfaces and
return to the sensor. This information is then used to generate detailed point clouds in both
2D and 3D formats, where each point corresponds to a surface that the laser has interacted
with. In manufacturing environments, LiDAR serves various critical purposes:
1. Obstacle Detection and Collision Avoidance: LiDAR sensors are adept at detect-
ing objects, machinery, and potential obstacles in real-time. They are frequently employed
on mobile robotic platforms, ensuring safe navigation and proactive collision avoidance as
robots move through dynamic manufacturing spaces.
2. Simultaneous Localization and Mapping (SLAM): LiDAR plays a pivotal role
in SLAM, a fundamental technique for mobile robotic platforms. By continuously measuring
distances to surrounding objects and surfaces, LiDAR sensors help robots simultaneously
create detailed maps of their environment and determine their precise location within those
maps. This capability is indispensable for autonomous robotic systems, enabling them to
navigate and perform tasks with accuracy. [100][101]
3. Quality Control and Inspection: LiDAR’s ability to capture precise 3D data
makes it invaluable for quality control and inspection processes within manufacturing. It

can be used to assess the dimensions, shapes, and surface conditions of products, ensuring
they meet stringent quality standards.
4. Real-time Monitoring: LiDAR sensors can provide real-time monitoring of manu-
facturing processes. By continuously scanning the environment, they enable quick detection
of any deviations or anomalies, allowing for immediate corrective actions and enhancing
process efficiency.
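As a minimal illustration of how raw scan measurements become the 2D point clouds described above, the sketch below converts polar range-bearing readings into Cartesian points in the sensor frame; the scan values are synthetic.

```python
import math

# Minimal sketch: converting a 2D LiDAR scan (range, bearing) into
# Cartesian points in the sensor frame; the scan values are synthetic.
def scan_to_points(ranges_m, angle_min_rad, angle_step_rad):
    points = []
    for i, r in enumerate(ranges_m):
        theta = angle_min_rad + i * angle_step_rad
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# A 5-beam scan sweeping -10 to +10 degrees in front of the sensor
pts = scan_to_points([2.0, 2.1, 2.05, 2.2, 2.0],
                     math.radians(-10), math.radians(5))
print(pts)
```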

7.6. Radar
RADAR (Radio Detection and Ranging) is another valuable sensing technology used in
manufacturing environments for various applications. Unlike LiDAR, which relies on laser
pulses, RADAR uses radio waves to detect and locate objects. Here’s some key information
about RADAR and its applications in manufacturing:
1. Working Principle: RADAR systems emit radio waves in the form of electromag-
netic signals. These signals travel through the environment until they encounter an object.
When radio waves strike an object, some of the energy is reflected back to the RADAR re-
ceiver. By analyzing the time it takes for the signal to return and its Doppler shift (the
change in frequency due to the object's motion), RADAR systems can determine the distance,
speed, and direction of objects in their vicinity (a short numerical sketch follows this list).
2. Object Detection and Tracking: In manufacturing, RADAR is often used for
object detection and tracking. It can identify moving machinery, vehicles, or personnel
within a designated area. This information is crucial for ensuring the safety of workers and
coordinating the movements of autonomous mobile robots and vehicles. [102]
3. Level Sensing: RADAR sensors can be employed for level sensing applications, such
as measuring the fill level of containers or tanks containing liquids or granular materials. This
is important in industries like chemical processing, food and beverage, and pharmaceuticals,
where precise inventory management is essential.
4. Material Characterization: RADAR technology can be used to analyze the com-
position and characteristics of materials, especially in non-destructive testing (NDT) ap-
plications. It can detect defects, voids, or inconsistencies in materials such as composites,
concrete, or metal components, helping to ensure product quality and structural integrity.
5. Environmental Monitoring: RADAR sensors can be used for monitoring environ-
mental conditions, such as measuring air humidity, rainfall, or snowfall. This data can be
valuable in optimizing manufacturing processes and maintaining safe working conditions.
6. Security and Surveillance: RADAR is employed in security systems to detect
and track intruders or unauthorized personnel in manufacturing facilities. It provides an
additional layer of security, especially in large industrial complexes.
7. Navigation and Collision Avoidance: Autonomous vehicles and robots in manu-
facturing benefit from RADAR technology for navigation and collision avoidance. It helps
them detect and respond to obstacles or other moving objects in real-time, ensuring safe
and efficient operation.
Overall, RADAR technology adds another dimension to sensing and monitoring capabilities
in manufacturing, offering reliable object detection, tracking, and environmental data
collection that can enhance safety, efficiency, and quality control in various industrial settings.
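The numerical sketch below, referenced in item 1 above, recovers a target's radial speed from a measured Doppler shift, assuming a simple monostatic radar; the carrier frequency and shift are illustrative values.

```python
# Minimal sketch: radial velocity from a radar Doppler shift, assuming a
# monostatic continuous-wave radar (values are illustrative).
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def doppler_velocity(carrier_hz: float, doppler_shift_hz: float) -> float:
    """v = (delta_f * c) / (2 * f0); the factor of 2 accounts for the
    round trip of the reflected wave."""
    return doppler_shift_hz * SPEED_OF_LIGHT / (2.0 * carrier_hz)

# A 1.6 kHz shift on a 24 GHz industrial radar -> ~10 m/s target speed
print(doppler_velocity(24e9, 1600.0))
```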
7.7. Smart Tiles
Smart Tiles is a term commonly used to refer to a type of self-adhesive wall tile designed
for easy and quick installation in various settings, such as kitchens and bathrooms. These
tiles are often used as an alternative to traditional ceramic or glass tiles because they are
generally more affordable, lightweight, and simpler to install. Here’s some information on
smart tiles:
1. Material: Smart tiles are typically made from a variety of materials, including
composite materials, gel components, and high-quality adhesive layers. These materials are
chosen for their durability and ability to withstand moisture and heat, making them suitable
for kitchen and bathroom applications.
2. Appearance: Smart tiles come in a wide range of styles, colors, and patterns,
mimicking the look of traditional tiles. They can resemble subway tiles, mosaic patterns,
or decorative designs. This variety allows homeowners and designers to choose a style that
complements their decor. [103]
3. Installation: One of the key advantages of smart tiles is their ease of installation.
They are self-adhesive, meaning they can be applied directly to a clean, flat surface without
the need for grout or special tools. This DIY-friendly installation process can save both
time and money compared to traditional tile installation.
4. Maintenance: Smart tiles are relatively low-maintenance. They are resistant to
water and heat, which makes them suitable for kitchen backsplashes and shower walls. To
clean them, simply wipe with a damp cloth or a mild cleaning solution.
5. Durability: While smart tiles are durable and long-lasting, they may not be as
resilient as traditional ceramic or glass tiles. They are best suited for areas that don’t
experience heavy wear and tear, such as kitchen backsplashes and bathroom walls. They
may not be recommended for high-traffic flooring applications.
6. Cost: Smart tiles are generally more cost-effective than traditional tiles. They are
a budget-friendly option for those looking to update the appearance of a room without a
significant investment in materials and labor.
7. Removal: If you decide to change your decor or replace the tiles, smart tiles are
usually removable without causing damage to the wall surface. This is another advantage
for DIY enthusiasts.
8. Availability: Smart tiles are readily available in home improvement stores, and there
are also many online retailers that offer a wide selection of styles and colors.

7.8. Wearables
Wearables have found a significant role in enhancing safety across various applications,
leveraging their IoT connectivity and sensor capabilities. These devices are designed to
proactively monitor, detect, and respond to safety-related concerns in real-time. Some key
safety applications of wearables include:
1. Fall Detection: Wearables equipped with accelerometers and gyroscopes can ac-
curately detect sudden falls or impacts. These devices are particularly valuable for elderly
individuals or those at risk of accidents. When a fall is detected, the wearable can send alerts
or notifications to designated contacts or emergency services, ensuring prompt assistance
(a minimal sketch follows this list).
2. Drowsiness Detection: Some wearables, especially those used in automotive and
industrial settings, incorporate sensors that monitor the wearer’s level of alertness and signs
of drowsiness. These devices can provide timely warnings to prevent accidents caused by
driver fatigue or inattentiveness. [104]
3. Environmental Monitoring: Wearables with environmental sensors can track fac-
tors like air quality, temperature, humidity, and even radiation levels. This data can be
invaluable for workers in hazardous environments, helping them make informed decisions to
protect their health and safety.
4. Emergency Alerts: Wearables can be programmed to trigger emergency alerts in
critical situations. For example, if a wearer is in distress or encounters a dangerous situation,
they can activate an SOS signal that notifies emergency responders or designated contacts
with their precise location.
5. Occupational Safety: In industrial and construction settings, wearables can monitor
workers’ vital signs, ambient conditions, and exposure to hazards. This data can be used
to assess and improve workplace safety, ensuring compliance with safety regulations and
reducing the risk of accidents.
6. Personal Security: Wearables can serve as personal security devices, offering fea-
tures like panic buttons or discreet distress signals. These devices can help individuals,
especially those in vulnerable situations, quickly summon assistance when they feel threat-
ened.
7. Location Tracking: GPS and location tracking sensors in wearables enable real-
time location monitoring. This is particularly valuable for activities like hiking, camping, or
adventure sports, ensuring that users can be located in case of emergencies or getting lost.
[105]
8. Medical Alerts: Some wearables are designed to monitor specific health conditions,
such as heart rate irregularities or glucose levels. In case of health emergencies, these devices
can send alerts to medical professionals or caregivers, enabling swift intervention.
9. Child Safety: Wearable devices for children often include features like geofencing,
allowing parents to define safe boundaries for their kids. If a child crosses these boundaries,
parents receive notifications, enhancing child safety.
10. Occupational Health and Safety (OHS): In the workplace, wearables can be
integrated into OHS programs to monitor and improve employee safety, minimize accidents,
and enhance overall well-being.
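A minimal sketch of the accelerometer-based fall detection described in item 1 above, assuming readings in g-units and purely illustrative thresholds:

```python
import math

# Minimal sketch of wearable fall detection from accelerometer samples,
# each an (ax, ay, az) tuple in g-units; thresholds are illustrative only.
FREE_FALL_G = 0.4   # total acceleration well below 1 g suggests free fall
IMPACT_G = 2.5      # a large spike afterwards suggests an impact

def magnitude(ax: float, ay: float, az: float) -> float:
    return math.sqrt(ax * ax + ay * ay + az * az)

def detect_fall(samples) -> bool:
    """Return True if a free-fall phase is followed by an impact spike."""
    free_fall_seen = False
    for ax, ay, az in samples:
        g = magnitude(ax, ay, az)
        if g < FREE_FALL_G:
            free_fall_seen = True
        elif free_fall_seen and g > IMPACT_G:
            return True
    return False

# Rest (~1 g), brief free fall (~0.2 g), then a 3 g impact
print(detect_fall([(0, 0, 1.0), (0.1, 0.1, 0.1), (2.0, 1.5, 1.5)]))  # True
```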

7.8.1. Headset wearable devices


Wearables are electronic devices designed to be worn by humans in manufacturing and
other industrial settings. These devices serve various functions, such as monitoring human
motion and capturing signals from the body. These signals are valuable for assessing the
well-being of workers and can even be used to interpret brain signals, opening up innovative
applications in manufacturing automation and safety.
1. Motion Tracking: Wearables equipped with accelerometers, gyroscopes, and other
motion sensors can precisely monitor and analyze the movements of workers on the man-
ufacturing floor. This data can be used for ergonomics analysis, assessing worker posture,
and identifying potential safety hazards.
2. Body Signals: Wearable devices can also capture physiological signals from the body,
such as heart rate, skin temperature, and muscle activity. These metrics can provide insights
into the physical condition and stress levels of workers, aiding in fatigue management and
well-being monitoring. [106]
3. Brain-Machine Interfaces (BMIs):
a. fNIRS (Functional Near-Infrared Spectroscopy): This innovative technology in-
volves using a wireless headset with fNIRS sensors to capture brain activity. It has the po-
tential to enable direct human-machine interaction on the manufacturing floor. For instance,
fNIRS can be employed to control industrial robot arms through brain signals, allowing for
more intuitive and precise control.
b. EEG (Electroencephalography): EEG sensing headsets are used to record brain
activity and have applications in brain robotics. In this context, EEG data can be ana-
lyzed to detect specific brainwave patterns associated with certain commands or intentions.
This information can then trigger predefined function blocks that serve as robot control
commands. Brain-controlled robots have the potential to enhance automation in manufac-
turing processes, making them more responsive to human intentions and reducing the need
for manual programming.
These wearable technologies not only contribute to improved safety and well-being in
manufacturing but also hold the promise of increasing efficiency and flexibility in industrial
automation [107]. By seamlessly integrating human-machine interaction through brain
signals and motion monitoring, wearables are advancing the field of Industry 4.0, where
smart manufacturing systems leverage data and technology to optimize production processes
and worker experiences. As these technologies continue to evolve, they are likely to play a
crucial role in shaping the future of manufacturing and human-robot collaboration.
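As a hedged illustration of the kind of signal feature such EEG-based interfaces might threshold, the sketch below estimates alpha-band power from a raw trace with a simple periodogram; the sampling rate and synthetic signal are assumptions, and a real BMI pipeline would involve far more preprocessing.

```python
import numpy as np

# Minimal sketch: estimating EEG alpha-band (8-13 Hz) power, the kind of
# feature a brain-controlled interface might threshold to trigger a
# predefined robot command. Illustrative only, not a real BMI.
def band_power(signal: np.ndarray, fs: float, lo: float, hi: float) -> float:
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    mask = (freqs >= lo) & (freqs <= hi)
    return float(psd[mask].sum())

fs = 250.0  # assumed EEG sampling rate in Hz
t = np.arange(0, 2.0, 1.0 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)  # 10 Hz tone
print("alpha-band power:", band_power(eeg, fs, 8.0, 13.0))
```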

7.9. Combined Sensing Systems


In recent years, there has been a growing interest in combining multiple sensing sys-
tems to enhance tracking and monitoring capabilities, particularly in applications involving
people. One notable example comes from a study by reference [11], where wearables and
3D vision systems were integrated to track individuals in various contexts. This innovative
approach leverages the strengths of both wearable technology and advanced computer vision
systems for a more comprehensive understanding of human behavior and movement.
1. Wearable Technology: Wearables, such as smartwatches, fitness trackers, and
even specialized sensors, are worn by individuals to capture various types of data. These
wearables can monitor vital signs, movement patterns, and physiological parameters. For
instance, heart rate, body temperature, and motion data can be collected in real-time,
providing insights into a person’s physical condition and activity levels.
2. 3D Vision Systems: 3D vision systems, also known as depth-sensing or depth-
perception systems, utilize technologies like LiDAR (Light Detection and Ranging) or stereo
cameras to create detailed 3D representations of the environment. These systems excel at
detecting and tracking objects, including people, with high precision and accuracy. They
can capture spatial information, recognize shapes, and estimate distances.
The integration of wearables and 3D vision systems offers several advantages and appli-
cations. By combining wearable data with 3D vision data, it becomes possible to gain a
more comprehensive understanding of a person's actions and interactions within a specific
environment, which is valuable for tracking individuals in complex or dynamic settings.
Wearables can continuously monitor health-related data while 3D vision systems track a
person's movements and behaviors; together, they can support early detection of health
issues or changes in physical activity that may indicate a health concern. In security
applications, combining wearable and 3D vision data can enhance surveillance and access
control, since suspicious or unusual behavior can be identified by analyzing both physiological
and spatial data. The integration can also enable advanced human-computer interaction
[108]: a person's gestures and movements tracked by 3D vision can be used to control
applications or devices, while wearables provide contextual information. In research or
industrial settings, combining these sensing systems offers a more comprehensive view of
how individuals interact with their surroundings, aiding in the optimization of workspaces,
workflows, and safety protocols.
The combination of wearables and 3D vision systems exemplifies the potential of merging
multiple sensing modalities to provide a richer and more nuanced understanding of human
behavior and the environment. This interdisciplinary approach has applications across
various domains, including healthcare, safety and security, human-computer interaction,
and environmental monitoring. As technology continues to advance, we can expect further
innovations in the field of combined sensing systems.
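As a toy illustration of combining the two modalities, the sketch below fuses a 3D-vision position estimate with a wearable-derived estimate using an inverse-variance weighted average; the variances stand in for real sensor noise models and are assumptions.

```python
import numpy as np

# Minimal sketch: inverse-variance fusion of two position estimates of the
# same person, one from a 3D vision system and one from a wearable IMU
# pipeline (variances are illustrative assumptions).
def fuse(est_a, var_a, est_b, var_b):
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * np.asarray(est_a) + w_b * np.asarray(est_b)) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

vision_xy, vision_var = (3.20, 1.45), 0.01      # precise but may drop out
wearable_xy, wearable_var = (3.35, 1.40), 0.09  # always on, but noisier
pos, var = fuse(vision_xy, vision_var, wearable_xy, wearable_var)
print(pos, var)  # fused estimate leans toward the lower-variance sensor
```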

7.10. Occupancy Maps


Occupancy maps are a common representation used in robotics, computer vision, and
autonomous systems to model and understand the occupancy status of space within a given
environment. They provide a way to represent and reason about the presence or absence of
objects or obstacles in a given area. An occupancy map is a 2D or 3D grid-based representa-
tion of a physical environment. Each cell in the grid represents a small region of space, and
each cell is associated with a probability value that indicates the likelihood of that region
being occupied by an obstacle or object. Occupancy maps can be binary or probabilistic.
In binary maps, cells are typically assigned values of 0 or 1, where 0 represents unoccupied
space and 1 represents occupied space. In probabilistic maps, each cell contains a continu-
ous value between 0 and 1, representing the probability of occupancy. Occupancy maps are
often generated by integrating sensor data, such as LIDAR, sonar, or camera measurements.
By processing sensor data, the map is updated over time to reflect the evolving percep-
tion of the environment. Occupancy mapping often uses a Bayesian framework to update
the probability values in the map. Bayes’ theorem is employed to combine prior beliefs
about occupancy with new sensor measurements to estimate the posterior probability of
occupancy for each cell. The resolution of the occupancy map grid can vary depending on
the application. High-resolution maps provide fine-grained details but require more memory
and computation, while low-resolution maps are more coarse but are computationally more
efficient. [109]
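A minimal sketch of the Bayesian update just described, using the common log-odds formulation with an assumed (illustrative) inverse sensor model:

```python
import numpy as np

# Minimal sketch of a probabilistic occupancy grid using log-odds updates;
# the inverse sensor model values below are illustrative assumptions.
L_OCC = np.log(0.7 / 0.3)   # log-odds increment for a "hit"
L_FREE = np.log(0.3 / 0.7)  # log-odds decrement for a "miss"

class OccupancyGrid:
    def __init__(self, rows, cols):
        # 0.0 log-odds corresponds to 0.5 probability (unknown)
        self.logodds = np.zeros((rows, cols))

    def update(self, cell, hit):
        # Bayes' rule in log-odds form: prior plus measurement evidence
        self.logodds[cell] += L_OCC if hit else L_FREE

    def probability(self):
        # Convert log-odds back to occupancy probability
        return 1.0 - 1.0 / (1.0 + np.exp(self.logodds))

grid = OccupancyGrid(10, 10)
grid.update((4, 5), hit=True)    # sensor return at this cell
grid.update((4, 4), hit=False)   # beam passed through this cell
print(grid.probability()[4, 5])  # ~0.7 after one hit
```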

Occupancy maps support a range of applications. Autonomous robots use them for path
planning, obstacle avoidance, and navigation; AR applications use them to understand and
interact with the real-world environment; simulations use them for testing and validation of
autonomous systems; and intrusion detection systems use them to monitor and detect
unauthorized access. Several challenges remain: sensor measurements can be noisy, leading
to uncertainty in occupancy estimates; continuous updates are required to maintain an
accurate representation of the environment as it changes; and high-resolution maps can be
computationally intensive and may require substantial memory. Occupancy maps are often
visualized as grayscale or color maps, where lighter shades represent lower occupancy
probabilities (unoccupied space) and darker shades represent higher occupancy probabilities
(occupied space). Occupancy maps play a crucial role in the perception and decision-making
processes of autonomous systems, allowing them to operate safely and effectively in complex
and dynamic environments. They are a fundamental component in the field of robotics and
computer vision, enabling machines to understand and navigate the world around them.

8. Discussion
Space-temporal data, also known as spatiotemporal data, is a fascinating and crucial
type of information that plays a significant role in various fields, from scientific research to
practical applications in everyday life. This type of data represents how different variables
or attributes change over both space and time. Let’s delve into a discussion about space-
temporal data, its characteristics, and its significance:
1. Characteristics of Space-Temporal Data:
- Spatial Dimension: Space-temporal data includes information related to different loca-
tions or spatial points. This spatial dimension can be represented in two or three dimensions,
depending on the application.
- Temporal Dimension: The data also incorporates the aspect of time, indicating how
the attributes or variables change over time. This can range from seconds to years, depending
on the domain.
2. Applications in Various Fields:
- Environmental Monitoring: Spatiotemporal data is crucial in monitoring and managing
environmental phenomena such as climate change, air quality, and natural disasters. For
example, it helps scientists study the movement of weather patterns and track changes in
temperature and precipitation over time and across different regions.
- Transportation and Urban Planning: In urban areas, this data is vital for traffic
management, public transportation optimization, and urban planning. It can help optimize
routes, reduce congestion, and improve the overall transportation system.
- Healthcare: Spatiotemporal data is used for disease tracking, patient monitoring, and
resource allocation [110]. For instance, it aids in tracking the spread of diseases like
COVID-19 and helps hospitals allocate resources efficiently during emergencies.
- Agriculture: Farmers use spatiotemporal data to make informed decisions about crop
planting, irrigation, and pest control. It enables precision agriculture by tailoring actions
to specific spatial and temporal conditions.
- Natural Resource Management: Governments and organizations use spatiotemporal
data to manage and conserve natural resources, such as forests, water bodies, and wildlife.
It aids in tracking deforestation, water quality changes, and wildlife migration patterns.
3. Challenges in Analyzing Space-Temporal Data:
- Data Volume: Space-temporal data can be massive due to the continuous collection
of information over large geographical areas and extended periods. Handling and process-
ing such large volumes of data require robust infrastructure and algorithms.
- Data Quality: Ensuring the quality and accuracy of spatiotemporal data can be
challenging. Errors in data collection, sensor malfunctions, and missing values can affect
the reliability of analyses.
- Data Integration: Combining different sources of space-temporal data, such as satellite
imagery, weather stations, and IoT devices, often requires complex data integration
techniques to create a comprehensive dataset.
- Analytical Complexity: Analyzing space-temporal data often involves advanced
statistical and machine learning techniques. Researchers and analysts need to account for
the spatial and temporal dependencies in the data.
4. Future Trends:
- AI and Machine Learning: Advances in artificial intelligence and machine learning
have opened up new possibilities for analyzing and predicting spatiotemporal phenomena.
Predictive modeling and anomaly detection are becoming increasingly important.
- IoT and Sensor Networks: The proliferation of IoT devices and sensor networks is
generating vast amounts of real-time space-temporal data. This trend is likely to continue,
enabling more precise monitoring and decision-making.
- Data Visualization: Visualizing space-temporal data is essential for conveying insights
to a broader audience. Interactive and immersive data visualization tools are emerging to
help individuals and organizations better understand complex spatiotemporal patterns.
Using advanced sensing technologies such as 3D RGB-D cameras, or convolutional neural
networks (CNNs) applied to 2D RGB cameras with libraries like OpenPose, it is possible
to detect and record human skeleton data [111]. These technologies enable the calculation
of the human center line coordinate (HCLC), a crucial component in human pose estimation.
One promising application of such technology is the detection of falls, especially in industrial
settings like manufacturing floors. Falls and other occupational incidents pose significant
risks in these environments, and a robust human detection system that can record human
poses offers several opportunities:
1. Fall Detection: By continuously monitoring the poses of workers on the manufac-
turing floor, the system can identify sudden and abnormal changes in body positions that
indicate a fall. This includes detecting when a worker collapses or loses balance.
2. Collision Detection: In addition to falls, the system can also be configured to
detect collisions or other sudden, unexpected movements. This can help prevent accidents
and injuries resulting from machinery or equipment collisions.
3. Immediate Alarm Trigger: When a fall or collision is detected, the system can
trigger an immediate alarm. This alarm can be both audible and visible, alerting nearby
workers and supervisors to the incident.
4. Faster Assistance: Early detection of falls or incidents is critical for providing
prompt medical assistance. For example, if a worker falls and injures themselves or expe-
riences a medical emergency like a heart attack, the system’s alarm can notify responders
quickly, potentially saving lives or minimizing the severity of injuries.
5. Upper Body Coordination: To enhance the accuracy of fall detection, it is impor-
tant to monitor the upper-body coordinates with respect to the HCLC. In the event of a
fall, the upper body's position deviates significantly from the norm, allowing for more
precise detection (a minimal sketch follows this list).
6. Data Logging and Analysis: Beyond immediate alerts, the system can log data
for later analysis. This data can be invaluable for investigating incidents, understanding
the causes of falls or collisions, and implementing preventive measures to enhance workplace
safety.
7. Continuous Monitoring: The system can provide 24/7 monitoring, ensuring that
incidents are detected even during non-working hours or when workers are alone on the
manufacturing floor.
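A hypothetical sketch of the HCLC-based fall check referenced in item 5 above, assuming skeleton keypoints normalized to [0, 1] image coordinates with y pointing down; the keypoint names and speed threshold are illustrative, not drawn from any cited system:

```python
import numpy as np

# Hypothetical sketch: fall detection from 2D skeleton keypoints (e.g., as
# produced by OpenPose), in normalized [0, 1] image coordinates.
def center_line(kp: dict) -> np.ndarray:
    """Approximate the human center line coordinate (HCLC) as the midpoint
    of the shoulder midpoint and the hip midpoint."""
    shoulders = (np.asarray(kp["l_shoulder"]) + np.asarray(kp["r_shoulder"])) / 2
    hips = (np.asarray(kp["l_hip"]) + np.asarray(kp["r_hip"])) / 2
    return (shoulders + hips) / 2

def is_fall(prev_hclc, curr_hclc, dt_s: float, speed_thresh: float = 0.8) -> bool:
    """Flag a fall when the HCLC drops faster than `speed_thresh` image
    heights per second (y grows downward, so a drop increases y)."""
    return (curr_hclc[1] - prev_hclc[1]) / dt_s > speed_thresh

frame_a = {"l_shoulder": (0.45, 0.30), "r_shoulder": (0.55, 0.30),
           "l_hip": (0.46, 0.55), "r_hip": (0.54, 0.55)}
frame_b = {"l_shoulder": (0.45, 0.62), "r_shoulder": (0.55, 0.62),
           "l_hip": (0.46, 0.80), "r_hip": (0.54, 0.80)}
print(is_fall(center_line(frame_a), center_line(frame_b), dt_s=0.2))  # True
```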
However, while the potential benefits of such a system are significant, there are also
important considerations, such as privacy concerns, data security, and the need for worker
consent. It’s crucial to implement these technologies responsibly and in compliance with
relevant regulations and ethical guidelines. Additionally, the system should be designed to
minimize false alarms and be sensitive to different working conditions and environments to
ensure its effectiveness in enhancing workplace safety.

9. Conclusion
The study underscores the critical role of human detection technology in industrial man-
ufacturing, particularly in the context of Human Robot Collaboration (HRC). It serves as
a linchpin for enhancing safety, efficiency, and compliance within modern industrial prac-
tices. Through a thorough examination of the current state of this technology, the following
conclusions can be drawn:
1. Pivotal for Safety and Efficiency: Human detection technology is not only crucial
for ensuring the safety of workers but also for optimizing industrial efficiency. By preventing
accidents and enhancing the interaction between humans and robots, it supports cost savings
and technological progress.
2. Existing Shortcomings: Despite its significance, the technology faces several chal-
lenges. Issues related to accuracy and reliability, the occurrence of false alarms, adaptability
in dynamic environments, and the cost of implementation are current shortcomings that
need to be addressed.
3. Integration in Industry 4.0: The study highlights the importance of integrating
human operators into the Industry 4.0 framework, where advanced automated manufacturing
systems redefine the roles of humans and robots. This integration presents new opportunities
and challenges that require careful examination. [112]
4. Emphasis on 3D Spatial Awareness: In the context of Industry 4.0, 3D spatial
awareness emerges as a pivotal factor for enabling efficient and secure collaboration between
humans and robots within shared workspaces. This underscores the need for advanced sensor
technologies and data processing.
5. Comprehensive Analysis: The paper offers a comprehensive analysis of HRC and

Human-Robot Interaction (HRI), shedding light on the complexities and nuances of human-
robot collaboration within modern manufacturing settings.
6. Technological Exploration: The study extensively explores various technologies
and methodologies for human detection, encompassing both established approaches and
emerging techniques. This in-depth examination reveals the evolving landscape of human
detection technology.
7. Challenges and Opportunities: The multifaceted challenges faced in integrating
humans into industrial settings are balanced with the exciting opportunities for improving
safety and efficiency. This interplay of challenges and opportunities requires ongoing research
and innovation.
8. Reshaping the Future: The paper emphasizes the potential of emerging technolo-
gies to reshape the landscape of human integration in advanced robotics. This suggests that
as technology advances, we can anticipate safer and more efficient manufacturing environ-
ments, ultimately redefining the future of industrial practices.
In conclusion, this paper underscores the significance of human detection technology and
human-robot collaboration within the Industry 4.0 paradigm. It recognizes the challenges
while highlighting the promising avenues for research and application. The future holds the
potential for a transformative impact on industrial manufacturing, making it safer, more
efficient, and adaptable to evolving needs.

10. Conflict of Interest


The authors declare that they have no conflict of interest.

11. References
1. Dalenogare, L.S., Benitez, G.B., Ayala, N.F. and Frank, A.G., 2018. The expected
contribution of Industry 4.0 technologies for industrial performance. International
Journal of production economics, 204, pp.383-394.
2. Javaid, M., Haleem, A., Singh, R.P. and Suman, R., 2023. An integrated outlook of
Cyber–Physical Systems for Industry 4.0: Topical practices, architecture, and appli-
cations. Green Technologies and Sustainability, 1(1), p.100001.
3. Nahavandi, S., 2019. Industry 5.0—A human-centric solution. Sustainability, 11(16),
p.4371.
4. Monostori, L., Kádár, B., Bauernhansl, T., Kondoh, S., Kumara, S., Reinhart, G.,
Sauer, O., Schuh, G., Sihn, W. and Ueda, K., 2016. Cyber-physical systems in manu-
facturing. Cirp Annals, 65(2), pp.621-641.
5. Singh, H., 2021. Big data, industry 4.0 and cyber-physical systems integration: A
smart industry context. Materials Today: Proceedings, 46, pp.157-162.
6. Ibarra, D., Ganzarain, J. and Igartua, J.I., 2018. Business model innovation through
Industry 4.0: A review. Procedia manufacturing, 22, pp.4-10.
7. Javaid, M., Haleem, A., Singh, R.P. and Suman, R., 2021. Substantial capabilities of
robotics in enhancing industry 4.0 implementation. Cognitive Robotics, 1, pp.58-75.
8. Dixon, J., Hong, B. and Wu, L., 2021. The robot revolution: Managerial and employ-
ment consequences for firms. Management Science, 67(9), pp.5586-5605.
9. Teiwes, J., Bänziger, T., Kunz, A. and Wegener, K., 2016, September. Identifying
the potential of human-robot collaboration in automotive assembly lines using a stan-
dardised work description. In 2016 22nd International Conference on Automation and
Computing (ICAC) (pp. 78-83). IEEE.
10. Gualtieri, L., Rauch, E. and Vidoni, R., 2021. Emerging research fields in safety
and ergonomics in industrial collaborative robotics: A systematic literature review.
Robotics and Computer-Integrated Manufacturing, 67, p.101998.
11. Amorim, A., Guimarães, D., Mendonça, T., Neto, P., Costa, P. and Moreira, A.P.,
2021. Robust human position estimation in cooperative robotic cells. Robotics and
Computer-Integrated Manufacturing, 67, p.102035.
12. Munaro, M., Lewis, C., Chambers, D., Hvass, P. and Menegatti, E., 2016. RGB-D
human detection and tracking for industrial environments. In Intelligent Autonomous
Systems 13: Proceedings of the 13th International Conference IAS-13 (pp. 1655-1668).
Springer International Publishing.
13. Chemweno, P. and Torn, R.J., 2022. Innovative safety zoning for collaborative robots
utilizing Kinect and LiDAR sensory approaches. Procedia CIRP, 106, pp.209-214.
14. Vogel-Heuser, B. and Hess, D., 2016. Guest editorial Industry 4.0–prerequisites and
visions. IEEE Transactions on automation Science and Engineering, 13(2), pp.411-413.
15. Xu, X., Lu, Y., Vogel-Heuser, B. and Wang, L., 2021. Industry 4.0 and Industry
5.0—Inception, conception and perception. Journal of manufacturing systems, 61,
pp.530-535.
16. Maddikunta, P.K.R., Pham, Q.V., Prabadevi, B., Deepa, N., Dev, K., Gadekallu, T.R.,
Ruby, R. and Liyanage, M., 2022. Industry 5.0: A survey on enabling technologies and
potential applications. Journal of Industrial Information Integration, 26, p.100257.
17. Leng, J., Sha, W., Wang, B., Zheng, P., Zhuang, C., Liu, Q., Wuest, T., Mourtzis, D.
and Wang, L., 2022. Industry 5.0: Prospect and retrospect. Journal of Manufacturing
Systems, 65, pp.279-295.
18. Longo, F., Padovano, A. and Umbrello, S., 2020. Value-oriented and ethical technology
engineering in industry 5.0: A human-centric perspective for the design of the factory
of the future. Applied Sciences, 10(12), p.4182.
19. Battini, D., Berti, N., Finco, S., Zennaro, I. and Das, A., 2022. Towards industry 5.0:
A multi-objective job rotation model for an inclusive workforce. International Journal
of Production Economics, 250, p.108619.
20. Bitsch, G., 2022. Conceptions of man in human-centric cyber-physical production
systems. Procedia CIRP, 107, pp.1439-1443.
21. Wang, L., 2022. A futuristic perspective on human-centric assembly. Journal of Man-
ufacturing Systems, 62, pp.199-201.
22. Li, S., Wang, R., Zheng, P. and Wang, L., 2021. Towards proactive human–robot
collaboration: A foreseeable cognitive manufacturing paradigm. Journal of Manufac-
turing Systems, 60, pp.547-552.
23. Wang, H., Lv, L., Li, X., Li, H., Leng, J., Zhang, Y., Thomson, V., Liu, G., Wen, X.,
Sun, C. and Luo, G., 2023. A safety management approach for Industry 5.0's human-
centered manufacturing based on digital twin. Journal of Manufacturing Systems, 66,
pp.1-12.
24. Fryman, J., 2014, June. Updating the industrial robot safety standard. In
ISR/Robotik 2014; 41st International Symposium on Robotics (pp. 1-4). VDE.
25. Mitka, E., Gasteratos, A., Kyriakoulis, N. and Mouroutsos, S.G., 2012. Safety certifi-
cation requirements for domestic robots. Safety science, 50(9), pp.1888-1897.
26. Harper, C. and Virk, G., 2010. Towards the development of international safety stan-
dards for human robot interaction. International Journal of Social Robotics, 2(3),
pp.229-234.
27. Chemweno, P., Pintelon, L. and Decre, W., 2020. Orienting safety assurance with
outcomes of hazard analysis and risk assessment: A review of the ISO 15066 standard
for collaborative robot systems. Safety Science, 129, p.104832.
28. Reddy, A., Bright, G. and Padayachee, J., 2019. A Review of Safety Methods for
Human-robot Collaboration and a Proposed Novel Approach. ICINCO (1), pp.243-
248.
29. Dian, F.J., Vahidnia, R. and Rahmati, A., 2020. Wearables and the Internet of
Things (IoT), applications, opportunities, and challenges: A Survey. IEEE access,
8, pp.69200-69211.
30. Jeong, S., Kang, S. and Chun, I., 2019, June. Human-skeleton based fall-detection
method using LSTM for manufacturing industries. In 2019 34th International Tech-
nical Conference on Circuits/Systems, Computers and Communications (ITC-CSCC)
(pp. 1-4). IEEE.
31. Arshad, I., KR, R.B., Alsamhi, S.H. and Curry, E., 2023, October. EHRCoI4: A
Novel Framework for Enhancing Human-Robot Collaboration in Industry 4.0. In 2023
3rd International Conference on Emerging Smart Technologies and Applications (eS-
marTA) (pp. 1-6). IEEE.
32. Fu, Y., Chen, J. and Lu, W., 2024. Human-robot collaboration for modular construc-
tion manufacturing: Review of academic research. Automation in Construction, 158,
p.105196.
33. Marinelli, M., 2023. From Industry 4.0 to Construction 5.0: Exploring the Path
towards Human–Robot Collaboration in Construction. Systems, 11(3), p.152.
34. Panagou, S., Neumann, W.P. and Fruggiero, F., 2023. A scoping review of human
robot interaction research towards Industry 5.0 human-centric workplaces. Interna-
tional Journal of Production Research, pp.1-17.
35. Caiazzo, C., Savković, M., Pušica, M., Nikolic, N., Milojevic, D. and Djapan, M., 2023.
Framework of a Neuroergonomic Assessment in Human-Robot Collaboration.
36. Ciccarelli, M., Papetti, A. and Germani, M., 2023. Exploring how new industrial
paradigms affect the workforce: A literature review of Operator 4.0. Journal of Man-
ufacturing Systems, 70, pp.464-483.

37. Feddoul, Y., Ragot, N., Duval, F., Havard, V., Baudry, D. and Assila, A., 2023. Ex-
ploring human-machine collaboration in industry: a systematic literature review of
digital twin and robotics interfaced with extended reality technologies. The Interna-
tional Journal of Advanced Manufacturing Technology, pp.1-16.
38. Zhang, Y., Ding, K., Hui, J., Liu, S., Guo, W. and Wang, L., 2024. Skeleton-RGB inte-
grated highly similar human action prediction in human–robot collaborative assembly.
Robotics and Computer-Integrated Manufacturing, 86, p.102659.
39. Caiazzo, C., Savkovic, M., Pusica, M., Milojevic, D., Leva, M.C. and Djapan, M.,
2023. Development of a Neuroergonomic Assessment for the Evaluation of Mental
Workload in an Industrial Human–Robot Interaction Assembly Task: A Comparative
Case Study. Machines, 11(11), p.995.
40. Khosravy, M., Gupta, N., Pasquali, A., Dey, N., Crespo, R.G. and Witkowski, O., 2023.
Human-Collaborative Artificial Intelligence Along With Social Values in Industry 5.0:
A Survey of the State-of-the-Art. IEEE Transactions on Cognitive and Developmental
Systems.
41. Alimam, H., Mazzuto, G., Tozzi, N., Ciarapica, F.E. and Bevilacqua, M., 2023. The
Resurrection of Digital Triplet: A Cognitive Pillar of Human-Machine Integration at
the Dawn of Industry 5.0. Journal of King Saud University-Computer and Information
Sciences, p.101846.
42. Di Pasquale, V., De Simone, V., Giubileo, V. and Miranda, S., 2023. A taxonomy of
factors influencing worker’s performance in human–robot collaboration. IET Collabo-
rative Intelligent Manufacturing, 5(1), p.e12069.
43. Ihrouchen, Z., Souissi, O. and Bensaid, H., 2023, October. Advancing Manufacturing
with Machine Learning: Unlocking the Potential of Reinforcement learning in Industry
4.0. In Industrial Conference of Science and Industry of the Future (ICSIF23).
44. Jacob, F., Grosse, E.H., Morana, S. and König, C.J., 2023. Picking with a robot col-
league: a systematic literature review and evaluation of technology acceptance in hu-
man–robot collaborative warehouses. Computers & Industrial Engineering, p.109262.
45. Zhang, C., Zhou, G., Ma, D., Wang, R., Xiao, J. and Zhao, D., 2023. A deep learning-
enabled human-cyber-physical fusion method towards human-robot collaborative as-
sembly. Robotics and Computer-Integrated Manufacturing, 83, p.102571.
46. Chu, C.H. and Liu, Y.L., 2023. Augmented reality user interface design and experi-
mental evaluation for human-robot collaborative assembly. Journal of Manufacturing
Systems, 68, pp.313-324.
47. Li, S., Zheng, P., Liu, S., Wang, Z., Wang, X.V., Zheng, L. and Wang, L., 2023. Proac-
tive human–robot collaboration: Mutual-cognitive, predictable, and self-organising
perspectives. Robotics and Computer-Integrated Manufacturing, 81, p.102510.
48. Chen, J., Fu, Y., Lu, W. and Pan, Y., 2023. Augmented reality-enabled human-robot
collaboration to balance construction waste sorting efficiency and occupational safety
and health. Journal of Environmental Management, 348, p.119341.
49. Yuan, G., Liu, X., Zhang, C., Pham, D.T. and Li, Z., 2023. A new heuristic algorithm
based on multi-criteria resilience assessment of human–robot collaboration disassembly
for supporting spent lithium-ion battery recycling. Engineering Applications of
Artificial Intelligence, 126, p.106878.
50. Terreran, M., Barcellona, L. and Ghidoni, S., 2023. A general skeleton-based action
and gesture recognition framework for human–robot collaboration. Robotics and Au-
tonomous Systems, 170, p.104523.
51. Ghodsian, N., Benfriha, K., Olabi, A., Gopinath, V., Talhi, E., Hof, L.A. and Arnou,
A., 2023. A framework to integrate mobile manipulators as cyber–physical systems into
existing production systems in the context of industry 4.0. Robotics and Autonomous
Systems, 169, p.104526.
52. Ni, S., Zhao, L., Li, A., Wu, D. and Zhou, L., 2023. Cross-View Human Intention
Recognition for Human-Robot Collaboration. IEEE Wireless Communications, 30(3),
pp.189-195.
53. Roda-Sanchez, L., Garrido-Hidalgo, C., García, A.S., Olivares, T. and Fernández-
Caballero, A., 2023. Comparison of RGB-D and IMU-based gesture recognition for
human-robot interaction in remanufacturing. The International Journal of Advanced
Manufacturing Technology, 124(9), pp.3099-3111.
54. Lee, M.L., Liang, X., Hu, B., Onel, G., Behdad, S. and Zheng, M., 2023. A Review of
Prospects and Opportunities in Disassembly with Human-Robot Collaboration. Jour-
nal of Manufacturing Science and Engineering, pp.1-26.
55. Li, C., Zheng, P., Yin, Y., Pang, Y.M. and Huo, S., 2023. An AR-assisted Deep
Reinforcement Learning-based approach towards mutual-cognitive safe human-robot
interaction. Robotics and Computer-Integrated Manufacturing, 80, p.102471.
56. Deng, W., Liu, Q., Zhao, F., Pham, D.T., Hu, J., Wang, Y. and Zhou, Z., 2024.
Learning by doing: A dual-loop implementation architecture of deep active learning
and human-machine collaboration for smart robot vision. Robotics and Computer-
Integrated Manufacturing, 86, p.102673.
57. Soori, M., Arezoo, B. and Dastres, R., 2023. Virtual manufacturing in industry 4.0:
A review. Data Science and Management.
58. Kampa, A., 2023. Modeling and Simulation of a Digital Twin of a Production System
for Industry 4.0 with Work-in-Process Synchronization. Applied Sciences, 13(22),
p.12261.
59. Xian, W., Yu, K., Han, F., Fang, L., He, D. and Han, Q.L., 2023. Advanced Manu-
facturing in Industry 5.0: A Survey of Key Enabling Technologies and Future Trends.
IEEE Transactions on Industrial Informatics.
60. Kwolek, B., 2023. Continuous Hand Gesture Recognition for Human-Robot Collabo-
rative Assembly. In Proceedings of the IEEE/CVF International Conference on Com-
puter Vision (pp. 2000-2007).
61. Entezari, A., Aslani, A., Zahedi, R. and Noorollahi, Y., 2023. Artificial intelligence
and machine learning in energy systems: A bibliographic perspective. Energy Strategy
Reviews, 45, p.101017.
62. Singh, N.K., Yadav, M., Singh, V., Padhiyar, H., Kumar, V., Bhatia, S.K. and Show,
P.L., 2023. Artificial intelligence and machine learning-based monitoring and design
of biological wastewater treatment systems. Bioresource technology, 369, p.128486.
63. Liu, L., Guo, F., Zou, Z. and Duffy, V.G., 2024. Application, development and future
opportunities of collaborative robots (cobots) in manufacturing: A literature review.
International Journal of Human–Computer Interaction, 40(4), pp.915-932.
64. Zafar, M.H., Langås, E.F. and Sanfilippo, F., 2024. Exploring the synergies between
collaborative robotics, digital twins, augmentation, and industry 5.0 for smart manu-
facturing: A state-of-the-art review. Robotics and Computer-Integrated Manufactur-
ing, 89, p.102769.
65. Loganathan, A. and Ahmad, N.S., 2023. A systematic review on recent advances in
autonomous mobile robot navigation. Engineering Science and Technology, an Inter-
national Journal, 40, p.101343.
66. Low, E.S., Ong, P. and Low, C.Y., 2023. A modified Q-learning path planning ap-
proach using distortion concept and optimization in dynamic environment for au-
tonomous mobile robot. Computers & Industrial Engineering, 181, p.109338.
67. Upadhyay, A., Balodi, K.C., Naz, F., Di Nardo, M. and Jraisat, L., 2023. Implementing
industry 4.0 in the manufacturing sector: Circular economy as a societal solution.
Computers & Industrial Engineering, 177, p.109072.
68. Bai, C., Zhou, H. and Sarkis, J., 2023. Evaluating Industry 4.0 technology and sus-
tainable development goals–a social perspective. International Journal of Production
Research, 61(23), pp.8094-8114.
69. Barata, J. and Kayser, I., 2023. Industry 5.0–Past, present, and near future. Procedia
Computer Science, 219, pp.778-788.
70. Su, D., Zhang, L., Peng, H., Saeidi, P. and Tirkolaee, E.B., 2023. Technical challenges
of blockchain technology for sustainable manufacturing paradigm in Industry 4.0 era
using a fuzzy decision support system. Technological Forecasting and Social Change,
188, p.122275.
71. Nouinou, H., Asadollahi-Yazdi, E., Baret, I., Nguyen, N.Q., Terzi, M., Ouazene, Y.,
Yalaoui, F. and Kelly, R., 2023. Decision-making in the context of Industry 4.0:
Evidence from the textile and clothing industry. Journal of cleaner production, 391,
p.136184.
72. Zhang, C., Wang, Z., Zhou, G., Chang, F., Ma, D., Jing, Y., Cheng, W., Ding, K.
and Zhao, D., 2023. Towards new-generation human-centric smart manufacturing in
Industry 5.0: A systematic review. Advanced Engineering Informatics, 57, p.102121.
73. Li, X., Nassehi, A., Wang, B., Hu, S.J. and Epureanu, B.I., 2023. Human-centric
manufacturing for human-system coevolution in Industry 5.0. CIRP Annals, 72(1),
pp.393-396.
74. Almeida, J.C., Ribeiro, B. and Cardoso, A., 2023. A human-centric approach to aid
in assessing maintenance from the sustainable manufacturing perspective. Procedia
Computer Science, 220, pp.600-607.
75. Coelho, P., Bessa, C., Landeck, J. and Silva, C., 2023. Industry 5.0: the arising of a
concept. Procedia Computer Science, 217, pp.1137-1144.
76. Golovianko, M., Terziyan, V., Branytskyi, V. and Malyk, D., 2023. Industry 4.0 vs.
Industry 5.0: co-existence, Transition, or a Hybrid. Procedia Computer Science, 217,
pp.102-113.
77. Gladysz, B., Tran, T.A., Romero, D., van Erp, T., Abonyi, J. and Ruppert, T., 2023.
Current development on the Operator 4.0 and transition towards the Operator 5.0:
A systematic literature review in light of Industry 5.0. Journal of Manufacturing
Systems, 70, pp.160-185.
78. Golovianko, M., Terziyan, V., Branytskyi, V. and Malyk, D., 2023. Industry 4.0 vs.
Industry 5.0: co-existence, Transition, or a Hybrid. Procedia Computer Science, 217,
pp.102-113.
79. Qahtan, S., Alsattar, H.A., Zaidan, A.A., Deveci, M., Pamucar, D., Delen, D. and
Pedrycz, W., 2023. Evaluation of agriculture-food 4.0 supply chain approaches using
Fermatean probabilistic hesitant-fuzzy sets based decision making model. Applied Soft
Computing, 138, p.110170.
80. Yao, X., Ma, N., Zhang, J., Wang, K., Yang, E. and Faccio, M., 2024. Enhancing
wisdom manufacturing as industrial metaverse for industry and society 5.0. Journal
of Intelligent Manufacturing, 35(1), pp.235-255.
81. Rehman, S.U., Giordino, D., Zhang, Q. and Alam, G.M., 2023. Twin transitions & in-
dustry 4.0: Unpacking the relationship between digital and green factors to determine
green competitive advantage. Technology in Society, 73, p.102227.
82. Zhang, C., Wang, Z., Zhou, G., Chang, F., Ma, D., Jing, Y., Cheng, W., Ding, K.
and Zhao, D., 2023. Towards new-generation human-centric smart manufacturing in
Industry 5.0: A systematic review. Advanced Engineering Informatics, 57, p.102121.
83. Xiong, Y., Tang, Y., Kim, S. and Rosen, D.W., 2023. Human-machine collaborative
additive manufacturing. Journal of Manufacturing Systems, 66, pp.82-91.
84. Haleem, A., Javaid, M., Singh, R.P., Suman, R. and Khan, S., 2023. Management 4.0:
Concept, applications and advancements. Sustainable Operations and Computers, 4,
pp.10-21.
85. Peter, O., Pradhan, A. and Mbohwa, C., 2023. Industry 4.0 concepts within the
sub–Saharan African SME manufacturing sector. Procedia Computer Science, 217,
pp.846-855.
86. Tiwari, S., Bahuguna, P.C. and Srivastava, R., 2023. Smart manufacturing and sus-
tainability: a bibliometric analysis. Benchmarking: An International Journal, 30(9),
pp.3281-3301.
87. Tetteh, E., Wang, T., Kim, J., Smith, T., Norasi, H., Van Straten, M., Lal, G.,
Chrouser, K., Shao, J.M. and Hallbeck, M.S., 2023. Optimizing ergonomics during
open, laparoscopic, and robotic-assisted surgery: A review of surgical ergonomics lit-
erature and development of educational illustrations. The American journal of surgery.
88. Radin Umar, R.Z., Tiong, J.Y., Ahmad, N. and Dahalan, J., 2023. Development of
framework integrating ergonomics in Lean’s Muda, Muri, and Mura concepts. Pro-
duction Planning & Control, pp.1-9.
89. Liao, L., Liao, K., Wei, N., Ye, Y., Li, L. and Wu, Z., 2023. A holistic evaluation of
ergonomics application in health, safety, and environment management research for
construction workers. Safety science, 165, p.106198.
90. Li, W., Hu, Y., Zhou, Y. and Pham, D.T., 2023. Safe human–robot collaboration for
industrial settings: a survey. Journal of Intelligent Manufacturing, pp.1-27.
91. Chen, J., Fu, Y., Lu, W. and Pan, Y., 2023. Augmented reality-enabled human-robot
collaboration to balance construction waste sorting efficiency and occupational safety
and health. Journal of Environmental Management, 348, p.119341.
92. Li, C., Zheng, P., Yin, Y., Pang, Y.M. and Huo, S., 2023. An AR-assisted Deep
Reinforcement Learning-based approach towards mutual-cognitive safe human-robot
interaction. Robotics and Computer-Integrated Manufacturing, 80, p.102471.
93. Yu, H., Kamat, V.R., Menassa, C.C., McGee, W., Guo, Y. and Lee, H., 2023. Mu-
tual physical state-aware object handover in full-contact collaborative human-robot
construction work. Automation in Construction, 150, p.104829.
94. Wan, P.K. and Leirmo, T.L., 2023. Human-centric zero-defect manufacturing: State-
of-the-art review, perspectives, and challenges. Computers in Industry, 144, p.103792.
95. Javaid, M., Haleem, A., Singh, R.P. and Suman, R., 2023. An integrated outlook of
Cyber–Physical Systems for Industry 4.0: Topical practices, architecture, and appli-
cations. Green Technologies and Sustainability, 1(1), p.100001.
96. Wang, B., Zhou, H., Li, X., Yang, G., Zheng, P., Song, C., Yuan, Y., Wuest, T., Yang,
H. and Wang, L., 2024. Human Digital Twin in the context of Industry 5.0. Robotics
and Computer-Integrated Manufacturing, 85, p.102626.
97. Malburg, L., Hoffmann, M. and Bergmann, R., 2023. Applying MAPE-K control
loops for adaptive workflow management in smart factories. Journal of Intelligent
Information Systems, 61(1), pp.83-111.
98. Singh, S.A., Kumar, A.S. and Desai, K.A., 2023. Comparative assessment of common
pre-trained CNNs for vision-based surface defect detection of machined components.
Expert Systems with Applications, 218, p.119623.
99. Bärnthaler, R. and Gough, I., 2023. Provisioning for sufficiency: envisaging production
corridors. Sustainability: Science, Practice and Policy, 19(1), p.2218690.
100. Rivera, G., Porras, R., Florencia, R. and Sánchez-Solís, J.P., 2023. LiDAR applications
in precision agriculture for cultivating crops: A review of recent advances. Computers
and Electronics in Agriculture, 207, p.107737.
101. Chen, R., Shu, H., Shen, B., Chang, L., Xie, W., Liao, W., Tao, Z., Bowers, J.E.
and Wang, X., 2023. Breaking the temporal and frequency congestion of LiDAR by
parallel chaos. Nature Photonics, 17(4), pp.306-314.
102. Kim, S.H., Lee, S.Y., Zhang, Y., Park, S.J. and Gu, J., 2023. Carbon-Based Radar Ab-
sorbing Materials toward Stealth Technologies. Advanced Science, 10(32), p.2303104.
103. Bahmanziari, S. and Zamani, A.A., 2024. A new framework of piezoelectric smart
tiles based on magnetic plucking, mechanical impact, and mechanical vibration force
mechanisms for electrical energy harvesting. Energy Conversion and Management,
299, p.117902.
104. Sun, Y., Li, Y.Z. and Yuan, M., 2023. Requirements, challenges, and novel ideas for
wearables on power supply and energy harvesting. Nano Energy, 115, p.108715.
105. Powell, D. and Godfrey, A., 2023. Considerations for integrating wearables into the
everyday healthcare practice. NPJ Digital Medicine, 6(1), p.70.

106. Spagnolo, F., Corsonello, P., Frustaci, F. and Perri, S., 2023, September. Approximate
Foveated-Based Super Resolution Method for Headset Displays. In Annual Meeting
of the Italian Electronics Society (pp. 338-344). Cham: Springer Nature Switzerland.
107. Goh, T.L. and Peh, L.S., 2024. WalkingWizard-A truly wearable EEG headset for
everyday use. ACM Transactions on Computing for Healthcare.
108. Wan, W., Sun, W., Zeng, Q., Pan, L. and Xu, J., 2024, January. Progress in ar-
tificial intelligence applications based on the combination of self-driven sensors and
deep learning. In 2024 4th International Conference on Consumer Electronics and
Computer Engineering (ICCECE) (pp. 279-284). IEEE.
109. Li, P.Z.X., Karaman, S. and Sze, V., 2024. GMMap: Memory-Efficient Continuous
Occupancy Map Using Gaussian Mixture Model. IEEE Transactions on Robotics.
110. Abubakar, I.R. and Alshammari, M.S., 2023. Urban planning schemes for developing
low-carbon cities in the Gulf Cooperation Council region. Habitat International, 138,
p.102881.
111. Rahman, M.M. and Thill, J.C., 2023. Impacts of connected and autonomous vehicles
on urban transportation and environment: A comprehensive review. Sustainable Cities
and Society, p.104649.
112. Semeraro, F., Griffiths, A. and Cangelosi, A., 2023. Human–robot collaboration and
machine learning: A systematic review of recent research. Robotics and Computer-
Integrated Manufacturing, 79, p.102432.
113. Robinson, N., Tidd, B., Campbell, D., Kulić, D. and Corke, P., 2023. Robotic vision
for human-robot interaction and collaboration: A survey and systematic review. ACM
Transactions on Human-Robot Interaction, 12(1), pp.1-66.
114. Li, S., Zheng, P., Liu, S., Wang, Z., Wang, X.V., Zheng, L. and Wang, L., 2023. Proac-
tive human–robot collaboration: Mutual-cognitive, predictable, and self-organising
perspectives. Robotics and Computer-Integrated Manufacturing, 81, p.102510.
115. Zhang, M., Xu, R., Wu, H., Pan, J. and Luo, X., 2023. Human–robot collaboration
for on-site construction. Automation in Construction, 150, p.104812.
116. Lee, J., Bennett, C.C., Stanojevic, C., Kim, S., Henkel, Z., Baugus, K., Piatt, J.A.,
Bethel, C. and Sabanovic, S., 2023. Detecting cultural identity via robotic sensor
data to understand differences during human-robot interaction. Advanced Robotics,
37(22), pp.1446-1459.
117. Olugbade, T., He, L., Maiolino, P., Heylen, D. and Bianchi-Berthouze, N., 2023. Touch
technology in affective human–, robot–, and virtual–human interactions: A survey.
Proceedings of the IEEE.
118. Søraa, R.A., Tøndel, G., Kharas, M.W. and Serrano, J.A., 2023. What do older adults
want from social robots? A qualitative research approach to human-robot interaction
(HRI) studies. International Journal of Social Robotics, 15(3), pp.411-424.
119. Mehak, S., Kelleher, J.D., Guilfoyle, M. and Leva, M.C., 2024. Action Recognition
for Human–Robot Teaming: Exploring Mutual Performance Monitoring Possibilities.
Machines, 12(1), p.45.
120. Su, H., Qi, W., Chen, J., Yang, C., Sandoval, J. and Laribi, M.A., 2023. Recent
advancements in multimodal human–robot interaction. Frontiers in Neurorobotics,
17, p.1084000.
121. Gongor, F. and Tutsoy, O., 2024. On the Remarkable Advancement of Assis-
tive Robotics in Human-Robot Interaction-Based Health-Care Applications: An Ex-
ploratory Overview of the Literature. International Journal of Human–Computer In-
teraction, pp.1-41.
122. Qi, J., Ma, L., Cui, Z. and Yu, Y., 2024. Computer vision-based hand gesture recog-
nition for human-robot interaction: a review. Complex & Intelligent Systems, 10(1),
pp.1581-1606.
123. Wang, D., Zhang, B., Zhou, J., Xiong, Y., Liu, L. and Tan, Q., 2024. Three-
dimensional mapping and immersive human–robot interfacing utilize Kinect-style
depth cameras and virtual reality for agricultural mobile robots. Journal of Field
Robotics.
124. Jamshad, R., Haripriyan, A., Sonti, A., Simkins, S. and Riek, L.D., 2024, March.
Taking initiative in human-robot action teams: How proactive robot behaviors af-
fect teamwork. In Companion of the 2024 ACM/IEEE International Conference on
Human-Robot Interaction (pp. 559-562).
125. Dihan, M.S., Akash, A.I., Tasneem, Z., Das, P., Das, S.K., Islam, M.R., Islam, M.M.,
Badal, F.R., Ali, M.F., Ahmed, M.H. and Abhi, S.H., 2024. Digital Twin: Data
Exploration, Architecture, Implementation and Future. Heliyon.
126. Khan, T.H., Noh, C. and Han, S., 2023. Correspondence measure: a review for the
digital twin standardization. The International Journal of Advanced Manufacturing
Technology, 128(5-6), pp.1907-1927.
127. Sharma, R. and Gupta, H., 2024. Leveraging cognitive digital twins in industry 5.0 for
achieving sustainable development goal 9: An exploration of inclusive and sustainable
industrialization strategies. Journal of Cleaner Production, p.141364.
128. Arisekola, K. and Madson, K., 2023. Digital twins for asset management: Social
network analysis-based review. Automation in Construction, 150, p.104833.
129. Jørgensen, C.S., Shukla, A. and Katt, B., 2023, September. Digital Twins in Health-
care: Security, Privacy, Trust and Safety Challenges. In European Symposium on
Research in Computer Security (pp. 140-153). Cham: Springer Nature Switzerland.
130. Masi, M., Sellitto, G.P., Aranha, H. and Pavleska, T., 2023. Securing critical infras-
tructures with a cybersecurity digital twin. Software and Systems Modeling, 22(2),
pp.689-707.
131. Soori, M., Arezoo, B. and Dastres, R., 2023. Virtual manufacturing in industry 4.0:
A review. Data Science and Management.
132. Martinetti, A., Nizamis, K., Chemweno, P., Goulas, C., van Dongen, L.A., Gibson, I.,
Thiede, S., Lutters, E., Vaneker, T. and Bonnema, G.M., 2024. More than 10 years of
industry 4.0 in the Netherlands: an opinion on promises, achievements, and emerging
challenges. International Journal of Sustainable Engineering, 17(1), pp.1-12.
133. Cole, A., 2023. The Ethics of the Personal Digital Twin. In Human Data Interaction,
Disadvantage and Skills in the Community: Enabling Cross-Sector Environments for
Postdigital Inclusion (pp. 79-92). Cham: Springer International Publishing.
134. Zhao, J., Feng, X., Tran, M.K., Fowler, M., Ouyang, M. and Burke, A.F., 2024. Bat-
tery safety: Fault diagnosis from laboratory to real world. Journal of Power Sources,
598, p.234111.
135. Mallioris, P., Aivazidou, E. and Bechtsis, D., 2024. Predictive maintenance in Industry
4.0: A systematic multi-sector mapping. CIRP Journal of Manufacturing Science and
Technology, 50, pp.80-103.
136. Teicher, U., Ben Achour, A., Selbmann, E., Demir, O.E., Arabsolgar, D., Cassina, J.,
Ihlenfeldt, S. and Colledani, M., 2023, June. The RaRe2 Attempt as a Holistic Plat-
form for Decision Support in Rapidly Changing Process Chains. In Proceedings of the
Changeable, Agile, Reconfigurable and Virtual Production Conference and the World
Mass Customization & Personalization Conference (pp. 347-356). Cham: Springer
International Publishing.
137. Tao, Y., You, S., Zhu, J. and You, F., 2024. Energy, climate, and environmental sus-
tainability of trend toward occupational-dependent hybrid work: Overview, research
challenges, and outlook. Journal of Cleaner Production, p.141083.
138. Belcher, E.J. and Abraham, Y.S., 2023. Lifecycle Applications of Building Information
Modeling for Transportation Infrastructure Projects. Buildings, 13(9), p.2300.
139. Bouabid, D.A., Hadef, H. and Innal, F., 2024. Maintenance as a sustainability
tool in high-risk process industries: A review and future directions. Journal of
Loss Prevention in the Process Industries, p.105318.
140. Kim, H., So, K.K.F., Shin, S. and Li, J., 2024. Artificial intelligence in hospitality and
tourism: Insights from industry practices, research literature, and expert opinions.
Journal of Hospitality & Tourism Research, p.10963480241229235.
141. Kiourtis, A., Mavrogiorgou, A. and Kyriazis, D., 2023, December. A Cross-Sector
Data Space for Correlating Environmental Risks with Human Health. In European,
Mediterranean, and Middle Eastern Conference on Information Systems (pp. 234-247).
Cham: Springer Nature Switzerland.
142. Harrer, S., Menard, J., Rivers, M., Green, D.V., Karpiak, J., Jeliazkov, J.R., Shapo-
valov, M.V., del Alamo, D. and Sternke, M.C., 2024. Artificial intelligence drives the
digital transformation of pharma. In Artificial Intelligence in Clinical Practice (pp.
345-372). Academic Press.
143. Ma, J., Yang, L., Wang, D., Li, Y., Xie, Z., Lv, H. and Woo, D., 2024. Digitalization
in response to carbon neutrality: Mechanisms, effects and prospects. Renewable and
Sustainable Energy Reviews, 191, p.114138.
144. Mouta, A., Pinto-Llorente, A.M. and Torrecilla-Sánchez, E.M., 2023. Uncovering blind
spots in education ethics: Insights from a systematic literature review on artificial
intelligence in education. International Journal of Artificial Intelligence in Education,
pp.1-40.
List of Tables
1 Review Methodology and Findings: Human Detection Technology in Industrial Manufacturing for Human-Robot Collaboration
2 Principles of Human-Robot Collaboration: Description, Challenges, and Mitigation Strategies
3 Comprehensive Overview of Robot Safety: Aspects, Hazards, and Precautionary Measures
4 Advancing Safety and Interaction in Collaborative Workspaces: Strategies for Control and Human-Robot Communication
5 Exploring Robotics: Studies in Robot Models, Sensor Technologies, and Human-Robot Interaction
6 Insights into Digital Twins: Sector-Specific Applications, Challenges, and Simulation Parameters
Table 1: Review Methodology and Findings: Human Detection Technology in Industrial Manufacturing for Human-Robot Collaboration
1. Title. "Human in the Loop: A Review of Human Detection Technology in Industrial Manufacturing Environments for Human Robot Collaboration." Methodology: rigorous search methodology. Search criteria: January 2009 to September 2020 in Google Scholar, Scopus, and Web of Science. Results: 9 relevant publications identified.
2. Review objective. Methodology: thorough search on prominent web-based databases. Search criteria: keywords "Human Detection Technology," "Industrial Manufacturing," "Human Robot Collaboration," and "Human-Robot Interaction in Manufacturing." Results: irrelevant literature excluded, leaving 9 pertinent publications.
3. Focus. Methodology: meticulous selection of search terms. Search criteria: excluded HRI in industrial processes, economic evaluations or feasibility studies of disassembly lines, process optimization of disassembly line designs, and design of tools for HRC applications. Results: identified literature contributing to the understanding of human detection technology in industrial manufacturing for HRC.
4. Conclusion. Methodology: comprehensive overview and analysis. Search criteria: emphasized relevance and excluded irrelevant papers. Results: aimed to provide insights into autonomous robotic disassembly systems and solutions for industrial disassembly tasks.
Table 2: Principles of Human-Robot Collaboration: Description, Challenges, and Mitigation Strategies
1. Mutual Understanding and Communication. Humans and robots should comprehend each other's intentions and communicate effectively. Example: using natural language processing for dialogue. Challenges: language barriers, ambiguity in communication. Mitigation: implementing multi-modal communication systems.
2. Safety and Trust. Safety protocols must be in place, and trust between human and robot must be established. Example: emergency stop buttons and safety sensors. Challenges: accidental collisions, mistrust in automation. Mitigation: rigorous testing, clear communication of safety features.
3. Task Allocation and Coordination. Tasks should be allocated based on capabilities, and coordination should be seamless. Example: robots handling repetitive tasks while humans oversee complex decisions. Challenges: difficulty in task handover, conflicting priorities. Mitigation: clearly defined roles, regular coordination meetings.
4. Adaptability and Flexibility. Systems should adapt to changing conditions, and flexibility should be inherent in roles. Example: robots adjusting to new environments or tasks. Challenges: resistance to change, technical limitations. Mitigation: modular system design, frequent updates and maintenance.
5. Transparency and Explainability. Operations should be transparent, and robots' actions should be explainable to humans. Example: providing logs of robot actions for review. Challenges: black-box algorithms, lack of interpretability. Mitigation: using interpretable models, providing transparency reports.
6. Shared Goals and Objectives. Humans and robots should work towards common objectives, aligning their efforts. Example: collaboratively assembling products on an assembly line. Challenges: misalignment of goals, differing priorities. Mitigation: regular goal-setting meetings, shared performance metrics.
7. Ethical Considerations and Human Empowerment. Ethical dilemmas should be addressed, and human empowerment should be prioritized. Example: ensuring robots enhance productivity rather than replace human jobs. Challenges: job displacement fears, ethical decision-making. Mitigation: ethical frameworks, involving stakeholders in decision-making.
8. Continuous Learning and Improvement. Systems should learn from experience and strive for continuous improvement. Example: using machine learning to optimize task performance. Challenges: data biases, overfitting. Mitigation: diverse datasets, regular model re-evaluation.
Table 3: Comprehensive Overview of Robot Safety: Aspects, Hazards, and Precautionary Measures
1. Risk Assessment. Conducting thorough risk assessments to identify potential hazards and assess their severity. Potential hazards: collision with humans or objects, entrapment, electrical hazards. Safety measures: utilize risk assessment tools such as ISO 12100, establish safety zones, implement safety-rated sensors.
2. Emergency Stop Systems. Implementing mechanisms for rapid shutdown in emergency situations to prevent harm. Potential hazards: delayed response, improper functioning of emergency stop buttons. Safety measures: regular testing and maintenance, redundant emergency stop buttons, clear signage.
3. Protective Barriers. Installing physical barriers to prevent access to hazardous areas during robot operation. Potential hazards: inadequate barrier design, obstruction of visibility. Safety measures: regular inspection, adherence to safety standards such as ISO 10218, warning signs.
4. Safety Interlocks. Incorporating interlock systems to ensure that certain conditions are met before operation (a minimal code sketch of this latching behavior follows the table). Potential hazards: failure of the interlock system, bypassing of safety mechanisms. Safety measures: regular maintenance, testing of interlock functionality, training on proper usage.
5. Human-Robot Collaboration. Designing robots capable of safely interacting with humans, including collision avoidance features. Potential hazards: collision with humans, misinterpretation of human gestures. Safety measures: use of collaborative robots (cobots), implementation of safety-rated sensors, employee training.
6. Training and Education. Providing comprehensive training to operators on safe robot operation and emergency procedures. Potential hazards: lack of training, operator error. Safety measures: regular training sessions, simulation of emergency scenarios, clear operating manuals.
7. Safety Standards Compliance. Ensuring that robots adhere to established safety standards and regulations. Potential hazards: non-compliance with safety standards, outdated safety features. Safety measures: regular audits, certification by recognized safety organizations, staying informed about updates to safety standards.
8. Preventive Maintenance. Conducting routine maintenance to identify and address potential safety hazards before they occur. Potential hazards: equipment malfunction due to neglect, failure to identify wear and tear. Safety measures: establishing maintenance schedules, regular inspection of equipment, recording maintenance activities.
9. Hazard Communication. Clearly communicating hazards associated with robot operation through signage and labels. Potential hazards: inadequate signage, language barriers. Safety measures: multilingual signage, use of universal symbols, regular safety briefings.
10. Continuous Improvement. Implementing processes for ongoing evaluation and improvement of robot safety measures. Potential hazards: failure to address emerging hazards, complacency. Safety measures: establishing safety committees, conducting regular safety audits, encouraging feedback from employees.
Table 4: Advancing Safety and Interaction in Collaborative Workspaces: Strategies for Control and Human-Robot Communication
1. Risk Assessment. Conducting thorough risk assessments to identify potential hazards and prioritize safety measures. Challenges: complexity of collaborative environments, dynamic risk factors. Control strategies and communication techniques: utilizing real-time risk assessment algorithms, integrating safety feedback mechanisms.
2. Human-Robot Collaboration. Designing collaborative robots capable of safely interacting with humans while maintaining productivity. Challenges: ensuring seamless interaction without compromising safety. Control strategies and communication techniques: implementing force and torque sensors, designing intuitive human-robot interfaces.
3. Safety Interlocks and Emergency Stop. Incorporating interlock systems and emergency stop mechanisms to ensure rapid shutdown in hazardous situations. Challenges: delayed response time, balancing safety with operational efficiency. Control strategies and communication techniques: introducing redundant safety systems, integrating predictive emergency stop algorithms.
4. Real-Time Monitoring and Feedback. Employing sensors and monitoring systems to provide real-time feedback on environmental changes and hazards (a minimal anomaly-detection sketch follows the table). Challenges: data overload, ensuring accuracy and reliability of sensor data. Control strategies and communication techniques: utilizing AI-driven anomaly detection, visual and auditory feedback mechanisms.
5. Training and Education. Providing comprehensive training programs to both humans and robots for safe and effective collaboration. Challenges: knowledge transfer, ensuring training relevance in dynamic environments. Control strategies and communication techniques: simulated training scenarios, interactive learning platforms, continuous skills development.
Table 5: Exploring Robotics: Studies in Robot Models, Sensor Technologies, and Human-Robot Interaction
1. Semeraro et al. 2023 [112]. Robot model: not specified. Sensor technology: Microsoft Kinect. AI/algorithm: Proximity Query Package. Challenges: environmental noise affecting sensor accuracy. Developed controls for cyber-physical systems in collaborative environments.
2. Robinson et al. 2023 [113]. Robot model: UR-5/UR3. Sensor technology: Microsoft Kinect. AI/algorithm: Robotiq. Challenges: data interoperability and integration. Proposed a framework for digital twin utilization in assembly workstations.
3. Li et al. 2023 [114]. Robot model: UR-5. Sensor technology: Microsoft Kinect and photosensors. AI/algorithm: Amazon Sumerian. Challenges: real-time synchronization of virtual and physical assembly. Introduced virtual assembly workstations with real-time feedback.
4. Zhang et al. 2023 [115]. Robot model: ABB IRB 2600. Sensor technology: infrared sensors. AI/algorithm: Savitzky–Golay filtering method. Challenges: noise filtering in dynamic industrial environments. Improved human-robot interaction accuracy with sensor data filtering (a minimal smoothing sketch follows the table).
5. Lee et al. 2023 [116]. Robot models: SCARA Adept 604-S, KUKA KR30-3, and Nachi MZ07. Sensor technology and AI/algorithm: not specified. Challenges: interoperability between diverse robot platforms. Addressed compatibility issues in multi-robot environments.
6. Olugbade et al. 2023 [117]. Robot model: COMAU AURA. Sensor technology: Kinect Azure. AI/algorithm: convolutional neural networks (actions perception module). Challenges: real-time data processing for a seamless VR experience. Implemented VR technologies for enhanced task perception.
7. Søraa et al. 2023 [118]. Robot model: not specified. Sensor technology: LiDAR. AI/algorithm: collision detection. Challenges: real-time collision detection in dynamic environments. Developed digital twins for safety-critical applications.
8. Mehak et al. 2024 [119]. Robot model: not specified. Sensor technology: OptiTrack. AI/algorithm: motion estimation. Challenges: real-time motion tracking in complex work environments. Analyzed ergonomic factors and motion in collaborative tasks.
9. Su et al. 2023 [120]. Robot model: KUKA IIWA. Sensor technology: Microsoft HoloLens 2. AI/algorithm: deep convolutional neural network. Challenges: real-time AR visualization in dynamic workspaces. Integrated AR technologies for improved collaboration.
10. Gongor and Tutsoy 2024 [121]. Robot model: not specified. Sensor technology: Microsoft Kinect, LiDAR, and light barrier. AI/algorithm, challenges, and description: not specified.
11. Qi et al. 2024 [122]. Robot model: UR-10. Sensor technology: Microsoft HoloLens 2. AI/algorithm, challenges, and description: not specified.
12. Wang et al. 2024 [123]. Robot model: not specified. Sensor technology: head-mounted display. AI/algorithm, challenges, and description: not specified.
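Zhang et al. [115] report using Savitzky–Golay filtering to suppress noise in infrared sensor data. The sketch below demonstrates the general technique on a synthetic distance signal using SciPy's savgol_filter; the signal, sampling rate, and filter parameters are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch of Savitzky-Golay smoothing of a noisy distance signal,
# in the spirit of the filtering step reported by Zhang et al. [115].
# The signal here is synthetic; this is not the authors' implementation.
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(42)
t = np.linspace(0.0, 5.0, 500)                            # 5 s at 100 Hz
true_distance = 1.5 + 0.5 * np.sin(2 * np.pi * 0.4 * t)   # slow human motion
noisy = true_distance + rng.normal(0.0, 0.05, t.shape)    # sensor noise

# window_length must be odd and larger than polyorder; a cubic fit over
# ~0.2 s of samples preserves motion while suppressing high-frequency noise.
smoothed = savgol_filter(noisy, window_length=21, polyorder=3)

rmse_raw = np.sqrt(np.mean((noisy - true_distance) ** 2))
rmse_filtered = np.sqrt(np.mean((smoothed - true_distance) ** 2))
print(f"RMSE raw: {rmse_raw:.4f} m, filtered: {rmse_filtered:.4f} m")
```

A Savitzky–Golay filter fits a low-order polynomial in a sliding window, which is why it preserves the shape of slow human motion better than a plain moving average of the same width.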
Table 6: Insights into Digital Twins: Sector-Specific Applications, Challenges, and Simulation Parameters
1. Sharma and Gupta (2024) [127]. Sector: manufacturing. Operation type: gesture recognition. Simulation insights: controlled simulation environment for modeling human gestures. Challenges: data interoperability, real-time processing.
2. Arisekola and Madson (2023) [128]. Sector: industrial. Operation type: assembly. Simulation insights: controlled simulation environment for what-if analysis in assembly processes. Challenges: model accuracy, integration with existing systems.
3. Jørgensen et al. [129]. Sector: assembly line. Operation type: layout design. Simulation insights: controlled simulation environment for workstation layout design in assembly lines. Challenges: real-time monitoring, predictive maintenance.
4. Masi et al. [130]. Sector: manufacturing. Operation type: safety assessment. Simulation insights: controlled simulation environment for evaluating workplace safety in manufacturing settings. Challenges: data integration, IoT integration.
5. Soori et al. [131]. Sector: automation. Operation type: robot programming. Simulation insights: controlled simulation environment for programming robots in automated systems. Challenges: model accuracy, real-time processing.
6. Martinetti et al. [132]. Sector: production. Operation type: ergonomic assessment. Simulation insights: realistic simulation environment for assessing ergonomic factors in production settings. Challenges: data interoperability, real-time monitoring.
7. Cole (2023) [133]. Sector: manufacturing. Operation type: safety assessment. Simulation insights: controlled simulation environment for analyzing workplace safety in manufacturing environments. Challenges: model accuracy, integration with existing systems.
8. Zhao et al. [134]. Sector: industrial. Operation type: ergonomic assessment. Simulation insights: controlled simulation environment for evaluating ergonomic aspects of industrial workstations. Challenges: real-time monitoring, data integration.
9. Mallioris et al. [135]. Sector: industrial. Operation type: communication enhancement. Simulation insights: controlled simulation environment for enhancing communication between human operators and robots. Challenges: real-time processing, integration with existing systems.
10. Teicher et al. [136]. Sector: production. Operation type: safety assessment. Simulation insights: controlled simulation environment for assessing workplace safety in production facilities. Challenges: data interoperability, model accuracy.
11. Tao et al. [137]. Sector: manufacturing. Operation type: task planning. Simulation insights: controlled simulation environment for optimizing workflow and task planning in manufacturing. Challenges: real-time monitoring, data integration.
12. Belcher and Abraham (2023) [138]. Sector: automation. Operation type: safety assessment. Simulation insights: controlled simulation environment for evaluating workplace safety in automated systems. Challenges: data integration, IoT integration.
13. Bouabid et al. [139]. Sector: food processing. Operation type: task allocation. Simulation insights: controlled simulation environment for optimizing task allocation in food processing facilities. Challenges: model accuracy, real-time processing.
14. Kim et al. [140]. Sector: industrial. Operation type: path movement. Simulation insights: controlled simulation environment for planning robotic path movements in industrial settings. Challenges: data interoperability, real-time monitoring.
15. Kiourtis et al. [141]. Sector: industrial. Operation type: cognitive ergonomic assessment. Simulation insights: controlled simulation environment for assessing cognitive ergonomic factors in industrial workspaces. Challenges: integration with existing systems, real-time processing.
16. Harrer et al. [142]. Sector: manufacturing. Operation type: safety assessment. Simulation insights: controlled simulation environment for analyzing workplace safety in manufacturing environments. Challenges: data integration, model accuracy.
17. Ma et al. [143]. Sector: automation. Operation type: task allocation. Simulation insights: controlled simulation environment for optimizing task allocation in automated systems. Challenges: real-time monitoring, data interoperability.
18. Mouta et al. [144]. Sector: automotive. Operation type: safety assessment. Simulation insights: controlled simulation environment for evaluating vehicle safety in automotive manufacturing. Challenges: real-time processing, integration with existing systems.
19. Jamshad et al. [124]. Sector: industrial. Operation type: safety assessment. Simulation insights: controlled simulation environment for assessing workplace safety in industrial settings. Challenges: data integration, model accuracy.
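A pattern recurring across the digital-twin studies in Table 6 is a synchronization loop: the twin advances its internal model with each commanded motion, compares its state against physical measurements, and re-calibrates when the two diverge. The following is a minimal single-joint sketch of that loop; every name is hypothetical and the friction model is invented for illustration, whereas real deployments synchronize full kinematic and sensor state over industrial middleware.

```python
# Minimal sketch of the state-mirroring loop common to the digital-twin
# studies in Table 6: physical measurements update a twin model, and a
# divergence check triggers re-calibration. All names are hypothetical.
from dataclasses import dataclass

@dataclass
class TwinState:
    joint_angle: float  # degrees, single-joint toy model

class DigitalTwin:
    def __init__(self):
        self.state = TwinState(joint_angle=0.0)

    def predict(self, commanded_delta):
        # The twin advances using the commanded motion (its internal model).
        self.state.joint_angle += commanded_delta

    def divergence(self, measured_angle):
        return abs(self.state.joint_angle - measured_angle)

    def recalibrate(self, measured_angle):
        # Snap the twin back to the physical measurement.
        self.state.joint_angle = measured_angle

if __name__ == "__main__":
    twin = DigitalTwin()
    # Simulated physical joint that slightly under-travels each command.
    physical_angle, TOLERANCE = 0.0, 2.0
    for step in range(10):
        command = 5.0
        physical_angle += command * 0.92  # unmodeled friction loss
        twin.predict(command)
        if twin.divergence(physical_angle) > TOLERANCE:
            print(f"step {step}: divergence "
                  f"{twin.divergence(physical_angle):.2f} deg, recalibrating")
            twin.recalibrate(physical_angle)
```

The divergence threshold plays the same role as the "model accuracy" and "real-time monitoring" challenges the table reports: too tight and the twin re-calibrates constantly, too loose and its predictions drift away from the physical cell.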