Abstract
Human detection technology in industrial manufacturing for Human Robot Collaboration
(HRC) is pivotal for safety by preventing accidents, optimizing efficiency, and ensuring
compliance with regulations. It fosters worker confidence, cost savings, and technological
progress, making it a cornerstone of modern industrial practices. Current shortcomings include accuracy and reliability issues, false alarms, adaptability challenges, and cost barriers, which must be addressed to enhance safety and efficiency. This paper
provides a detailed examination of the integration of human operators into the framework
of Industry 4.0, where advanced automated manufacturing systems redefine their roles. It
conducts a comprehensive analysis of HRC and Human-Robot Interaction (HRI) within the Industry 4.0
context, highlighting the pivotal significance of 3D spatial awareness for fostering efficient
and secure collaboration between humans and robots in shared workspaces. The paper
extensively explores various technologies and methodologies for human detection, encom-
passing both established approaches and emerging techniques. It unveils the multifaceted
challenges and exciting opportunities entailed in integrating humans into industrial settings,
offering insights for future research and application. In particular, it focuses on the poten-
tial of emerging technologies to reshape the landscape of human integration in advanced
robotics, ultimately contributing to the creation of safer and more efficient manufacturing
environments. This review aims to support the robotics community in advancing human
detection technology in industrial manufacturing for enhanced human-robot collaboration,
discuss identified literature gaps, and suggest future research directions in this area.
Keywords: Human Robot Collaboration (HRC), Industry 4.0 Integration, 3D Spatial
Awareness, Human Detection Technologies, Safety in Manufacturing
Leveraging smart manufacturing sensing technology, humans and industrial robots can now
safely share the same workspace, working alongside each other [10]. To enable efficient and
safe HRC systems, there is a pressing need for these systems to locally detect the presence of
humans within their vicinity. These human detection systems for HRC can range from simple
object detection using 2D Lidar to more advanced techniques encompassing human detection and pose
detection using RGB cameras [11]. Emerging technologies, including RGB-D sensors and
wearables, have also played pivotal roles in enhancing human detection capabilities [13].
As robotic systems continue to advance, and with the introduction of mobile robots
(mobots), a growing need arises to understand how humans can be seamlessly integrated
into the manufacturing process on a larger, factory-wide scale. The review contributes to
Industry 4.0 by addressing the critical challenge of redefining human roles in a cyber-physical
manufacturing landscape, emphasizing the importance of seamless HRC for efficiency and
safety. It explores the role of advanced human detection systems, such as RGB-D sensors
and wearables, in enabling HRC, facilitating the integration of humans into manufacturing
processes on a larger, factory-wide scale. This review is essential as it guides the industry
in leveraging emerging technologies to optimize operations, foster connectivity, and adapt
to changing market demands within the fourth industrial revolution. The paper makes
three significant contributions within the context of the manufacturing evolution and the
challenges posed by Industry 4.0:
1. Comprehensive Exploration of Industry 4.0 Integration: The paper exten-
sively explores human integration in Industry 4.0, providing a detailed overview of the core
principles and technologies in smart manufacturing. It dissects the associated challenges and
opportunities, offering valuable insights for researchers, policymakers, and industry profes-
sionals navigating this transformative era.
2. In-Depth Analysis of HRC: The paper’s central contribution lies in its comprehen-
sive analysis of HRC and HRI within Industry 4.0, emphasizing the vital role of 3D spatial
awareness for safe human-robot collaboration. It explores diverse human detection technolo-
gies and safety measures, offering a practical roadmap for researchers and practitioners in
advanced manufacturing, enhancing the efficiency and safety of human-robot partnerships.
3. Insights for Future Research and Application: The paper goes beyond summa-
rizing existing knowledge by highlighting future research and application areas. It explores
emerging technologies like RGB-D sensors, wearables, and mobile robots, providing forward-
looking insights that guide scholars and industry leaders to remain innovative in the evolving
field of human integration in advanced robotics.
This comprehensive review article aims to delve into the intricate facets of integrating hu-
mans into industrial manufacturing environments, particularly in the context of Industry 4.0.
In Section 2, we delve into the details of the search methodology employed in this study.
Section 3 is dedicated to an in-depth exploration of the fundamental principles that underlie
human-robot collaboration. Section 4 critically examines the significance and multifaceted
challenges of involving humans in the evolving landscape of Industry 4.0. Section 5 takes
a deep dive into the technology and methodologies employed for human detection within
the context of smart manufacturing. Section 6 focuses on the concept of Human-centric
Manufacturing, examining its implications, applications, and its role in shaping the future
of industry. Section 7 engages in an extensive discussion of existing literature, highlighting
both the formidable challenges and exciting opportunities presented by the inclusion of hu-
mans in industrial settings. Finally, Section 8 presents a concluding discussion, and Section 9
summarizes the critical insights gained from this dynamic and ever-evolving field of research.
2. Search Methodology
In conducting our review titled "Human in the Loop: A Review of Human Detection
Technology in Industrial Manufacturing Environments for Human Robot Collaboration,"
we employed a rigorous search methodology to gather relevant literature and insights. The
review aimed to comprehensively assess the state of human detection technology in the
context of industrial manufacturing and its role in facilitating Human Robot Collaboration
(HRC). To commence, we conducted a thorough search of prominent web-based databases, such
as Google Scholar, Scopus, and Web of Science. This search spanned the timeframe from
January 2009 to September 2020. We meticulously selected search terms aligned with the
scope of our review, emphasizing the crucial elements of our investigation. These keywords
included "Human Detection Technology," "Industrial Manufacturing," "Human Robot Col-
laboration," and "Human-Robot Interaction in Manufacturing."
We maintained a focus on relevance, excluding literature that did not align with the
scope of our review. This involved excluding papers related to HRI in industrial processes,
economic evaluations or feasibility studies of disassembly lines, process optimization of disas-
sembly line designs, and the design of tools for HRC applications. Additionally, we excluded
papers that did not meet the criteria for relevance or did not contribute substantially to
the understanding of current human detection technology and its applications in industrial
manufacturing for HRC. Our meticulous screening process resulted in identifying nine publi-
cations that were deemed pertinent to the field of human-robot collaboration in disassembly.
By employing this robust search methodology, our review aims to provide a comprehensive
overview and analysis of autonomous robotic disassembly systems and robotic solutions
used in industrial disassembly tasks. We seek to shed light on the pivotal role of human
detection technology in facilitating HRC within the context of Industry 4.0 and address
the challenges and advancements in this critical area of manufacturing. Figure 1 outlines
the literature review selection process. It follows the PRISMA methodology, systematically
gathering relevant literature from databases like Google Scholar, Scopus, and Web of Science.
Search terms focused on "Human Detection Technology," "Industrial Manufacturing," and
"Human-Robot Collaboration." A rigorous screening process ensured the inclusion of perti-
nent articles, forming the basis for subsequent analysis. This process provides a transparent
and structured approach to understanding human detection technology’s role in industrial
manufacturing for Human-Robot Collaboration. Table 1 presents a summary of the review
methodology and findings focused on human detection technology in industrial manufac-
turing for human-robot collaboration. It outlines the systematic approach employed in the
review process, including the selection of databases, search terms, and inclusion criteria.
Figure 1: Literature Review Selection Process
Additionally, the table provides key findings and insights gleaned from the review, shedding
light on the state of human detection technology, its role in facilitating human-robot collabo-
ration, and the challenges and advancements observed in this critical area of manufacturing.
4. Robot Safety
The forms of collision that can occur between robots and humans in industrial settings, along with their corresponding critical contact force values, vary with the specific circumstances and applicable safety standards. Some general categories of collisions are:
1. Unconstrained Impacts: These are unintended collisions between a robot and a
human that occur without any prior safety measures. The critical contact force value for
such impacts may depend on factors like the robot’s speed and the specific application.
2. Clamping in the Robot Structure: This involves a human getting caught or clamped
within the moving parts or structure of a robot. The critical contact force value depends on
the design and force-limiting features of the robot.
3. Constrained Impacts: These are collisions that occur when a robot and a human are
interacting within a predefined workspace. The critical contact force value may be defined
based on safety standards to ensure human safety during such interactions.
4. Partially Constrained Impacts: These are situations where there is limited con-
straint on the motion of the robot, but contact with a human may still occur [27][28]. The
critical contact force value would depend on the level of constraint and the potential risks
involved.
Figure 2: A visual representation of the comprehensive architecture, illustrating the distinct modules and
the flow of data. Derived from [26]
5. Secondary Impacts: These refer to additional impacts that may occur as a result
of the initial collision between a robot and a human. The critical contact force values for
secondary impacts would be considered in the context of overall safety.
Various International Organization for Standardization (ISO) standards relate to robotics and the safety requirements associated with them. Each standard serves a specific purpose in defining terminology, safety requirements, or guidelines for the design, integration, and use of robots and robotic devices. Here's a brief explanation of each:
1. ISO 8373: Vocabulary for robots and robotic devices: This standard provides
a comprehensive vocabulary and terminology related to robots and robotic devices, helping
to establish a common language for the industry.
2. ISO 10218-1: Safety requirements for industrial robots - Part 1: Robots:
ISO 10218-1 sets safety requirements and guidelines for the design and operation of individual
industrial robots, focusing on their safety aspects.
3. ISO 10218-2: Safety requirements for industrial robots - Part 2: Robot
systems and integration: This standard extends the safety requirements outlined in ISO
10218-1 to cover the broader context of robot systems and their integration into industrial
processes.
4. ISO 15066: Robots and robotic devices - Collaborative robots: ISO 15066
addresses safety considerations specifically for collaborative robots (cobots), which are de-
signed to work safely alongside humans. It provides guidance on risk assessment, force and
pressure limits, and other safety-related aspects of cobots.
5. ISO 13849-1: Safety of machinery: This standard deals with the design principles
for safety-related parts of control systems in machinery, including robotic systems [29][30]. It
provides guidance on how to design control systems that ensure the safety of operators and
other personnel.
6. ISO 13849-2: Safety of machinery: Safety-related parts of control systems - Part
2: Validation: ISO 13849-2 complements Part 1 by addressing the validation and verification
processes for safety-related control systems. It ensures that the designed safety measures
are effective and reliable.
7. ISO 10270: Manipulating industrial robots: Similar to ISO 8373, ISO 10270
focuses on providing a specific vocabulary and terminology related to manipulating industrial
robots. It helps standardize terminology in this context.
8. ISO/TR 20218-1: Robots and robotic devices - Safety requirements for
industrial robots - Part 1: Criteria for the design of robot safety: This is a Technical
Report (TR) that offers criteria and guidelines for designing safe industrial robots. While
not a mandatory standard, it provides valuable insights for safety-conscious designers.
9. ISO/TR 20218-2: Robots and robotic devices - Safety requirements for
industrial robots - Part 2: Robot integration and use: Similar to Part 1, this
Technical Report focuses on robot integration and use, offering criteria and guidance for
safe integration and operation.
10. ISO/TS 15066: Robots and robotic devices - Collaborative robots - Safety:
ISO/TS 15066 is a Technical Specification (TS) that provides additional safety guidance
specifically for collaborative robots. It outlines safety measures and requirements for cobots
in more detail.
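To make the idea of force and pressure limiting concrete, the sketch below shows how a monitoring layer might compare a measured quasi-static contact force against per-body-region limits. This is a minimal illustration in Python; the region names and limit values are placeholders chosen for the example, not normative figures, which must be taken from ISO/TS 15066 itself.

```python
# A small, non-normative sketch of a power-and-force-limiting check in the
# spirit of ISO/TS 15066. The limit values below are placeholders for
# illustration only; real limits must be taken from the standard itself.

BODY_REGION_LIMIT_N = {       # hypothetical quasi-static force limits
    "hand": 140.0,
    "chest": 140.0,
    "face": 65.0,
}

def contact_permitted(region: str, measured_force_n: float) -> bool:
    """True if a measured quasi-static contact force stays under the
    configured limit for the given body region."""
    return measured_force_n <= BODY_REGION_LIMIT_N[region]

print(contact_permitted("hand", 120.0))   # True: under the placeholder limit
print(contact_permitted("face", 80.0))    # False: trigger a protective stop
```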
These ISO standards and technical documents play a crucial role in ensuring the safety,
interoperability, and standardization of robotics technology across various industries and
applications. They help manufacturers, integrators, and users of robotic systems adhere
to best practices and reduce the risk of accidents and injuries associated with robotic op-
erations. Figure 3 provides a regulatory perspective on the interplay between machinery,
robots/collaborative robots, and artificial intelligence (AI). The diagram visually illustrates
the complex relationship and interactions among these key elements within the regulatory
framework. Each component, including machinery, robots (both traditional and collabora-
Figure 3: The regulatory perspective on the interplay between machinery, robots/collaborative robots, and
AI. Derived from [31]
tive), and AI systems, is represented with distinct symbols or icons. Arrows and lines depict
the flow of regulatory requirements, standards, and guidelines between these components,
highlighting how regulations apply to each element individually and their interconnected-
ness. This visual representation aids in understanding the regulatory landscape governing
the integration of machinery, robots, and AI in various applications. It helps stakeholders,
including policymakers, manufacturers, and researchers, navigate the regulatory environ-
ment and ensure compliance with relevant standards and regulations in the development
and deployment of advanced technological systems. Table 3 offers a comprehensive overview
of robot safety, detailing various aspects, hazards, and precautionary measures associated
with robotics applications. It provides a structured framework for understanding the safety
considerations inherent in working with robots across different industries and contexts. The
table delineates potential hazards posed by robots and machinery, along with corresponding
precautionary measures to mitigate risks and ensure safe operations. By organizing this
information in a clear and systematic manner, Table 3 serves as a valuable reference for
practitioners, researchers, and policymakers involved in robotics safety and risk manage-
ment.
4.1. Enhancing Safety and Intuitive Interaction: Control Strategies and Human-Robot Com-
munication Techniques in Collaborative Workspaces
To ensure safe and intuitive interaction between human workers and robots in various
industrial and collaborative settings, it’s essential to implement effective control strategies
and human-robot communication techniques. The following key strategies and techniques contribute to achieving this goal:
1. Risk Assessment and Safety Measures: Conduct a thorough risk assessment
to identify potential hazards and risks associated with robot-human interactions [31][32].
Implement safety measures such as physical barriers, safety sensors, emergency stop buttons,
and safety certifications to minimize risks.
2. Collaborative Robotics (Cobots): Use collaborative robot designs that are inher-
ently safe for human interaction, featuring softer materials, rounded edges, and lightweight
construction. Implement force and torque sensors to enable robots to detect and respond to
unexpected contact with humans.
3. Programming and Control: Utilize intuitive programming interfaces and control
systems that enable non-expert users to teach robots tasks easily. Implement motion plan-
ning algorithms that consider human presence and safety constraints to ensure smooth and
safe motion.
4. Sensing and Perception: Equip robots with advanced sensors such as 3D cameras,
LIDAR, and tactile sensors to detect and recognize humans and objects in their environ-
ment. Implement computer vision and machine learning algorithms to enhance perception
capabilities, allowing robots to understand human gestures, poses, and intentions.
5. Collision Avoidance: Develop collision avoidance algorithms that enable robots to dynamically adjust their paths and speed to avoid collisions with humans and obstacles [32][33][34]. Use sensors and software to create safety zones around robots, triggering slowdowns or stops when humans enter these zones (a minimal sketch of such zoning logic appears after this list).
6. Natural Language Interfaces: Implement natural language processing (NLP) and
speech recognition technologies to allow humans to give voice commands or interact with
robots through speech. Enable robots to respond to verbal cues, questions, and instructions
in a user-friendly manner.
7. Haptic Feedback: Integrate haptic feedback mechanisms into robots to provide
tactile feedback to human operators, allowing them to sense forces, pressures, or vibrations
during interaction. This can improve teleoperation and enhance the operator’s sense of
touch.
8. Augmented Reality (AR) and Virtual Reality (VR): Use AR and VR interfaces
to provide real-time information to human workers, such as overlaying instructions, safety
alerts, and visualizations on their field of view. These technologies can enhance situational
awareness and task guidance.
9. Training and Education: Provide comprehensive training for human workers on
how to safely and effectively interact with robots. Include simulation-based training and
realistic scenarios to prepare workers for real-world robot collaborations.
10. Continuous Monitoring and Adaptation: Implement systems that continuously
monitor the robot’s environment, human behavior, and task progress. Enable robots to
adapt their behavior in response to changing conditions and unexpected events.
11. User Feedback and Iteration: Encourage feedback from human workers and
incorporate their input into the design and operation of robots. Continuously iterate and
improve both the hardware and software based on user experiences and suggestions.
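As a concrete illustration of the safety zoning mentioned in the collision-avoidance strategy above, the following minimal Python sketch maps the distance to the closest detected human onto a robot speed scaling factor. The zone radii and the linear ramp are assumptions for illustration; a deployed system would derive them from a risk assessment.

```python
# Minimal sketch of distance-based safety zoning for a collaborative cell.
# Thresholds are illustrative placeholders, not values from any standard.

STOP_ZONE_M = 0.5   # protective stop if a human is closer than this
SLOW_ZONE_M = 1.5   # reduced speed inside this radius

def speed_scale(human_distance_m: float) -> float:
    """Return a robot speed scaling factor in [0, 1] from the
    closest detected human distance (metres)."""
    if human_distance_m < STOP_ZONE_M:
        return 0.0                      # protective stop
    if human_distance_m < SLOW_ZONE_M:
        # linear ramp from 0 at the stop boundary to 1 at the slow boundary
        return (human_distance_m - STOP_ZONE_M) / (SLOW_ZONE_M - STOP_ZONE_M)
    return 1.0                          # full programmed speed

if __name__ == "__main__":
    for d in (0.3, 0.8, 1.2, 2.0):
        print(f"human at {d:.1f} m -> speed scale {speed_scale(d):.2f}")
```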
Figure 4: The levels of human-robot interaction, detailing the different stages or degrees of interaction between humans and robots, from minimal or passive engagement to high levels of collaboration and cooperation, together with the corresponding implications for technology, safety, and productivity. Derived from [35]
Figure 5 outlines the four collaboration modes defined by robot safety standards ISO
10218-1/2 and ISO/TS 15066. These modes dictate how humans and robots interact in
industrial settings to ensure safety and efficiency. These pre-collision and post-collision
control schemes are essential components of ensuring safe and effective human-robot collab-
oration. An effective control strategy often combines elements of both approaches to provide
a comprehensive safety framework. The specific choice of control scheme depends on the
application, the level of human-robot interaction required, and the safety standards and
regulations governing the system. Table 5 delves into the exploration of robotics, encom-
passing studies in robot models, sensor technologies, and human-robot interaction (HRI). It
offers a comprehensive overview of research endeavors aimed at advancing various aspects of
robotics technology. The table highlights developments in robot design, sensor integration,
and HRI methodologies, providing insights into emerging trends and innovations in the field.
By organizing this information in a structured format, Table 5 serves as a valuable reference
for researchers, engineers, and enthusiasts interested in the multifaceted domain of robotics,
fostering collaboration and knowledge exchange in the pursuit of technological advancement.
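The speed and separation monitoring mode referenced above is commonly formalised as a protective separation distance that the system must maintain. The sketch below implements one widely cited form of this calculation, combining human motion during the reaction and stopping interval, robot motion during the reaction time, and a constant-deceleration braking distance; the numeric inputs in the example are assumptions, not values from the standards.

```python
# A sketch of the speed-and-separation-monitoring (SSM) protective
# distance along the lines of ISO/TS 15066; parameter values below are
# illustrative assumptions, not normative figures.

def protective_separation(v_human, v_robot, t_react, t_stop,
                          decel, intrusion=0.0, z_human=0.0, z_robot=0.0):
    """Protective separation distance S_p (metres).

    v_human : human approach speed (m/s)
    v_robot : robot speed towards the human (m/s)
    t_react : system reaction time (s)
    t_stop  : robot stopping time (s)
    decel   : assumed constant robot deceleration (m/s^2)
    intrusion, z_human, z_robot : intrusion distance and position
        uncertainties for human and robot (m)
    """
    s_human = v_human * (t_react + t_stop)   # human travel until robot halts
    s_react = v_robot * t_react              # robot travel during reaction
    s_brake = v_robot ** 2 / (2.0 * decel)   # braking distance (constant decel)
    return s_human + s_react + s_brake + intrusion + z_human + z_robot

# Example with assumed values: walking human (1.6 m/s), robot at 1.0 m/s.
print(f"S_p = {protective_separation(1.6, 1.0, 0.1, 0.3, 2.0, 0.1, 0.1, 0.05):.2f} m")
```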
Effective communication requires that the parties involved arrive at a shared conviction, a concept known as grounding. Grice has previously noted that grounding is achieved when individuals efficiently convey information without undue effort [47]. Past research has demonstrated the benefits of establishing grounding through verbal
communication, even when combined with other forms of feedback. For instance, Wang and
colleagues [48] found that the effectiveness of haptic communication improved after dyads
had an initial learning phase involving verbal communication to acquaint themselves with
the task. In more complex tasks, studies like that of Parikh and team [49] have suggested
that verbal feedback, when combined with haptic feedback, can significantly enhance team
performance compared to relying solely on haptic feedback.
One study proposed a method that provides tactile feedback directly to the operators' fingertips to monitor their operational awareness; the authors employed a Bayesian recursive classifier to estimate human intention while using a wearable vibrotactile ring to convey information about different stages of Human-Robot Collaboration (HRC) [55]. Salvietti et al. [56] investigated a bilateral
haptic interface, combining a soft gripper with a wearable remote ring interface to enhance
collaboration effectiveness. Bergner et al. [57] introduced an innovative interface featuring
distributed cells acting as a large-scale "skin" around robot manipulators. This interface
calculates joint torque in contact points, enabling more intuitive human-robot communica-
tion through touch alone [58]. Similarly, Tang et al. [59] developed a novel signaling system
using robot light skin, which significantly improved user reaction times and reduced operator
mental workload during the execution of simple industrial tasks, resulting in fewer errors.
Human-Robot Communication in Virtual, Augmented, and Mixed Realities:
Efficient collaboration between humans and robots necessitates that robots exhibit behav-
iors that are sensitive to human presence and engage in effective communication throughout
the entire process. This enables the establishment of a common ground where shared be-
liefs, desires, and intentions can be understood by all parties involved. Nonetheless, the
present state of natural language processing significantly constrains the breadth of these
interactions. The limitations arise from disparities in how humans and robots represent and
convey information, which, in turn, hinder the robot's ability to interpret human intentions and to express its own intentions meaningfully. These communication barriers have been a
central focus of previous research in human-robot interaction.
The importance of addressing these barriers becomes especially evident in collaborative
scenarios where mixed human-robot teams work together for reasons related to safety, ef-
ficiency, and ease of use. This perspective is underscored in the U.S. Robotics Roadmap,
which underscores the need for humans to comprehend and discern robot activities to inter-
May 3, 2024
pret the robot’s comprehension of the situation. Recent research in the field of human-robot
interaction has attempted to tackle this challenge through various means. These approaches
include generating motion that conveys intent, creating understandable task plans, using
natural language to articulate intentions, and employing explicit signaling mechanisms. Re-
cent advancements in mixed, augmented, and virtual reality offer an alternative avenue for
facilitating human-robot interactions by utilizing the shared physical space as a canvas for
visual cues to communicate intent. While preliminary work in this domain dates back sev-
eral years (e.g., Interactive Hand Pointer for robot control in a human workspace), recent
technological progress in mixed and virtual reality systems has significantly improved the
viability of this approach. Recent systems have been developed to visualize the paths of
mobile wheelchairs and robots, with results indicating that humans prefer to engage with
a robot when it directly presents its intentions through visual cues. However, most recent
research has primarily introduced passive, non-interactive systems with limited scopes, and
they have not embraced a genuine mixed reality perspective by accounting for the chang-
ing environmental context while projecting information. The recent strides in augmented
and virtual reality have expanded the horizons of possibilities within these environments,
enabling researchers to effectively incorporate augmented reality into human-robot interac-
tions, thus offering fresh avenues for exploration and development in this field.
Figure 8: Production facility equipped with Cyber-Physical Systems (CPS). Derived from [15]
In the context of CPPSs, the intelligent control of these production systems is made
possible through embedded networked systems and the Internet of Things (IoT). These
components facilitate the seamless exchange of data and communication between various
elements of the manufacturing environment, from machines and sensors to software and hu-
mans. This interconnectedness empowers CPPSs to adapt to changing conditions, optimize
processes, and enhance overall manufacturing efficiency [15]. The synergy between CPSs
and CPPSs represents a significant step towards achieving the goals of Industry 4.0, where
digitalization and intelligent control converge to create agile and responsive manufacturing
ecosystems. This integration not only improves operational efficiency but also paves the
way for innovative business models, ultimately driving the evolution of modern manufactur-
ing. Figure 8 depicts a production facility enhanced with Cyber-Physical Systems (CPS),
integrating physical processes with computational capabilities. CPS enables real-time data
collection, analysis, and automation, optimizing manufacturing operations. Key features in-
clude sensor networks for data gathering, analytics for decision-making, and connectivity for
seamless integration. Automation streamlines workflows, while human-robot collaboration
ensures efficiency. Remote monitoring allows oversight from anywhere, supporting adaptive
manufacturing to respond to changing demands. This visualization underscores CPS’s role
in enhancing efficiency, flexibility, and competitiveness in modern manufacturing.
This connectivity extends to human workers, who can benefit from wearable devices equipped with sensors to
monitor their health, location, and safety within the manufacturing environment. Figure 9
delineates the instrumental role of IoT technology in revolutionizing Smart Manufacturing.
It showcases how IoT facilitates data-driven processes, predictive maintenance, and overall
operational excellence, ushering in a new era of efficient and interconnected manufacturing
systems. Through IoT integration, real-time data collection, and analytics, manufacturers
optimize operations, enhance product quality, and achieve greater efficiency. This visu-
alization underscores IoT’s transformative impact on manufacturing, driving productivity,
sustainability, and competitiveness in Industry 4.0.
AI-driven analytics can forecast equipment failures, maintenance requirements, and production bottlenecks, reducing downtime and operational disruptions. These technologies continuously adapt and optimize manufacturing processes, enhance customization, and promote safer human-machine collaboration, thus driving the transformation of Industry 4.0 towards a more efficient and innovative future of manufacturing.
Intelligent Mobile Robots (IMRs) navigate dynamic factory environments and can perform a wider range of tasks, including inspection, inventory management, and even collaborating with human workers. Autonomous Mobile Robots (AMRs) take mobile
robotics a step further by integrating advanced AI and machine learning capabilities. These
robots can learn from their interactions with the environment and continuously improve
their performance. AMRs are highly adaptable and can be easily reconfigured for various
tasks, making them valuable assets in agile manufacturing setups.
However, with the rise of mobile robots in manufacturing, new challenges and issues arise
in the context of human-robot collaboration. Ensuring the safety of human workers sharing
the workspace with mobile robots is paramount. Effective communication between humans
and robots, as well as the development of standardized safety protocols, becomes crucial to
prevent accidents and ensure smooth collaboration. Additionally, addressing concerns about
privacy and data security in connected manufacturing environments is an ongoing challenge.
As the use of mobile robots continues to expand in Industry 4.0, addressing these challenges
and leveraging the full potential of IMRs and AMRs is essential for realizing the promise of
efficient and collaborative manufacturing processes.
6. Human-Centric Manufacturing
Human-centric manufacturing, as manifested in the context of Industry 5.0, is a paradigm
shift in industrial production that places humans at the center of manufacturing processes.
Unlike previous industrial revolutions, such as Industry 4.0, which emphasized automation
and the reduction of human involvement, Industry 5.0 acknowledges the unique capabilities
and creativity of human workers and seeks to leverage their skills in conjunction with ad-
vanced technologies. In the Industry 5.0 framework, the human worker is not replaced by
machines but is considered an integral part of the production process. This approach aims
to address several needs and challenges:
1. Social Impact: Industry 5.0 recognizes the social impact of automation and ad-
vanced technologies. It acknowledges the importance of providing meaningful employment
for humans and ensuring that technology benefits society as a whole. [72][73] This alignment
with societal values is crucial for maintaining a harmonious relationship between technology
and human well-being.
2. Sustainability: The sustainability of manufacturing processes is a primary concern
in Industry 5.0. By reintegrating human workers into production, it aims to create more
sustainable and environmentally friendly manufacturing practices. This includes reducing
waste, optimizing resource usage, and aligning production with eco-friendly principles.
3. Operator 4.0: The concept of Operator 4.0 is closely related to Industry 5.0. Opera-
tor 4.0 refers to a highly skilled, technologically empowered human operator who collaborates
with advanced machines and systems. It recognizes that the role of human workers is evolv-
ing from manual labor to skilled oversight and management of complex automated processes [74][75].
In essence, Industry 5.0 acknowledges that while automation and advanced technologies
are transformative and beneficial, they should not lead to the complete exclusion of humans
from the manufacturing process. Instead, it envisions a future where humans and machines
work in synergy, with each contributing their unique strengths. This approach not only
ensures the well-being of the workforce but also enhances overall manufacturing system
performance, making it more adaptable, sustainable, and aligned with societal values. It
represents a techno-social revolution where technology serves the needs of society while
recognizing the value of human creativity and skills in manufacturing.
Industry 5.0 has been framed as the fifth industrial revolution [16], moving beyond Industry 4.0 towards a more sustainable manufacturing domain. According to Nahavandi et al. [76], Industry 5.0 will bring
back human workers to the factory floors, by pairing human and machine to leverage hu-
man brainpower and creativity to increase process efficiency by combining workflows and
intelligent systems. It will take the focus off automation, and instead use intelligent systems
to support the synergy between human and machine [3]. This will see a shift away from
CPPS, and more towards human-cyber-physical systems [16]. There are many definitions of Industry 5.0; several are outlined in Table 1 [17]. Bringing the human into the loop
provides more fault-tolerance capabilities than fully automated systems on their own [17].
Human-centric manufacturing systems also have the value of improving overall manufac-
turing system performance, while also improving human well-being [17]. Industry 5.0 is
co-existing with Industry 4.0, and can be considered a Techno-Social revolution, that is, a
technology-enabled revolution driven by societal needs [15].
enable seamless communication and data exchange [80][81].
Figure 12: Utilizing a digital twin within an HRC system, where a virtual replica of the physical environment
and robotic components is created, allowing for real-time monitoring, analysis, and simulation to enhance
human-robot interaction and system performance. Derived from [80]
3. Smart Cities: One of the prominent applications of Society 4.0 is the development of
smart cities. These urban environments leverage technology to optimize resource manage-
ment, transportation, energy consumption, and public services. Sensors and data analytics
play a central role in creating more efficient and sustainable cities.
4. Data-Driven Decision-Making: Society 4.0 relies on data analytics and artificial
intelligence to inform decision-making processes. This data-driven approach extends to
various sectors, including healthcare, education, transportation, and governance, to enhance
efficiency and effectiveness.
5. Personalization: Just as Industry 4.0 enables mass customization in manufacturing,
Society 4.0 emphasizes personalization in various services and experiences. Algorithms and
AI systems tailor content, recommendations, and services to individual preferences and
needs.
6. Human Augmentation: Society 4.0 explores technologies that enhance human
capabilities, such as wearable devices, augmented reality (AR), virtual reality (VR), and
brain-computer interfaces. These technologies can improve healthcare, education, and en-
tertainment. [81]
7. Challenges: While Society 4.0 offers numerous opportunities, it also poses challenges
related to privacy, cybersecurity, ethics, and social inequality. The collection and use of vast
amounts of data raise concerns about data protection and surveillance.
8. Global Impact: Society 4.0 transcends national borders and has a global impact. It
promotes international collaboration and knowledge sharing, as solutions to complex global
challenges often require collective efforts.
9. Sustainability: Sustainability is a key consideration in Society 4.0, with an emphasis
on eco-friendly technologies and practices. Efforts are made to reduce environmental impact
and address climate change through digital solutions.
10. Inclusive Growth: A central goal of Society 4.0 is to achieve inclusive growth and
address societal disparities. Initiatives focus on bridging the digital divide, ensuring that
technology benefits all segments of the population.
Figure 12 demonstrates the utilization of a digital twin within a Human-Robot Col-
laboration (HRC) system. In this depiction, a virtual replica of the physical environment
and robotic components is created, enabling real-time monitoring, analysis, and simulation.
The digital twin facilitates enhanced human-robot interaction and system performance by
providing a virtual platform for testing and optimization. Through the digital twin, stake-
holders can simulate various scenarios, assess potential risks, and implement optimizations
before deployment in the physical environment. This visualization underscores the trans-
formative potential of digital twins in improving the efficiency, safety, and effectiveness of
HRC systems in diverse industrial applications.
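As a minimal illustration of the digital-twin pattern described above, the following Python sketch mirrors the state of a hypothetical physical robot and answers a simple "what-if" query against the virtual replica. All class and field names are illustrative assumptions rather than an established API.

```python
# A minimal, hypothetical sketch of the digital-twin idea described above:
# a virtual replica that mirrors the state of a physical cell and can be
# queried or simulated ahead of the real system. All names are illustrative.

from dataclasses import dataclass, field

@dataclass
class RobotState:
    joint_positions: list          # radians, one entry per joint
    tcp_speed: float = 0.0         # tool-centre-point speed (m/s)

@dataclass
class DigitalTwin:
    state: RobotState = field(default_factory=lambda: RobotState([0.0] * 6))
    history: list = field(default_factory=list)

    def sync(self, measured: RobotState) -> None:
        """Update the virtual replica from sensor data and keep a log."""
        self.state = measured
        self.history.append(measured)

    def simulate_speed_after_scaling(self, scale: float) -> float:
        """'What-if' query: TCP speed if a safety scaling were applied."""
        return self.state.tcp_speed * scale

twin = DigitalTwin()
twin.sync(RobotState([0.1, -0.4, 0.9, 0.0, 1.2, 0.0], tcp_speed=0.8))
print(twin.simulate_speed_after_scaling(0.5))   # 0.4 m/s in the virtual cell
```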
Figure 13: The comprehensive exploration of the forms and evolutionary phases of a digital twin, encom-
passing its inception in the design phase, its role during manufacturing, its dynamic engagement in the
operational phase, and its continued significance in maintenance and end-of-life considerations. Derived
from [84]
5. Adaptability: Humans are inherently adaptable and can quickly respond to unex-
pected events and changes in production requirements. They can shift roles, reconfigure
processes, and troubleshoot problems on the fly, making them valuable assets in dynamic
manufacturing environments [85].
6. Collaboration: Humans collaborate with each other and with machines in man-
ufacturing settings. Human-machine collaboration (HMC) and Human-Robot Collabora-
tion (HRC) are emerging trends that leverage human capabilities alongside automation and
robotics. This collaborative approach aims to improve efficiency and safety in manufactur-
ing.
7. Problem Solving: Manufacturing often presents complex challenges that require
problem-solving skills. Humans analyze production issues, identify root causes, and imple-
ment solutions to enhance efficiency, reduce waste, and address bottlenecks.
8. Continuous Improvement: Humans are at the forefront of continuous improve-
ment efforts in manufacturing. They participate in lean manufacturing practices, Six Sigma
initiatives, and other methodologies to streamline processes, reduce defects, and optimize
resource utilization [86].
9. Innovation: Human creativity and innovation drive the development of new man-
ufacturing technologies and processes. Engineers, designers, and researchers contribute to
the invention of advanced manufacturing techniques, materials, and products.
While automation and Industry 4.0 technologies have introduced greater levels of au-
tomation and digitalization into manufacturing, humans remain integral to the process.
The challenge lies in finding the right balance between automation and human involvement,
with a focus on tasks that leverage human strengths, while automating repetitive, haz-
ardous, or data-intensive tasks. As manufacturing continues to evolve, the role of humans
will adapt and expand, and there will be a growing emphasis on human-centric manufactur-
ing approaches in Industry 5.0, where humans and machines work in symbiosis to achieve
sustainable and efficient production systems. Using both economic and psychological con-
ceptions of humans, Bitsch defines seven different levels where humans can be integrated
into CPPS [20], summarised in Table 1. By considering these levels, it is expected that
human-centric manufacturing systems will better meet the requirements of human-centric
CPPSs. Figure 13 offers a comprehensive exploration of the forms and evolutionary phases
of a digital twin. It spans its inception in the design phase, role in manufacturing, engage-
ment in operations, and significance in maintenance and end-of-life considerations. This
visualization highlights the dynamic nature of digital twins throughout the product lifecy-
cle, underscoring their transformative impact on product development, manufacturing, and
lifecycle management.
6.3. Ergonomics
When discussing the role of humans in the manufacturing process, it is essential to
pay careful attention to the concept of ergonomics. Ergonomics is a multidisciplinary field
that focuses on designing and arranging the elements of a system to optimize human well-
being, performance, and overall interaction with their work environment. In the context
of manufacturing, it plays a critical role in ensuring that human workers can perform their
tasks efficiently, comfortably, and without risking their health. Ergonomics has traditionally
been a driving force behind the design and organization of operator workstations within
the manufacturing industry. This is because optimizing the work environment for human
operators not only enhances their performance but also contributes to cost savings and
productivity improvements. Several key aspects of ergonomics come into play in this context:
1. Furniture Design: Elements such as the height of chairs, tables, and screens are
carefully considered to align with human physiology and anthropometric measurements. Er-
gonomically designed furniture ensures that workers can maintain a comfortable and healthy
posture while performing their tasks, reducing the risk of musculoskeletal disorders and dis-
comfort.
2. Efficiency and Productivity: Ergonomically designed workstations are not just
about comfort; they also lead to increased efficiency and productivity. When the workstation
layout and equipment are optimized for human use, workers can perform their tasks with
less fatigue and fewer errors. This, in turn, leads to higher quality output and reduced
downtime [87][88].
3. Safety: Ensuring that the work environment aligns with ergonomic principles is
crucial for the safety of workers. Inappropriate workstation design can lead to accidents and
injuries. Proper ergonomics reduces the risk of accidents and improves the overall safety of
the manufacturing process.
4. Well-being: Ergonomics goes beyond physical comfort; it also considers the psy-
chological well-being of workers. A well-designed workstation can lead to increased job
satisfaction and reduced stress. This, in turn, can have a positive impact on the morale and
motivation of the workforce.
5. Adaptability: Ergonomics is not a one-size-fits-all concept. It recognizes the diver-
sity in human characteristics and ensures that workstations can be adapted to accommodate
workers with various needs, such as those with disabilities or differing physical characteris-
tics. When integrating humans into the manufacturing process, especially in the context of
"human in the loop," ergonomics becomes an integral part of the design process. It ensures
that the interaction between humans and machines is optimized for both efficiency and the
well-being of the workers [89]. In an era of increasing automation and human-robot collab-
oration, paying attention to ergonomics is not just a matter of comfort; it is a fundamental
component of creating a safe, productive, and sustainable manufacturing environment.
Collaborative workspaces are typically divided into zones in which robot and human can operate without endangering each other. Advanced sensors and control systems
are employed to monitor these zones, ensuring that the robot’s movements are controlled to
prevent collisions and maintain the safety of human operators. Moreover, gesture control is
emerging as a promising method for facilitating human-robot collaboration. By using intu-
itive gestures and commands, humans can interact with robots and convey their intentions
effectively. This technology enables seamless communication and coordination between hu-
mans and robots in shared workspaces, further enhancing the collaborative nature of HRC.
As HRC continues to evolve, researchers and manufacturers are exploring innovative ways to
optimize the interaction between humans and robots, aiming to unlock the full potential of
this collaborative approach for advanced manufacturing processes. Figure 14 showcases the
process of recording and capturing data from a collaborative robot (cobot) operating within
a manufacturing environment. Sensors embedded in the cobot monitor its movements, inter-
actions, and performance metrics in real-time. This data logging process provides valuable
insights into the cobot’s behavior and performance, facilitating analysis, optimization, and
troubleshooting. By visualizing this process, stakeholders can enhance cobot functionality
and productivity in manufacturing operations.
3. Time Measurement: To accurately measure the time-of-flight, the sensor uses a
high-speed electronic shutter. This shutter opens and closes rapidly, allowing the sensor to
precisely record the time delay between the emitted signal and its return [96].
4. Depth Calculation: By knowing the speed of light (approximately 299,792,458
meters per second) and the time it took for the signal to bounce back, the sensor calculates
the distance to the object. This process is repeated many times, creating a 3D point cloud
or depth map of the objects in the sensor's field of view (a one-line numeric sketch of this calculation follows this list).
5. Applications: 3D Time-of-Flight technology has a wide range of applications. It is
commonly used in robotics for obstacle avoidance and navigation, in automotive for driver
assistance systems, and in consumer electronics for features like facial recognition and gesture
control. It’s also valuable in industrial settings for quality control and object detection.
6. Advantages: ToF sensors offer several advantages, including fast and real-time depth
sensing, accuracy, and the ability to work in various lighting conditions. They are compact
and suitable for integration into various devices. [97]
7. Challenges: While ToF technology is powerful, it may have limitations in terms of
long-range sensing and accuracy in certain conditions, such as extremely bright sunlight.
Additionally, the resolution of ToF sensors may vary, impacting the level of detail in the
generated 3D maps.
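The distance computation flagged in the depth-calculation item above reduces to halving the product of the speed of light and the measured round-trip time. A one-function Python sketch, with an assumed example round-trip time:

```python
# Sketch of the time-of-flight depth calculation described above:
# distance = (speed of light * round-trip time) / 2.

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface in metres."""
    return C * round_trip_seconds / 2.0

# A 6.67-nanosecond round trip corresponds to roughly one metre:
print(f"{tof_distance(6.67e-9):.3f} m")
```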
3. Applications: RGB-D cameras have a wide range of applications across industries,
including:
- Computer Vision: They are used in computer vision tasks like object recognition, track-
ing, and scene understanding.
- Robotics: RGB-D cameras help robots navigate and interact with their surroundings,
avoiding obstacles and grasping objects.
- Gaming: Many gaming consoles and devices use RGB-D cameras for motion sensing,
gesture control, and augmented reality experiences [99].
- Augmented Reality (AR) and Virtual Reality (VR): AR and VR applications ben-
efit from RGB-D cameras by providing realistic 3D environments and interactions.
- Industrial Automation: RGB-D cameras are used for quality control, object detection,
and assembly line automation.
4. Advantages: RGB-D cameras offer several advantages, including real-time depth sensing, versatility in various lighting conditions, and the ability to capture rich 3D data. They are also relatively compact and can be integrated into a wide range of devices.
5. Challenges: While RGB-D cameras are powerful, they may have limitations in
terms of range and accuracy, especially in outdoor or bright environments. Additionally,
the quality of depth data can vary among different camera models. In summary, RGB-D
cameras are sensor devices that capture both color (RGB) and depth information from the
surrounding environment. They are valuable tools for a broad spectrum of applications,
from computer vision and robotics to gaming and augmented reality, enabling enhanced
perception and interaction with the physical world.
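To make the depth side of RGB-D sensing concrete, the sketch below recovers a 3D point in the camera frame from a single depth pixel using the standard pinhole model. The focal lengths and principal point are assumed example intrinsics, not the parameters of any particular camera.

```python
# A minimal sketch of recovering a 3D point from an RGB-D depth pixel
# using the standard pinhole model; the intrinsics below are assumed
# example values, not those of any particular camera.

def deproject(u: int, v: int, depth_m: float,
              fx: float = 600.0, fy: float = 600.0,
              cx: float = 320.0, cy: float = 240.0):
    """Map pixel (u, v) with depth Z to a camera-frame point (X, Y, Z)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return x, y, depth_m

# Pixel near the image centre at 1.5 m depth:
print(deproject(340, 250, 1.5))   # small lateral offsets, Z = 1.5 m
```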
7.5. LiDAR
LiDAR (Light Detection and Ranging) is an invaluable tool for object detection and
environmental mapping within the manufacturing context. LiDAR sensors operate by emit-
ting laser pulses and measuring the time it takes for these pulses to bounce off surfaces and
return to the sensor. This information is then used to generate detailed point clouds in both
2D and 3D formats, where each point corresponds to a surface that the laser has interacted
with. In manufacturing environments, LiDAR serves various critical purposes:
1. Obstacle Detection and Collision Avoidance: LiDAR sensors are adept at detect-
ing objects, machinery, and potential obstacles in real-time. They are frequently employed
on mobile robotic platforms, ensuring safe navigation and proactive collision avoidance as
robots move through dynamic manufacturing spaces.
2. Simultaneous Localization and Mapping (SLAM): LiDAR plays a pivotal role
in SLAM, a fundamental technique for mobile robotic platforms. By continuously measuring
distances to surrounding objects and surfaces, LiDAR sensors help robots simultaneously
create detailed maps of their environment and determine their precise location within those
maps. This capability is indispensable for autonomous robotic systems, enabling them to
navigate and perform tasks with accuracy [100][101] (a minimal scan-conversion sketch follows this list).
3. Quality Control and Inspection: LiDAR’s ability to capture precise 3D data
makes it invaluable for quality control and inspection processes within manufacturing. It
can be used to assess the dimensions, shapes, and surface conditions of products, ensuring
they meet stringent quality standards.
4. Real-time Monitoring: LiDAR sensors can provide real-time monitoring of manu-
facturing processes. By continuously scanning the environment, they enable quick detection
of any deviations or anomalies, allowing for immediate corrective actions and enhancing
process efficiency.
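As referenced in the SLAM item above, most LiDAR processing begins by converting range readings at known beam angles into Cartesian points. A minimal Python sketch of that conversion, with illustrative scan values:

```python
# A minimal sketch converting a planar LiDAR scan (ranges at known beam
# angles) into 2D Cartesian points, the raw material for the point clouds
# and SLAM maps discussed above. Angles and ranges here are illustrative.

import math

def scan_to_points(ranges, angle_min, angle_increment):
    """Convert a list of range readings (m) to (x, y) points in the
    sensor frame, skipping invalid (non-finite or zero) returns."""
    points = []
    for i, r in enumerate(ranges):
        if not math.isfinite(r) or r <= 0.0:
            continue
        theta = angle_min + i * angle_increment
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

scan = [1.0, 1.1, float("inf"), 0.9]          # four example beams
print(scan_to_points(scan, -math.pi / 4, math.pi / 6))
```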
7.6. Radar
RADAR (Radio Detection and Ranging) is another valuable sensing technology used in
manufacturing environments for various applications. Unlike LiDAR, which relies on laser
pulses, RADAR uses radio waves to detect and locate objects. Here’s some key information
about RADAR and its applications in manufacturing:
1. Working Principle: RADAR systems emit radio waves in the form of electromag-
netic signals. These signals travel through the environment until they encounter an object.
When radio waves strike an object, some of the energy is reflected back to the RADAR re-
ceiver. By analyzing the time it takes for the signal to return and its Doppler shift (change in
frequency due to the object’s motion), RADAR systems can determine the distance, speed,
and direction of objects in their vicinity (a small numeric sketch follows at the end of this section).
2. Object Detection and Tracking: In manufacturing, RADAR is often used for
object detection and tracking. It can identify moving machinery, vehicles, or personnel
within a designated area. This information is crucial for ensuring the safety of workers and
coordinating the movements of autonomous mobile robots and vehicles [102].
3. Level Sensing: RADAR sensors can be employed for level sensing applications, such
as measuring the fill level of containers or tanks containing liquids or granular materials. This
is important in industries like chemical processing, food and beverage, and pharmaceuticals,
where precise inventory management is essential.
4. Material Characterization: RADAR technology can be used to analyze the com-
position and characteristics of materials, especially in non-destructive testing (NDT) ap-
plications. It can detect defects, voids, or inconsistencies in materials such as composites,
concrete, or metal components, helping to ensure product quality and structural integrity.
5. Environmental Monitoring: RADAR sensors can be used for monitoring environ-
mental conditions, such as measuring air humidity, rainfall, or snowfall. This data can be
valuable in optimizing manufacturing processes and maintaining safe working conditions.
6. Security and Surveillance: RADAR is employed in security systems to detect
and track intruders or unauthorized personnel in manufacturing facilities. It provides an
additional layer of security, especially in large industrial complexes.
7. Navigation and Collision Avoidance: Autonomous vehicles and robots in manufacturing benefit from RADAR technology for navigation and collision avoidance. It helps them detect and respond to obstacles or other moving objects in real-time, ensuring safe and efficient operation.
Overall, RADAR technology adds another dimension to sensing
and monitoring capabilities in manufacturing, offering reliable object detection, tracking,
and environmental data collection that can enhance safety, efficiency, and quality control in
various industrial settings.
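The sketch below, referenced in item 1, shows the two basic RADAR computations on assumed example values: range from the round-trip time, and radial speed from the Doppler shift of a monostatic radar with an assumed 24 GHz carrier.

```python
# A sketch of the two RADAR measurements described in item 1: range from
# the round-trip time, and radial speed from the Doppler shift of a
# monostatic radar. Carrier frequency and readings are example values.

C = 299_792_458.0          # speed of light, m/s
F_CARRIER = 24.0e9         # assumed 24 GHz industrial radar

def radar_range(round_trip_s: float) -> float:
    """Distance to the target from the signal's round-trip time."""
    return C * round_trip_s / 2.0

def radial_speed(doppler_shift_hz: float) -> float:
    """v = f_d * c / (2 * f0); positive means the target is approaching."""
    return doppler_shift_hz * C / (2.0 * F_CARRIER)

print(f"range : {radar_range(66.7e-9):.1f} m")     # ~10 m
print(f"speed : {radial_speed(160.0):.2f} m/s")    # ~1 m/s approach
```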
7.7. Smart Tiles
Smart Tiles is a term commonly used to refer to a type of self-adhesive wall tile designed
for easy and quick installation in various settings, such as kitchens and bathrooms. These
tiles are often used as an alternative to traditional ceramic or glass tiles because they are
generally more affordable, lightweight, and simpler to install. Here’s some information on
smart tiles:
1. Material: Smart tiles are typically made from a variety of materials, including
composite materials, gel components, and high-quality adhesive layers. These materials are
chosen for their durability and ability to withstand moisture and heat, making them suitable
for kitchen and bathroom applications.
2. Appearance: Smart tiles come in a wide range of styles, colors, and patterns,
mimicking the look of traditional tiles. They can resemble subway tiles, mosaic patterns,
or decorative designs. This variety allows homeowners and designers to choose a style that
complements their decor [103].
3. Installation: One of the key advantages of smart tiles is their ease of installation.
They are self-adhesive, meaning they can be applied directly to a clean, flat surface without
the need for grout or special tools. This DIY-friendly installation process can save both
time and money compared to traditional tile installation.
4. Maintenance: Smart tiles are relatively low-maintenance. They are resistant to
water and heat, which makes them suitable for kitchen backsplashes and shower walls. To
clean them, simply wipe with a damp cloth or a mild cleaning solution.
5. Durability: While smart tiles are durable and long-lasting, they may not be as
resilient as traditional ceramic or glass tiles. They are best suited for areas that don’t
experience heavy wear and tear, such as kitchen backsplashes and bathroom walls. They
may not be recommended for high-traffic flooring applications.
6. Cost: Smart tiles are generally more cost-effective than traditional tiles. They are
a budget-friendly option for those looking to update the appearance of a room without a
significant investment in materials and labor.
7. Removal: If you decide to change your decor or replace the tiles, smart tiles are
usually removable without causing damage to the wall surface. This is another advantage
for DIY enthusiasts.
8. Availability: Smart tiles are readily available in home improvement stores, and there
are also many online retailers that offer a wide selection of styles and colors.
7.8. Wearables
Wearables have found a significant role in enhancing safety across various applications,
leveraging their IoT connectivity and sensor capabilities. These devices are designed to
proactively monitor, detect, and respond to safety-related concerns in real-time. Some key
safety applications of wearables include:
1. Fall Detection: Wearables equipped with accelerometers and gyroscopes can ac-
curately detect sudden falls or impacts. These devices are particularly valuable for elderly
individuals or those at risk of accidents. When a fall is detected, the wearable can send alerts
or notifications to designated contacts or emergency services, ensuring prompt assistance (a minimal threshold sketch of this detection logic appears after this list).
2. Drowsiness Detection: Some wearables, especially those used in automotive and
industrial settings, incorporate sensors that monitor the wearer’s level of alertness and signs
of drowsiness. These devices can provide timely warnings to prevent accidents caused by
driver fatigue or inattentiveness [104].
3. Environmental Monitoring: Wearables with environmental sensors can track fac-
tors like air quality, temperature, humidity, and even radiation levels. This data can be
invaluable for workers in hazardous environments, helping them make informed decisions to
protect their health and safety.
4. Emergency Alerts: Wearables can be programmed to trigger emergency alerts in
critical situations. For example, if a wearer is in distress or encounters a dangerous situation,
they can activate an SOS signal that notifies emergency responders or designated contacts
with their precise location.
5. Occupational Safety: In industrial and construction settings, wearables can monitor
workers’ vital signs, ambient conditions, and exposure to hazards. This data can be used
to assess and improve workplace safety, ensuring compliance with safety regulations and
reducing the risk of accidents.
6. Personal Security: Wearables can serve as personal security devices, offering fea-
tures like panic buttons or discreet distress signals. These devices can help individuals,
especially those in vulnerable situations, quickly summon assistance when they feel threat-
ened.
7. Location Tracking: GPS and location tracking sensors in wearables enable real-
time location monitoring. This is particularly valuable for activities like hiking, camping, or
adventure sports, ensuring that users can be located in case of emergencies or getting lost.
[105]
8. Medical Alerts: Some wearables are designed to monitor specific health conditions,
such as heart rate irregularities or glucose levels. In case of health emergencies, these devices
can send alerts to medical professionals or caregivers, enabling swift intervention.
9. Child Safety: Wearable devices for children often include features like geofencing, allowing parents to define safe boundaries for their kids (the second sketch after this list shows the underlying distance test). If a child crosses these boundaries, parents receive notifications, enhancing child safety.
10. Occupational Health and Safety (OHS): In the workplace, wearables can be
integrated into OHS programs to monitor and improve employee safety, minimize accidents,
and enhance overall well-being.
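To make the fall-detection mechanism in item 1 concrete, the following is a minimal sketch of a threshold-based detector operating on raw accelerometer samples. The free-fall and impact thresholds, the time window, and the AccelSample structure are illustrative assumptions rather than the parameters of any specific wearable platform; deployed devices typically combine such heuristics with gyroscope data and learned classifiers to reduce false alarms.

import math
from dataclasses import dataclass

G = 9.81  # gravitational acceleration, m/s^2

@dataclass
class AccelSample:
    t: float  # timestamp in seconds
    x: float  # acceleration along each axis, m/s^2
    y: float
    z: float

def magnitude(s: AccelSample) -> float:
    """Total acceleration magnitude, independent of device orientation."""
    return math.sqrt(s.x ** 2 + s.y ** 2 + s.z ** 2)

def detect_fall(samples, free_fall_g=0.4, impact_g=2.5, max_gap_s=1.0):
    """Flag a fall when a near-free-fall phase (magnitude well below 1 g)
    is followed within max_gap_s seconds by a hard impact (magnitude well
    above 1 g). Thresholds are illustrative and would be tuned per device
    and wearer."""
    free_fall_time = None
    for s in samples:
        m = magnitude(s)
        if m < free_fall_g * G:
            free_fall_time = s.t  # candidate free-fall phase
        elif free_fall_time is not None and m > impact_g * G:
            if s.t - free_fall_time <= max_gap_s:
                return True  # free fall followed promptly by impact
            free_fall_time = None  # impact came too late; reset
    return False

On a True result the wearable would raise the alert described in item 1; a common refinement is to additionally require a short period of post-impact inactivity before alarming, which further suppresses false positives.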
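The geofencing feature in item 9 reduces, in its simplest circular form, to a distance test between the wearer's GPS fix and the centre of a safe zone. The haversine formula below is standard; the 500 m radius, the example coordinates, and the notification step are illustrative assumptions.

import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two (lat, lon) points in degrees."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def outside_geofence(fix, centre, radius_m):
    """True when a GPS fix (lat, lon) lies outside a circular safe zone."""
    return haversine_m(fix[0], fix[1], centre[0], centre[1]) > radius_m

# Example: check a fix against a hypothetical 500 m safe zone.
if outside_geofence((52.5205, 13.4105), (52.5200, 13.4050), 500):
    print("Geofence breached - notify designated contacts")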
7.9. Occupancy Maps
Occupancy maps represent the environment as a grid of cells, each holding the probability that the cell is occupied. Autonomous robots use occupancy maps for path planning, obstacle avoidance, and navigation; AR applications use them to understand and interact with the real-world environment; simulations use them for testing and validation of autonomous systems; and intrusion detection systems use them to monitor and detect unauthorized access.
Several challenges accompany their use. Sensor measurements can be noisy, leading to uncertainty in occupancy estimates. Continuous updates are required to maintain an accurate representation of the environment as it changes, and high-resolution maps can be computationally intensive and may require substantial memory.
Occupancy maps are often visualized as grayscale or color maps, where lighter shades represent lower occupancy probabilities (unoccupied space) and darker shades represent higher occupancy probabilities (occupied space). They play a crucial role in the perception and decision-making processes of autonomous systems, allowing them to operate safely and effectively in complex and dynamic environments, and they are a fundamental component of robotics and computer vision, enabling machines to understand and navigate the world around them.
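To illustrate how such probabilistic estimates are maintained under sensor noise, the following is a minimal sketch of the standard log-odds update for a 2D occupancy grid. The inverse-sensor-model probabilities (0.7 for a cell observed occupied, 0.4 for a cell observed free) are illustrative assumptions; real systems derive them from sensor characterization, and memory-efficient representations such as the Gaussian mixture maps of [109] address the storage cost noted above.

import numpy as np

def logit(p: float) -> float:
    return np.log(p / (1.0 - p))

class OccupancyGrid:
    """Minimal 2D occupancy grid storing log-odds of occupancy per cell.
    Cells start at p = 0.5 (log-odds 0), i.e. unknown."""

    def __init__(self, width: int, height: int):
        self.logodds = np.zeros((height, width))

    def update(self, row: int, col: int, observed_occupied: bool,
               p_hit: float = 0.7, p_miss: float = 0.4):
        # Bayesian update in log-odds form: additions replace multiplications
        # of probabilities, keeping the update simple and numerically stable.
        self.logodds[row, col] += logit(p_hit if observed_occupied else p_miss)

    def probabilities(self) -> np.ndarray:
        """Convert log-odds back to occupancy probabilities, e.g. for the
        grayscale visualization described above (light = free, dark = occupied)."""
        return 1.0 - 1.0 / (1.0 + np.exp(self.logodds))

# Repeated noisy observations of the same cell gradually firm up its estimate.
grid = OccupancyGrid(100, 100)
for _ in range(5):
    grid.update(10, 20, observed_occupied=True)
print(grid.probabilities()[10, 20])  # approaches 1.0 as evidence accumulates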
8. Discussion
Spatiotemporal (space-temporal) data is a crucial type of information that plays a significant role in many fields, from scientific research to practical applications in everyday life. It represents how different variables or attributes change over both space and time. Its characteristics, applications, challenges, and future trends are summarized below:
1. Characteristics of Space-Temporal Data:
- Spatial Dimension: Space-temporal data includes information related to different locations or spatial points. This spatial dimension can be represented in two or three dimensions, depending on the application.
- Temporal Dimension: The data also incorporates the aspect of time, indicating how the attributes or variables change over time, on scales ranging from seconds to years depending on the domain.
2. Applications in Various Fields:
- Environmental Monitoring: Spatiotemporal data is crucial in monitoring and managing environmental phenomena such as climate change, air quality, and natural disasters. For example, it helps scientists study the movement of weather patterns and track changes in temperature and precipitation over time and across different regions.
- Transportation and Urban Planning: In urban areas, this data is vital for traffic management, public transportation optimization, and urban planning. It can help optimize routes, reduce congestion, and improve the overall transportation system.
- Healthcare: Spatiotemporal data is used for disease tracking, patient monitoring, and resource allocation [110]. For instance, it aids in tracking the spread of diseases like COVID-19 and helps hospitals allocate resources efficiently during emergencies.
- Agriculture: Farmers use spatiotemporal data to make informed decisions about crop planting, irrigation, and pest control. It enables precision agriculture by tailoring actions to specific spatial and temporal conditions.
- Natural Resource Management: Governments and organizations use spatiotemporal data to manage and conserve natural resources, such as forests, water bodies, and wildlife. It aids in tracking deforestation, water quality changes, and wildlife migration patterns.
3. Challenges in Analyzing Space-Temporal Data:
- Data Volume: Space-temporal data can be massive due to the continuous collection of information over large geographical areas and extended periods. Handling and processing such large volumes of data require robust infrastructure and algorithms.
- Data Quality: Ensuring the quality and accuracy of spatiotemporal data can be challenging. Errors in data collection, sensor malfunctions, and missing values can affect the reliability of analyses.
- Data Integration: Combining different sources of space-temporal data, such as satellite imagery, weather stations, and IoT devices, often requires complex data integration techniques to create a comprehensive dataset.
- Analytical Complexity: Analyzing space-temporal data often involves advanced statistical and machine learning techniques, and researchers and analysts must account for the spatial and temporal dependencies in the data.
4. Future Trends:
- AI and Machine Learning: Advances in artificial intelligence and machine learning have opened up new possibilities for analyzing and predicting spatiotemporal phenomena; predictive modeling and anomaly detection are becoming increasingly important.
- IoT and Sensor Networks: The proliferation of IoT devices and sensor networks is generating vast amounts of real-time space-temporal data. This trend is likely to continue, enabling more precise monitoring and decision-making.
- Data Visualization: Visualizing space-temporal data is essential for conveying insights to a broader audience, and interactive and immersive visualization tools are emerging to help individuals and organizations better understand complex spatiotemporal patterns.
Using advanced sensing technologies such as 3D RGB-D cameras, or convolutional neural networks (CNNs) applied to 2D RGB cameras through libraries like OpenPose, it is possible to detect and record human skeleton data [111]. These technologies enable the calculation of the human center line coordinate (HCLC), a crucial component in human pose estimation. One promising application of such technology is the detection of falls, especially in industrial settings like manufacturing floors. Falls and other occupational incidents pose significant risks in these environments, and a robust human detection system that records human poses offers several opportunities (a minimal sketch of the underlying pose check follows this list):
1. Fall Detection: By continuously monitoring the poses of workers on the manufac-
turing floor, the system can identify sudden and abnormal changes in body positions that
indicate a fall. This includes detecting when a worker collapses or loses balance.
2. Collision Detection: In addition to falls, the system can also be configured to
detect collisions or other sudden, unexpected movements. This can help prevent accidents
and injuries resulting from machinery or equipment collisions.
3. Immediate Alarm Trigger: When a fall or collision is detected, the system can
trigger an immediate alarm. This alarm can be both audible and visible, alerting nearby
workers and supervisors to the incident.
4. Faster Assistance: Early detection of falls or incidents is critical for providing
prompt medical assistance. For example, if a worker falls and injures themselves or expe-
riences a medical emergency like a heart attack, the system’s alarm can notify responders
quickly, potentially saving lives or minimizing the severity of injuries.
May 3, 2024
5. Upper Body Coordination: To enhance the accuracy of fall detection, it’s impor-
tant to monitor the upper body coordinates with respect to the HCLC. In the event of a
fall, the upper body’s position will deviate significantly from the norm, allowing for more
precise detection.
6. Data Logging and Analysis: Beyond immediate alerts, the system can log data
for later analysis. This data can be invaluable for investigating incidents, understanding
the causes of falls or collisions, and implementing preventive measures to enhance workplace
safety.
7. Continuous Monitoring: The system can provide 24/7 monitoring, ensuring that
incidents are detected even during non-working hours or when workers are alone on the
manufacturing floor.
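As a minimal sketch of the pose check underlying items 1 and 5, the function below estimates the tilt of the human centre line from 2D skeleton keypoints such as those produced by OpenPose. Defining the HCLC as the segment from the mid-shoulder point to the mid-hip point, the keypoint names, and the 60-degree tilt threshold are illustrative assumptions for this sketch rather than a method fixed in the cited literature.

import math

# Keypoints indexed by illustrative names: (x, y) in image coordinates,
# with y increasing downwards, e.g. extracted from an OpenPose skeleton.
def hclc_tilt_deg(kp: dict) -> float:
    """Angle between the human centre line (mid-shoulder to mid-hip)
    and the vertical axis: 0 deg = upright, 90 deg = horizontal."""
    shoulder_mid = ((kp["l_shoulder"][0] + kp["r_shoulder"][0]) / 2,
                    (kp["l_shoulder"][1] + kp["r_shoulder"][1]) / 2)
    hip_mid = ((kp["l_hip"][0] + kp["r_hip"][0]) / 2,
               (kp["l_hip"][1] + kp["r_hip"][1]) / 2)
    dx = shoulder_mid[0] - hip_mid[0]
    dy = hip_mid[1] - shoulder_mid[1]  # positive when shoulders are above hips
    return abs(math.degrees(math.atan2(dx, dy)))

def is_fall(kp: dict, tilt_threshold_deg: float = 60.0) -> bool:
    """Flag a fall when the upper body deviates strongly from vertical."""
    return hclc_tilt_deg(kp) > tilt_threshold_deg

pose = {"l_shoulder": (90, 200), "r_shoulder": (110, 205),
        "l_hip": (95, 300), "r_hip": (112, 302)}
print(is_fall(pose))  # upright pose -> False

A deployed detector would track this angle across frames so that only rapid, sustained deviations trigger the alarm of item 3, which also helps suppress false positives from bending or crouching.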
However, while the potential benefits of such a system are significant, there are also
important considerations, such as privacy concerns, data security, and the need for worker
consent. It’s crucial to implement these technologies responsibly and in compliance with
relevant regulations and ethical guidelines. Additionally, the system should be designed to
minimize false alarms and be sensitive to different working conditions and environments to
ensure its effectiveness in enhancing workplace safety.
9. Conclusion
The study underscores the critical role of human detection technology in industrial man-
ufacturing, particularly in the context of Human Robot Collaboration (HRC). It serves as
a linchpin for enhancing safety, efficiency, and compliance within modern industrial prac-
tices. Through a thorough examination of the current state of this technology, the following
conclusions can be drawn:
1. Pivotal for Safety and Efficiency: Human detection technology is not only crucial
for ensuring the safety of workers but also for optimizing industrial efficiency. By preventing
accidents and enhancing the interaction between humans and robots, it supports cost savings
and technological progress.
2. Existing Shortcomings: Despite its significance, the technology faces several chal-
lenges. Issues related to accuracy and reliability, the occurrence of false alarms, adaptability
in dynamic environments, and the cost of implementation are current shortcomings that
need to be addressed.
3. Integration in Industry 4.0: The study highlights the importance of integrating
human operators into the Industry 4.0 framework, where advanced automated manufacturing
systems redefine the roles of humans and robots. This integration presents new opportunities
and challenges that require careful examination. [112]
4. Emphasis on 3D Spatial Awareness: In the context of Industry 4.0, 3D spatial
awareness emerges as a pivotal factor for enabling efficient and secure collaboration between
humans and robots within shared workspaces. This underscores the need for advanced sensor
technologies and data processing.
5. Comprehensive Analysis: The paper offers a comprehensive analysis of HRC and Human-Robot Interaction (HRI), shedding light on the complexities and nuances of human-robot collaboration within modern manufacturing settings.
6. Technological Exploration: The study extensively explores various technologies
and methodologies for human detection, encompassing both established approaches and
emerging techniques. This in-depth examination reveals the evolving landscape of human
detection technology.
7. Challenges and Opportunities: The multifaceted challenges faced in integrating
humans into industrial settings are balanced with the exciting opportunities for improving
safety and efficiency. This interplay of challenges and opportunities requires ongoing research
and innovation.
8. Reshaping the Future: The paper emphasizes the potential of emerging technolo-
gies to reshape the landscape of human integration in advanced robotics. This suggests that
as technology advances, we can anticipate safer and more efficient manufacturing environ-
ments, ultimately redefining the future of industrial practices.
In conclusion, this paper underscores the significance of human detection technology and
human-robot collaboration within the Industry 4.0 paradigm. It recognizes the challenges
while highlighting the promising avenues for research and application. The future holds the
potential for a transformative impact on industrial manufacturing, making it safer, more
efficient, and adaptable to evolving needs.
10. References
1. Dalenogare, L.S., Benitez, G.B., Ayala, N.F. and Frank, A.G., 2018. The expected
contribution of Industry 4.0 technologies for industrial performance. International
Journal of production economics, 204, pp.383-394.
2. Javaid, M., Haleem, A., Singh, R.P. and Suman, R., 2023. An integrated outlook of
Cyber–Physical Systems for Industry 4.0: Topical practices, architecture, and appli-
cations. Green Technologies and Sustainability, 1(1), p.100001.
3. Nahavandi, S., 2019. Industry 5.0—A human-centric solution. Sustainability, 11(16),
p.4371.
4. Monostori, L., Kádár, B., Bauernhansl, T., Kondoh, S., Kumara, S., Reinhart, G.,
Sauer, O., Schuh, G., Sihn, W. and Ueda, K., 2016. Cyber-physical systems in manu-
facturing. Cirp Annals, 65(2), pp.621-641.
5. Singh, H., 2021. Big data, industry 4.0 and cyber-physical systems integration: A
smart industry context. Materials Today: Proceedings, 46, pp.157-162.
6. Ibarra, D., Ganzarain, J. and Igartua, J.I., 2018. Business model innovation through
Industry 4.0: A review. Procedia manufacturing, 22, pp.4-10.
7. Javaid, M., Haleem, A., Singh, R.P. and Suman, R., 2021. Substantial capabilities of
robotics in enhancing industry 4.0 implementation. Cognitive Robotics, 1, pp.58-75.
8. Dixon, J., Hong, B. and Wu, L., 2021. The robot revolution: Managerial and employ-
ment consequences for firms. Management Science, 67(9), pp.5586-5605.
9. Teiwes, J., Bänziger, T., Kunz, A. and Wegener, K., 2016, September. Identifying
the potential of human-robot collaboration in automotive assembly lines using a stan-
dardised work description. In 2016 22nd International Conference on Automation and
Computing (ICAC) (pp. 78-83). IEEE.
10. Gualtieri, L., Rauch, E. and Vidoni, R., 2021. Emerging research fields in safety
and ergonomics in industrial collaborative robotics: A systematic literature review.
Robotics and Computer-Integrated Manufacturing, 67, p.101998.
11. Amorim, A., Guimarães, D., Mendonça, T., Neto, P., Costa, P. and Moreira, A.P., 2021. Robust human position estimation in cooperative robotic cells. Robotics and Computer-Integrated Manufacturing, 67, p.102035.
12. Munaro, M., Lewis, C., Chambers, D., Hvass, P. and Menegatti, E., 2016. RGB-D
human detection and tracking for industrial environments. In Intelligent Autonomous
Systems 13: Proceedings of the 13th International Conference IAS-13 (pp. 1655-1668).
Springer International Publishing.
13. Chemweno, P. and Torn, R.J., 2022. Innovative safety zoning for collaborative robots
utilizing Kinect and LiDAR sensory approaches. Procedia CIRP, 106, pp.209-214.
14. Vogel-Heuser, B. and Hess, D., 2016. Guest editorial Industry 4.0–prerequisites and
visions. IEEE Transactions on automation Science and Engineering, 13(2), pp.411-413.
15. Xu, X., Lu, Y., Vogel-Heuser, B. and Wang, L., 2021. Industry 4.0 and Industry
5.0—Inception, conception and perception. Journal of manufacturing systems, 61,
pp.530-535.
16. Maddikunta, P.K.R., Pham, Q.V., Prabadevi, B., Deepa, N., Dev, K., Gadekallu, T.R.,
Ruby, R. and Liyanage, M., 2022. Industry 5.0: A survey on enabling technologies and
potential applications. Journal of Industrial Information Integration, 26, p.100257.
17. Leng, J., Sha, W., Wang, B., Zheng, P., Zhuang, C., Liu, Q., Wuest, T., Mourtzis, D.
and Wang, L., 2022. Industry 5.0: Prospect and retrospect. Journal of Manufacturing
Systems, 65, pp.279-295.
18. Longo, F., Padovano, A. and Umbrello, S., 2020. Value-oriented and ethical technology
engineering in industry 5.0: A human-centric perspective for the design of the factory
of the future. Applied Sciences, 10(12), p.4182.
19. Battini, D., Berti, N., Finco, S., Zennaro, I. and Das, A., 2022. Towards industry 5.0:
A multi-objective job rotation model for an inclusive workforce. International Journal
of Production Economics, 250, p.108619.
20. Bitsch, G., 2022. Conceptions of man in human-centric cyber-physical production
systems. Procedia CIRP, 107, pp.1439-1443.
21. Wang, L., 2022. A futuristic perspective on human-centric assembly. Journal of Man-
ufacturing Systems, 62, pp.199-201.
22. Li, S., Wang, R., Zheng, P. and Wang, L., 2021. Towards proactive human–robot
collaboration: A foreseeable cognitive manufacturing paradigm. Journal of Manufac-
turing Systems, 60, pp.547-552.
23. Wang, H., Lv, L., Li, X., Li, H., Leng, J., Zhang, Y., Thomson, V., Liu, G., Wen, X., Sun, C. and Luo, G., 2023. A safety management approach for Industry 5.0's human-centered manufacturing based on digital twin. Journal of Manufacturing Systems, 66, pp.1-12.
24. Fryman, J., 2014, June. Updating the industrial robot safety standard. In
ISR/Robotik 2014; 41st International Symposium on Robotics (pp. 1-4). VDE.
25. Mitka, E., Gasteratos, A., Kyriakoulis, N. and Mouroutsos, S.G., 2012. Safety certifi-
cation requirements for domestic robots. Safety science, 50(9), pp.1888-1897.
26. Harper, C. and Virk, G., 2010. Towards the development of international safety stan-
dards for human robot interaction. International Journal of Social Robotics, 2(3),
pp.229-234.
27. Chemweno, P., Pintelon, L. and Decre, W., 2020. Orienting safety assurance with
outcomes of hazard analysis and risk assessment: A review of the ISO 15066 standard
for collaborative robot systems. Safety Science, 129, p.104832.
28. Reddy, A., Bright, G. and Padayachee, J., 2019. A Review of Safety Methods for
Human-robot Collaboration and a Proposed Novel Approach. ICINCO (1), pp.243-
248.
29. Dian, F.J., Vahidnia, R. and Rahmati, A., 2020. Wearables and the Internet of
Things (IoT), applications, opportunities, and challenges: A Survey. IEEE access,
8, pp.69200-69211.
30. Jeong, S., Kang, S. and Chun, I., 2019, June. Human-skeleton based fall-detection
method using LSTM for manufacturing industries. In 2019 34th International Tech-
nical Conference on Circuits/Systems, Computers and Communications (ITC-CSCC)
(pp. 1-4). IEEE.
31. Arshad, I., KR, R.B., Alsamhi, S.H. and Curry, E., 2023, October. EHRCoI4: A
Novel Framework for Enhancing Human-Robot Collaboration in Industry 4.0. In 2023
3rd International Conference on Emerging Smart Technologies and Applications (eS-
marTA) (pp. 1-6). IEEE.
32. Fu, Y., Chen, J. and Lu, W., 2024. Human-robot collaboration for modular construc-
tion manufacturing: Review of academic research. Automation in Construction, 158,
p.105196.
33. Marinelli, M., 2023. From Industry 4.0 to Construction 5.0: Exploring the Path
towards Human–Robot Collaboration in Construction. Systems, 11(3), p.152.
34. Panagou, S., Neumann, W.P. and Fruggiero, F., 2023. A scoping review of human
robot interaction research towards Industry 5.0 human-centric workplaces. Interna-
tional Journal of Production Research, pp.1-17.
35. Caiazzo, C., Savković, M., Pušica, M., Nikolic, N., Milojevic, D. and Djapan, M., 2023.
Framework of a Neuroergonomic Assessment in Human-Robot Collaboration.
36. Ciccarelli, M., Papetti, A. and Germani, M., 2023. Exploring how new industrial
paradigms affect the workforce: A literature review of Operator 4.0. Journal of Man-
ufacturing Systems, 70, pp.464-483.
37. Feddoul, Y., Ragot, N., Duval, F., Havard, V., Baudry, D. and Assila, A., 2023. Ex-
ploring human-machine collaboration in industry: a systematic literature review of
digital twin and robotics interfaced with extended reality technologies. The Interna-
tional Journal of Advanced Manufacturing Technology, pp.1-16.
38. Zhang, Y., Ding, K., Hui, J., Liu, S., Guo, W. and Wang, L., 2024. Skeleton-RGB inte-
grated highly similar human action prediction in human–robot collaborative assembly.
Robotics and Computer-Integrated Manufacturing, 86, p.102659.
39. Caiazzo, C., Savkovic, M., Pusica, M., Milojevic, D., Leva, M.C. and Djapan, M.,
2023. Development of a Neuroergonomic Assessment for the Evaluation of Mental
Workload in an Industrial Human–Robot Interaction Assembly Task: A Comparative
Case Study. Machines, 11(11), p.995.
40. Khosravy, M., Gupta, N., Pasquali, A., Dey, N., Crespo, R.G. and Witkowski, O., 2023.
Human-Collaborative Artificial Intelligence Along With Social Values in Industry 5.0:
A Survey of the State-of-the-Art. IEEE Transactions on Cognitive and Developmental
Systems.
41. Alimam, H., Mazzuto, G., Tozzi, N., Ciarapica, F.E. and Bevilacqua, M., 2023. The
Resurrection of Digital Triplet: A Cognitive Pillar of Human-Machine Integration at
the Dawn of Industry 5.0. Journal of King Saud University-Computer and Information
Sciences, p.101846.
42. Di Pasquale, V., De Simone, V., Giubileo, V. and Miranda, S., 2023. A taxonomy of
factors influencing worker’s performance in human–robot collaboration. IET Collabo-
rative Intelligent Manufacturing, 5(1), p.e12069.
43. Ihrouchen, Z., Souissi, O. and Bensaid, H., 2023, October. Advancing Manufacturing
with Machine Learning: Unlocking the Potential of Reinforcement learning in Industry
4.0. In Industrial Conference of Science and Industry of the Future (ICSIF23).
44. Jacob, F., Grosse, E.H., Morana, S. and König, C.J., 2023. Picking with a robot col-
league: a systematic literature review and evaluation of technology acceptance in hu-
man–robot collaborative warehouses. Computers & Industrial Engineering, p.109262.
45. Zhang, C., Zhou, G., Ma, D., Wang, R., Xiao, J. and Zhao, D., 2023. A deep learning-
enabled human-cyber-physical fusion method towards human-robot collaborative as-
sembly. Robotics and Computer-Integrated Manufacturing, 83, p.102571.
46. Chu, C.H. and Liu, Y.L., 2023. Augmented reality user interface design and experi-
mental evaluation for human-robot collaborative assembly. Journal of Manufacturing
Systems, 68, pp.313-324.
47. Li, S., Zheng, P., Liu, S., Wang, Z., Wang, X.V., Zheng, L. and Wang, L., 2023. Proac-
tive human–robot collaboration: Mutual-cognitive, predictable, and self-organising
perspectives. Robotics and Computer-Integrated Manufacturing, 81, p.102510.
48. Chen, J., Fu, Y., Lu, W. and Pan, Y., 2023. Augmented reality-enabled human-robot
collaboration to balance construction waste sorting efficiency and occupational safety
and health. Journal of Environmental Management, 348, p.119341.
49. Yuan, G., Liu, X., Zhang, C., Pham, D.T. and Li, Z., 2023. A new heuristic algorithm based on multi-criteria resilience assessment of human–robot collaboration disassembly for supporting spent lithium-ion battery recycling. Engineering Applications of Artificial Intelligence, 126, p.106878.
50. Terreran, M., Barcellona, L. and Ghidoni, S., 2023. A general skeleton-based action
and gesture recognition framework for human–robot collaboration. Robotics and Au-
tonomous Systems, 170, p.104523.
51. Ghodsian, N., Benfriha, K., Olabi, A., Gopinath, V., Talhi, E., Hof, L.A. and Arnou,
A., 2023. A framework to integrate mobile manipulators as cyber–physical systems into
existing production systems in the context of industry 4.0. Robotics and Autonomous
Systems, 169, p.104526.
52. Ni, S., Zhao, L., Li, A., Wu, D. and Zhou, L., 2023. Cross-View Human Intention
Recognition for Human-Robot Collaboration. IEEE Wireless Communications, 30(3),
pp.189-195.
53. Roda-Sanchez, L., Garrido-Hidalgo, C., García, A.S., Olivares, T. and Fernández-
Caballero, A., 2023. Comparison of RGB-D and IMU-based gesture recognition for
human-robot interaction in remanufacturing. The International Journal of Advanced
Manufacturing Technology, 124(9), pp.3099-3111.
54. Lee, M.L., Liang, X., Hu, B., Onel, G., Behdad, S. and Zheng, M., 2023. A Review of
Prospects and Opportunities in Disassembly with Human-Robot Collaboration. Jour-
nal of Manufacturing Science and Engineering, pp.1-26.
55. Li, C., Zheng, P., Yin, Y., Pang, Y.M. and Huo, S., 2023. An AR-assisted Deep
Reinforcement Learning-based approach towards mutual-cognitive safe human-robot
interaction. Robotics and Computer-Integrated Manufacturing, 80, p.102471.
56. Deng, W., Liu, Q., Zhao, F., Pham, D.T., Hu, J., Wang, Y. and Zhou, Z., 2024.
Learning by doing: A dual-loop implementation architecture of deep active learning
and human-machine collaboration for smart robot vision. Robotics and Computer-
Integrated Manufacturing, 86, p.102673.
57. Soori, M., Arezoo, B. and Dastres, R., 2023. Virtual manufacturing in industry 4.0:
A review. Data Science and Management.
58. Kampa, A., 2023. Modeling and Simulation of a Digital Twin of a Production System
for Industry 4.0 with Work-in-Process Synchronization. Applied Sciences, 13(22),
p.12261.
59. Xian, W., Yu, K., Han, F., Fang, L., He, D. and Han, Q.L., 2023. Advanced Manu-
facturing in Industry 5.0: A Survey of Key Enabling Technologies and Future Trends.
IEEE Transactions on Industrial Informatics.
60. Kwolek, B., 2023. Continuous Hand Gesture Recognition for Human-Robot Collabo-
rative Assembly. In Proceedings of the IEEE/CVF International Conference on Com-
puter Vision (pp. 2000-2007).
61. Entezari, A., Aslani, A., Zahedi, R. and Noorollahi, Y., 2023. Artificial intelligence
and machine learning in energy systems: A bibliographic perspective. Energy Strategy
Reviews, 45, p.101017.
62. Singh, N.K., Yadav, M., Singh, V., Padhiyar, H., Kumar, V., Bhatia, S.K. and Show,
P.L., 2023. Artificial intelligence and machine learning-based monitoring and design
of biological wastewater treatment systems. Bioresource technology, 369, p.128486.
63. Liu, L., Guo, F., Zou, Z. and Duffy, V.G., 2024. Application, development and future
opportunities of collaborative robots (cobots) in manufacturing: A literature review.
International Journal of Human–Computer Interaction, 40(4), pp.915-932.
64. Zafar, M.H., Langås, E.F. and Sanfilippo, F., 2024. Exploring the synergies between
collaborative robotics, digital twins, augmentation, and industry 5.0 for smart manu-
facturing: A state-of-the-art review. Robotics and Computer-Integrated Manufactur-
ing, 89, p.102769.
65. Loganathan, A. and Ahmad, N.S., 2023. A systematic review on recent advances in
autonomous mobile robot navigation. Engineering Science and Technology, an Inter-
national Journal, 40, p.101343.
66. Low, E.S., Ong, P. and Low, C.Y., 2023. A modified Q-learning path planning ap-
proach using distortion concept and optimization in dynamic environment for au-
tonomous mobile robot. Computers & Industrial Engineering, 181, p.109338.
67. Upadhyay, A., Balodi, K.C., Naz, F., Di Nardo, M. and Jraisat, L., 2023. Implementing
industry 4.0 in the manufacturing sector: Circular economy as a societal solution.
Computers & Industrial Engineering, 177, p.109072.
68. Bai, C., Zhou, H. and Sarkis, J., 2023. Evaluating Industry 4.0 technology and sus-
tainable development goals–a social perspective. International Journal of Production
Research, 61(23), pp.8094-8114.
69. Barata, J. and Kayser, I., 2023. Industry 5.0–Past, present, and near future. Procedia
Computer Science, 219, pp.778-788.
70. Su, D., Zhang, L., Peng, H., Saeidi, P. and Tirkolaee, E.B., 2023. Technical challenges
of blockchain technology for sustainable manufacturing paradigm in Industry 4.0 era
using a fuzzy decision support system. Technological Forecasting and Social Change,
188, p.122275.
71. Nouinou, H., Asadollahi-Yazdi, E., Baret, I., Nguyen, N.Q., Terzi, M., Ouazene, Y.,
Yalaoui, F. and Kelly, R., 2023. Decision-making in the context of Industry 4.0:
Evidence from the textile and clothing industry. Journal of cleaner production, 391,
p.136184.
72. Zhang, C., Wang, Z., Zhou, G., Chang, F., Ma, D., Jing, Y., Cheng, W., Ding, K.
and Zhao, D., 2023. Towards new-generation human-centric smart manufacturing in
Industry 5.0: A systematic review. Advanced Engineering Informatics, 57, p.102121.
73. Li, X., Nassehi, A., Wang, B., Hu, S.J. and Epureanu, B.I., 2023. Human-centric
manufacturing for human-system coevolution in Industry 5.0. CIRP Annals, 72(1),
pp.393-396.
74. Almeida, J.C., Ribeiro, B. and Cardoso, A., 2023. A human-centric approach to aid
in assessing maintenance from the sustainable manufacturing perspective. Procedia
Computer Science, 220, pp.600-607.
75. Coelho, P., Bessa, C., Landeck, J. and Silva, C., 2023. Industry 5.0: the arising of a
concept. Procedia Computer Science, 217, pp.1137-1144.
76. Golovianko, M., Terziyan, V., Branytskyi, V. and Malyk, D., 2023. Industry 4.0 vs.
Industry 5.0: co-existence, Transition, or a Hybrid. Procedia Computer Science, 217,
pp.102-113.
77. Gladysz, B., Tran, T.A., Romero, D., van Erp, T., Abonyi, J. and Ruppert, T., 2023.
Current development on the Operator 4.0 and transition towards the Operator 5.0:
A systematic literature review in light of Industry 5.0. Journal of Manufacturing
Systems, 70, pp.160-185.
78. Golovianko, M., Terziyan, V., Branytskyi, V. and Malyk, D., 2023. Industry 4.0 vs.
Industry 5.0: co-existence, Transition, or a Hybrid. Procedia Computer Science, 217,
pp.102-113.
79. Qahtan, S., Alsattar, H.A., Zaidan, A.A., Deveci, M., Pamucar, D., Delen, D. and
Pedrycz, W., 2023. Evaluation of agriculture-food 4.0 supply chain approaches using
Fermatean probabilistic hesitant-fuzzy sets based decision making model. Applied Soft
Computing, 138, p.110170.
80. Yao, X., Ma, N., Zhang, J., Wang, K., Yang, E. and Faccio, M., 2024. Enhancing
wisdom manufacturing as industrial metaverse for industry and society 5.0. Journal
of Intelligent Manufacturing, 35(1), pp.235-255.
81. Rehman, S.U., Giordino, D., Zhang, Q. and Alam, G.M., 2023. Twin transitions & in-
dustry 4.0: Unpacking the relationship between digital and green factors to determine
green competitive advantage. Technology in Society, 73, p.102227.
82. Zhang, C., Wang, Z., Zhou, G., Chang, F., Ma, D., Jing, Y., Cheng, W., Ding, K.
and Zhao, D., 2023. Towards new-generation human-centric smart manufacturing in
Industry 5.0: A systematic review. Advanced Engineering Informatics, 57, p.102121.
83. Xiong, Y., Tang, Y., Kim, S. and Rosen, D.W., 2023. Human-machine collaborative
additive manufacturing. Journal of Manufacturing Systems, 66, pp.82-91.
84. Haleem, A., Javaid, M., Singh, R.P., Suman, R. and Khan, S., 2023. Management 4.0:
Concept, applications and advancements. Sustainable Operations and Computers, 4,
pp.10-21.
85. Peter, O., Pradhan, A. and Mbohwa, C., 2023. Industry 4.0 concepts within the
sub–Saharan African SME manufacturing sector. Procedia Computer Science, 217,
pp.846-855.
86. Tiwari, S., Bahuguna, P.C. and Srivastava, R., 2023. Smart manufacturing and sus-
tainability: a bibliometric analysis. Benchmarking: An International Journal, 30(9),
pp.3281-3301.
87. Tetteh, E., Wang, T., Kim, J., Smith, T., Norasi, H., Van Straten, M., Lal, G.,
Chrouser, K., Shao, J.M. and Hallbeck, M.S., 2023. Optimizing ergonomics during
open, laparoscopic, and robotic-assisted surgery: A review of surgical ergonomics lit-
erature and development of educational illustrations. The American journal of surgery.
88. Radin Umar, R.Z., Tiong, J.Y., Ahmad, N. and Dahalan, J., 2023. Development of
framework integrating ergonomics in Lean’s Muda, Muri, and Mura concepts. Pro-
duction Planning & Control, pp.1-9.
89. Liao, L., Liao, K., Wei, N., Ye, Y., Li, L. and Wu, Z., 2023. A holistic evaluation of
ergonomics application in health, safety, and environment management research for
construction workers. Safety science, 165, p.106198.
90. Li, W., Hu, Y., Zhou, Y. and Pham, D.T., 2023. Safe human–robot collaboration for
industrial settings: a survey. Journal of Intelligent Manufacturing, pp.1-27.
91. Chen, J., Fu, Y., Lu, W. and Pan, Y., 2023. Augmented reality-enabled human-robot
collaboration to balance construction waste sorting efficiency and occupational safety
and health. Journal of Environmental Management, 348, p.119341.
92. Li, C., Zheng, P., Yin, Y., Pang, Y.M. and Huo, S., 2023. An AR-assisted Deep
Reinforcement Learning-based approach towards mutual-cognitive safe human-robot
interaction. Robotics and Computer-Integrated Manufacturing, 80, p.102471.
93. Yu, H., Kamat, V.R., Menassa, C.C., McGee, W., Guo, Y. and Lee, H., 2023. Mu-
tual physical state-aware object handover in full-contact collaborative human-robot
construction work. Automation in Construction, 150, p.104829.
94. Wan, P.K. and Leirmo, T.L., 2023. Human-centric zero-defect manufacturing: State-
of-the-art review, perspectives, and challenges. Computers in Industry, 144, p.103792.
95. Javaid, M., Haleem, A., Singh, R.P. and Suman, R., 2023. An integrated outlook of
Cyber–Physical Systems for Industry 4.0: Topical practices, architecture, and appli-
cations. Green Technologies and Sustainability, 1(1), p.100001.
96. Wang, B., Zhou, H., Li, X., Yang, G., Zheng, P., Song, C., Yuan, Y., Wuest, T., Yang,
H. and Wang, L., 2024. Human Digital Twin in the context of Industry 5.0. Robotics
and Computer-Integrated Manufacturing, 85, p.102626.
97. Malburg, L., Hoffmann, M. and Bergmann, R., 2023. Applying MAPE-K control
loops for adaptive workflow management in smart factories. Journal of Intelligent
Information Systems, 61(1), pp.83-111.
98. Singh, S.A., Kumar, A.S. and Desai, K.A., 2023. Comparative assessment of common
pre-trained CNNs for vision-based surface defect detection of machined components.
Expert Systems with Applications, 218, p.119623.
99. Bärnthaler, R. and Gough, I., 2023. Provisioning for sufficiency: envisaging production
corridors. Sustainability: Science, Practice and Policy, 19(1), p.2218690.
100. Rivera, G., Porras, R., Florencia, R. and Sánchez-Solís, J.P., 2023. LiDAR applications
in precision agriculture for cultivating crops: A review of recent advances. Computers
and Electronics in Agriculture, 207, p.107737.
101. Chen, R., Shu, H., Shen, B., Chang, L., Xie, W., Liao, W., Tao, Z., Bowers, J.E.
and Wang, X., 2023. Breaking the temporal and frequency congestion of LiDAR by
parallel chaos. Nature Photonics, 17(4), pp.306-314.
102. Kim, S.H., Lee, S.Y., Zhang, Y., Park, S.J. and Gu, J., 2023. Carbon-Based Radar Ab-
sorbing Materials toward Stealth Technologies. Advanced Science, 10(32), p.2303104.
103. Bahmanziari, S. and Zamani, A.A., 2024. A new framework of piezoelectric smart
tiles based on magnetic plucking, mechanical impact, and mechanical vibration force
mechanisms for electrical energy harvesting. Energy Conversion and Management,
299, p.117902.
104. Sun, Y., Li, Y.Z. and Yuan, M., 2023. Requirements, challenges, and novel ideas for
wearables on power supply and energy harvesting. Nano Energy, 115, p.108715.
105. Powell, D. and Godfrey, A., 2023. Considerations for integrating wearables into the
everyday healthcare practice. NPJ Digital Medicine, 6(1), p.70.
106. Spagnolo, F., Corsonello, P., Frustaci, F. and Perri, S., 2023, September. Approximate
Foveated-Based Super Resolution Method for Headset Displays. In Annual Meeting
of the Italian Electronics Society (pp. 338-344). Cham: Springer Nature Switzerland.
107. Goh, T.L. and Peh, L.S., 2024. WalkingWizard-A truly wearable EEG headset for
everyday use. ACM Transactions on Computing for Healthcare.
108. Wan, W., Sun, W., Zeng, Q., Pan, L. and Xu, J., 2024, January. Progress in ar-
tificial intelligence applications based on the combination of self-driven sensors and
deep learning. In 2024 4th International Conference on Consumer Electronics and
Computer Engineering (ICCECE) (pp. 279-284). IEEE.
109. Li, P.Z.X., Karaman, S. and Sze, V., 2024. GMMap: Memory-Efficient Continuous
Occupancy Map Using Gaussian Mixture Model. IEEE Transactions on Robotics.
110. Abubakar, I.R. and Alshammari, M.S., 2023. Urban planning schemes for developing
low-carbon cities in the Gulf Cooperation Council region. Habitat International, 138,
p.102881.
111. Rahman, M.M. and Thill, J.C., 2023. Impacts of connected and autonomous vehicles
on urban transportation and environment: A comprehensive review. Sustainable Cities
and Society, p.104649.
112. Semeraro, F., Griffiths, A. and Cangelosi, A., 2023. Human–robot collaboration and
machine learning: A systematic review of recent research. Robotics and Computer-
Integrated Manufacturing, 79, p.102432.
113. Robinson, N., Tidd, B., Campbell, D., Kulić, D. and Corke, P., 2023. Robotic vision
for human-robot interaction and collaboration: A survey and systematic review. ACM
Transactions on Human-Robot Interaction, 12(1), pp.1-66.
114. Li, S., Zheng, P., Liu, S., Wang, Z., Wang, X.V., Zheng, L. and Wang, L., 2023. Proac-
tive human–robot collaboration: Mutual-cognitive, predictable, and self-organising
perspectives. Robotics and Computer-Integrated Manufacturing, 81, p.102510.
115. Zhang, M., Xu, R., Wu, H., Pan, J. and Luo, X., 2023. Human–robot collaboration
for on-site construction. Automation in Construction, 150, p.104812.
116. Lee, J., Bennett, C.C., Stanojevic, C., Kim, S., Henkel, Z., Baugus, K., Piatt, J.A.,
Bethel, C. and Sabanovic, S., 2023. Detecting cultural identity via robotic sensor
data to understand differences during human-robot interaction. Advanced Robotics,
37(22), pp.1446-1459.
117. Olugbade, T., He, L., Maiolino, P., Heylen, D. and Bianchi-Berthouze, N., 2023. Touch
technology in affective human–, robot–, and virtual–human interactions: A survey.
Proceedings of the IEEE.
118. Søraa, R.A., Tøndel, G., Kharas, M.W. and Serrano, J.A., 2023. What do older adults
want from social robots? A qualitative research approach to human-robot interaction
(HRI) studies. International Journal of Social Robotics, 15(3), pp.411-424.
119. Mehak, S., Kelleher, J.D., Guilfoyle, M. and Leva, M.C., 2024. Action Recognition
for Human–Robot Teaming: Exploring Mutual Performance Monitoring Possibilities.
Machines, 12(1), p.45.
120. Su, H., Qi, W., Chen, J., Yang, C., Sandoval, J. and Laribi, M.A., 2023. Recent
advancements in multimodal human–robot interaction. Frontiers in Neurorobotics,
17, p.1084000.
121. Gongor, F. and Tutsoy, O., 2024. On the Remarkable Advancement of Assis-
tive Robotics in Human-Robot Interaction-Based Health-Care Applications: An Ex-
ploratory Overview of the Literature. International Journal of Human–Computer In-
teraction, pp.1-41.
122. Qi, J., Ma, L., Cui, Z. and Yu, Y., 2024. Computer vision-based hand gesture recog-
nition for human-robot interaction: a review. Complex & Intelligent Systems, 10(1),
pp.1581-1606.
123. Wang, D., Zhang, B., Zhou, J., Xiong, Y., Liu, L. and Tan, Q., 2024. Three-
dimensional mapping and immersive human–robot interfacing utilize Kinect-style
depth cameras and virtual reality for agricultural mobile robots. Journal of Field
Robotics.
124. Jamshad, R., Haripriyan, A., Sonti, A., Simkins, S. and Riek, L.D., 2024, March.
Taking initiative in human-robot action teams: How proactive robot behaviors af-
fect teamwork. In Companion of the 2024 ACM/IEEE International Conference on
Human-Robot Interaction (pp. 559-562).
125. Dihan, M.S., Akash, A.I., Tasneem, Z., Das, P., Das, S.K., Islam, M.R., Islam, M.M.,
Badal, F.R., Ali, M.F., Ahmed, M.H. and Abhi, S.H., 2024. Digital Twin: Data
Exploration, Architecture, Implementation and Future. Heliyon.
126. Khan, T.H., Noh, C. and Han, S., 2023. Correspondence measure: a review for the
digital twin standardization. The International Journal of Advanced Manufacturing
Technology, 128(5-6), pp.1907-1927.
127. Sharma, R. and Gupta, H., 2024. Leveraging cognitive digital twins in industry 5.0 for
achieving sustainable development goal 9: An exploration of inclusive and sustainable
industrialization strategies. Journal of Cleaner Production, p.141364.
128. Arisekola, K. and Madson, K., 2023. Digital twins for asset management: Social
network analysis-based review. Automation in Construction, 150, p.104833.
129. Jørgensen, C.S., Shukla, A. and Katt, B., 2023, September. Digital Twins in Health-
care: Security, Privacy, Trust and Safety Challenges. In European Symposium on
Research in Computer Security (pp. 140-153). Cham: Springer Nature Switzerland.
130. Masi, M., Sellitto, G.P., Aranha, H. and Pavleska, T., 2023. Securing critical infras-
tructures with a cybersecurity digital twin. Software and Systems Modeling, 22(2),
pp.689-707.
131. Soori, M., Arezoo, B. and Dastres, R., 2023. Virtual manufacturing in industry 4.0:
A review. Data Science and Management.
132. Martinetti, A., Nizamis, K., Chemweno, P., Goulas, C., van Dongen, L.A., Gibson, I.,
Thiede, S., Lutters, E., Vaneker, T. and Bonnema, G.M., 2024. More than 10 years of
industry 4.0 in the Netherlands: an opinion on promises, achievements, and emerging
challenges. International Journal of Sustainable Engineering, 17(1), pp.1-12.
133. Cole, A., 2023. The Ethics of the Personal Digital Twin. In Human Data Interaction,
Disadvantage and Skills in the Community: Enabling Cross-Sector Environments for
Postdigital Inclusion (pp. 79-92). Cham: Springer International Publishing.
134. Zhao, J., Feng, X., Tran, M.K., Fowler, M., Ouyang, M. and Burke, A.F., 2024. Bat-
tery safety: Fault diagnosis from laboratory to real world. Journal of Power Sources,
598, p.234111.
135. Mallioris, P., Aivazidou, E. and Bechtsis, D., 2024. Predictive maintenance in Industry
4.0: A systematic multi-sector mapping. CIRP Journal of Manufacturing Science and
Technology, 50, pp.80-103.
136. Teicher, U., Ben Achour, A., Selbmann, E., Demir, O.E., Arabsolgar, D., Cassina, J.,
Ihlenfeldt, S. and Colledani, M., 2023, June. The RaRe2 Attempt as a Holistic Plat-
form for Decision Support in Rapidly Changing Process Chains. In Proceedings of the
Changeable, Agile, Reconfigurable and Virtual Production Conference and the World
Mass Customization & Personalization Conference (pp. 347-356). Cham: Springer
International Publishing.
137. Tao, Y., You, S., Zhu, J. and You, F., 2024. Energy, climate, and environmental sus-
tainability of trend toward occupational-dependent hybrid work: Overview, research
challenges, and outlook. Journal of Cleaner Production, p.141083.
138. Belcher, E.J. and Abraham, Y.S., 2023. Lifecycle Applications of Building Information
Modeling for Transportation Infrastructure Projects. Buildings, 13(9), p.2300.
139. Bouabid, D.A., Hadef, H. and Innal, F., 2024. Maintenance as a sustainability tool in high-risk process industries: A review and future directions. Journal of Loss Prevention in the Process Industries, p.105318.
140. Kim, H., So, K.K.F., Shin, S. and Li, J., 2024. Artificial intelligence in hospitality and
tourism: Insights from industry practices, research literature, and expert opinions.
Journal of Hospitality & Tourism Research, p.10963480241229235.
141. Kiourtis, A., Mavrogiorgou, A. and Kyriazis, D., 2023, December. A Cross-Sector
Data Space for Correlating Environmental Risks with Human Health. In European,
Mediterranean, and Middle Eastern Conference on Information Systems (pp. 234-247).
Cham: Springer Nature Switzerland.
142. Harrer, S., Menard, J., Rivers, M., Green, D.V., Karpiak, J., Jeliazkov, J.R., Shapo-
valov, M.V., del Alamo, D. and Sternke, M.C., 2024. Artificial intelligence drives the
digital transformation of pharma. In Artificial Intelligence in Clinical Practice (pp.
345-372). Academic Press.
143. Ma, J., Yang, L., Wang, D., Li, Y., Xie, Z., Lv, H. and Woo, D., 2024. Digitalization
in response to carbon neutrality: Mechanisms, effects and prospects. Renewable and
Sustainable Energy Reviews, 191, p.114138.
144. Mouta, A., Pinto-Llorente, A.M. and Torrecilla-Sánchez, E.M., 2023. Uncovering blind
spots in education ethics: Insights from a systematic literature review on artificial
intelligence in education. International Journal of Artificial Intelligence in Education,
pp.1-40.
List of Tables
1 Review Methodology and Findings: Human Detection Technology in Industrial Manufacturing for Human-Robot Collaboration
2 Principles of Human-Robot Collaboration: Description, Challenges, and Mitigation Strategies
3 Comprehensive Overview of Robot Safety: Aspects, Hazards, and Precautionary Measures
4 Advancing Safety and Interaction in Collaborative Workspaces: Strategies for Control and Human-Robot Communication
5 Exploring Robotics: Studies in Robot Models, Sensor Technologies, and Human-Robot Interaction
6 Insights into Digital Twins: Sector-Specific Applications, Challenges, and Simulation Parameters
Table 1: Review Methodology and Findings: Human Detection Technology in Industrial
Manufacturing for Human-Robot Collaboration
S.No | Title | Methodology | Search Criteria | Results
1 | Human in the Loop: A Review of Human Detection Technology in Industrial Manufacturing Environments for Human Robot Collaboration | Rigorous search methodology | January 2009 - September 2020; Google Scholar, Scopus, Web of Science | 9 relevant publications identified
2 | Review Objective | Thorough search on prominent web-based databases | Keywords: "Human Detection Technology," "Industrial Manufacturing," "Human Robot Collaboration," "Human-Robot Interaction in Manufacturing" | Excluded irrelevant literature, resulting in 9 pertinent publications
3 | Focus | Meticulous selection of search terms | Excluded: HRI in industrial processes, economic evaluations or feasibility studies of disassembly lines, process optimization of disassembly line designs, design of tools for HRC applications | Identified literature contributing to the understanding of human detection technology in industrial manufacturing for HRC
4 | Conclusion | Comprehensive overview and analysis | Emphasized relevance, excluded irrelevant papers | Aimed to provide insights into autonomous robotic disassembly systems and solutions for industrial disassembly tasks
Table 2: Principles of Human-Robot Collaboration: Description, Challenges, and Mitigation
Strategies
S.No | Principle | Description | Example | Challenges | Mitigation Strategies
1 | Mutual Understanding and Communication | Humans and robots should comprehend each other's intentions and communicate effectively. | Using natural language processing for dialogue. | Language barriers, ambiguity in communication. | Implementing multi-modal communication systems.
2 | Safety and Trust | Safety protocols must be in place, and trust between human and robot must be established. | Emergency stop buttons and safety sensors. | Accidental collisions, mistrust in automation. | Rigorous testing, clear communication of safety features.
3 | Task Allocation and Coordination | Tasks should be allocated based on capabilities, and coordination should be seamless. | Robots handling repetitive tasks, humans overseeing complex decisions. | Difficulty in task handover, conflicting priorities. | Clearly defined roles, regular coordination meetings.
4 | Adaptability and Flexibility | Systems should adapt to changing conditions, and flexibility should be inherent in roles. | Robots adjusting to new environments or tasks. | Resistance to change, technical limitations. | Modular system design, frequent updates and maintenance.
5 | Transparency and Explainability | Operations should be transparent, and robots' actions should be explainable to humans. | Providing logs of robot actions for review. | Black box algorithms, lack of interpretability. | Using interpretable models, providing transparency reports.
6 | Shared Goals and Objectives | Humans and robots should work towards common objectives, aligning their efforts. | Collaboratively assembling products on an assembly line. | Misalignment of goals, differing priorities. | Regular goal-setting meetings, shared performance metrics.
7 | Ethical Considerations and Human Empowerment | Ethical dilemmas should be addressed, and human empowerment should be prioritized. | Ensuring robots do not replace human jobs but enhance productivity. | Job displacement fears, ethical decision-making. | Ethical frameworks, involving stakeholders in decision-making.
8 | Continuous Learning and Improvement | Systems should learn from experience and strive for continuous improvement. | Using machine learning to optimize task performance. | Data biases, overfitting. | Diverse datasets, regular model re-evaluation.
Table 3: Comprehensive Overview of Robot Safety: Aspects, Hazards, and Precautionary
Measures
S.No | Safety Aspect | Description | Potential Hazards | Safety Measures
1 | Risk Assessment | Conducting thorough risk assessments to identify potential hazards and assess their severity. | Collision with humans or objects, entrapment, electrical hazards. | Utilize risk assessment tools such as ISO 12100, establish safety zones, implement safety-rated sensors.
2 | Emergency Stop Systems | Implementing mechanisms for rapid shutdown in emergency situations to prevent harm. | Delayed response, improper functioning of emergency stop buttons. | Regular testing and maintenance, redundant emergency stop buttons, clear signage.
3 | Protective Barriers | Installing physical barriers to prevent access to hazardous areas during robot operation. | Inadequate barrier design, obstruction of visibility. | Regular inspection, adherence to safety standards such as ISO 10218, warning signs.
4 | Safety Interlocks | Incorporating interlock systems to ensure that certain conditions are met before operation. | Failure of interlock system, bypassing safety mechanisms. | Regular maintenance, testing of interlock functionality, training on proper usage.
5 | Human-Robot Collaboration | Designing robots capable of safely interacting with humans, including collision avoidance features. | Collision with humans, misinterpretation of human gestures. | Use of collaborative robots (cobots), implementation of safety-rated sensors, employee training.
6 | Training and Education | Providing comprehensive training to operators on safe robot operation and emergency procedures. | Lack of training, operator error. | Regular training sessions, simulation of emergency scenarios, clear operating manuals.
7 | Safety Standards Compliance | Ensuring that robots adhere to established safety standards and regulations. | Non-compliance with safety standards, outdated safety features. | Regular audits, certification by recognized safety organizations, staying informed about updates to safety standards.
8 | Preventive Maintenance | Conducting routine maintenance to identify and address potential safety hazards before they occur. | Equipment malfunction due to neglect, failure to identify wear and tear. | Establishing maintenance schedules, regular inspection of equipment, recording maintenance activities.
9 | Hazard Communication | Clearly communicating hazards associated with robot operation through signage and labels. | Inadequate signage, language barriers. | Multilingual signage, use of universal symbols, regular safety briefings.
10 | Continuous Improvement | Implementing processes for ongoing evaluation and improvement of robot safety measures. | Failure to address emerging hazards, complacency. | Establishing safety committees, conducting regular safety audits, encouraging feedback from employees.
Table 4: Advancing Safety and Interaction in Collaborative Workspaces: Strategies for Control
and Human-Robot Communication
S.No | Aspect | Description | Challenges | Control Strategies and Communication Techniques
1 | Risk Assessment | Conducting thorough risk assessments to identify potential hazards and prioritize safety measures. | Complexity of collaborative environments, dynamic risk factors. | Utilizing real-time risk assessment algorithms, integrating safety feedback mechanisms.
2 | Human-Robot Collaboration | Designing collaborative robots capable of safely interacting with humans while maintaining productivity. | Ensuring seamless interaction without compromising safety. | Implementing force and torque sensors, designing intuitive human-robot interfaces.
3 | Safety Interlocks and Emergency Stop | Incorporating interlock systems and emergency stop mechanisms to ensure rapid shutdown in hazardous situations. | Delayed response time, balancing safety with operational efficiency. | Introducing redundant safety systems, integrating predictive emergency stop algorithms.
4 | Real-Time Monitoring and Feedback | Employing sensors and monitoring systems to provide real-time feedback on environmental changes and hazards. | Data overload, ensuring accuracy and reliability of sensor data. | Utilizing AI-driven anomaly detection, visual and auditory feedback mechanisms.
5 | Training and Education | Providing comprehensive training programs to both humans and robots for safe and effective collaboration. | Knowledge transfer, ensuring training relevance in dynamic environments. | Simulated training scenarios, interactive learning platforms, continuous skills development.
Table 5: Exploring Robotics: Studies in Robot Models, Sensor Technologies, and Human-Robot
Interaction
S.No | Authors | Robot Model | Sensor Technology | AI/Algorithm Used | Challenges | Brief Description
1 | Semeraro et al. 2023 [112] | Not specified | Microsoft Kinect | Proximity Query Package | Environmental noise affecting sensor accuracy. | Developed controls for Cyber-Physical Systems in collaborative environments.
2 | Robinson et al. 2023 [113] | UR-5/UR3 | Microsoft Kinect | Robotiq | Data interoperability and integration challenges. | Proposed framework for digital twin utilization in assembly workstations.
3 | Li et al. 2023 [114] | UR-5 | Microsoft Kinect/photosensors | Amazon Sumerian | Real-time synchronization of virtual and physical assembly. | Introduced virtual assembly workstations with real-time feedback.
4 | Zhang et al. 2023 [115] | ABB IRB 2600 | Infrared sensors | Savitzky-Golay filtering method | Noise filtering in dynamic industrial environments. | Improved human-robot interaction accuracy with sensor data filtering.
5 | Lee et al. 2023 [116] | SCARA Adept 604-S / KUKA KR30-3 / Nachi MZ07 | Not specified | Not specified | Interoperability between diverse robot platforms. | Addressing compatibility issues in multi-robot environments.
6 | Olugbade et al. 2023 [117] | COMAU AURA | Kinect Azure | Convolutional Neural Networks (Actions Perception Module) | Real-time data processing for seamless VR experience. | Implementing VR technologies for enhanced task perception.
7 | Søraa et al. 2023 [118] | Not specified | LIDAR | Collision detection | Real-time collision detection in dynamic environments. | Developing digital twins for safety-critical applications.
8 | Mehak et al. 2024 [119] | Not specified | OptiTrack | Motion estimation | Real-time motion tracking in complex work environments. | Analyzing ergonomic factors and motion in collaborative tasks.
9 | Su et al. 2023 [120] | KUKA IIWA | Microsoft HoloLens 2 | Deep convolutional neural network | Real-time AR visualization in dynamic workspaces. | Integrating AR technologies for improved collaboration.
10 | Gongor and Tutsoy (2024) [121] | Not specified | Microsoft Kinect/LIDAR/light barrier | Not specified | Not specified | Not specified
11 | Qi et al. 2024 [122] | UR-10 | Microsoft HoloLens 2 | Not specified | Not specified | Not specified
12 | Wang et al. 2024 [123] | Not specified | Head-mounted display | Not specified | Not specified | Not specified
Table 6: Insights into Digital Twins: Sector-Specific Applications, Challenges, and Simulation
Parameters
S.No | Study | Industry Sector | Operation Type | Simulation Insights | Challenges
1 | Sharma and Gupta (2024) [127] | Manufacturing | Gesture Recognition | Controlled simulation environment for modeling human gestures. | Data interoperability, real-time processing
2 | Arisekola and Madson (2023) [128] | Industrial | Assembly | Controlled simulation environment for what-if analysis in assembly processes. | Model accuracy, integration with existing systems
3 | Jørgensen et al. [129] | Assembly Line | Layout Design | Controlled simulation environment for workstation layout design in assembly lines. | Real-time monitoring, predictive maintenance
4 | Masi et al. [130] | Manufacturing | Safety Assessment | Controlled simulation environment for evaluating workplace safety in manufacturing settings. | Data integration, IoT integration
5 | Soori et al. [131] | Automation | Robot Programming | Controlled simulation environment for programming robots in automated systems. | Model accuracy, real-time processing
6 | Martinetti et al. [132] | Production | Ergonomic Assessment | Realistic simulation environment for assessing ergonomic factors in production settings. | Data interoperability, real-time monitoring
7 | Cole (2023) [133] | Manufacturing | Safety Assessment | Controlled simulation environment for analyzing workplace safety in manufacturing environments. | Model accuracy, integration with existing systems
8 | Zhao et al. [134] | Industrial | Ergonomic Assessment | Controlled simulation environment for evaluating ergonomic aspects of industrial workstations. | Real-time monitoring, data integration
9 | Mallioris et al. [135] | Industrial | Communication Enhancement | Controlled simulation environment for enhancing communication between human operators and robots. | Real-time processing, integration with existing systems
10 | Teicher et al. [136] | Production | Safety Assessment | Controlled simulation environment for assessing workplace safety in production facilities. | Data interoperability, model accuracy
11 | Tao et al. [137] | Manufacturing | Task Planning | Controlled simulation environment for optimizing workflow and task planning in manufacturing. | Real-time monitoring, data integration
12 | Belcher and Abraham (2023) [138] | Automation | Safety Assessment | Controlled simulation environment for evaluating workplace safety in automated systems. | Data integration, IoT integration
13 | Bouabid et al. [139] | Food Processing | Task Allocation | Controlled simulation environment for optimizing task allocation in food processing facilities. | Model accuracy, real-time processing
14 | Kim et al. [140] | Industrial | Path Movement | Controlled simulation environment for planning robotic path movements in industrial settings. | Data interoperability, real-time monitoring
15 | Kiourtis et al. [141] | Industrial | Cognitive Ergonomic Assessment | Controlled simulation environment for assessing cognitive ergonomic factors in industrial workspaces. | Integration with existing systems, real-time processing
16 | Harrer et al. [142] | Manufacturing | Safety Assessment | Controlled simulation environment for analyzing workplace safety in manufacturing environments. | Data integration, model accuracy
17 | Ma et al. [143] | Automation | Task Allocation | Controlled simulation environment for optimizing task allocation in automated systems. | Real-time monitoring, data interoperability
18 | Mouta et al. [144] | Automotive | Safety Assessment | Controlled simulation environment for evaluating vehicle safety in automotive manufacturing. | Real-time processing, integration with existing systems
19 | Jamshad et al. [124] | Industrial | Safety Assessment | Controlled simulation environment for assessing workplace safety in industrial settings. | Data integration, model accuracy