Unmanned Systems Engineering
Recent papers in Unmanned Systems Engineering
Several technical visits to the Egbin Thermal Station were carried out and energy records for a fifteen-month period were collected. The renewable energy potential of the site with respect to wind and solar energy, as well as an energy audit and optimization of the plant's cardinal sections, were assessed, and a techno-economic analysis of the wind-solar PV option was made. A Wind-Solar-Grid hybrid energy system, based on a pilot design of wind turbines and solar PV for the plant, was simulated using the HOMER renewable energy software. Integrating the wind-solar hybrid energy system with the thermal plant significantly increased the overall power generation delivered to the grid, with strong economic prospects and a capital recovery period of less than four years, which shortens further, along with the cost of electricity, as generation capacity increases. The HOMER simulation showed close agreement with the manual calculations.
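The capital-recovery and cost-of-electricity claims above rest on standard payback and levelized-cost arithmetic. The sketch below illustrates that arithmetic only; the capital cost, energy yield, tariff and O&M figures are hypothetical placeholders, not values from the Egbin study or its HOMER model.

```python
# Minimal techno-economic sketch for a wind-solar PV addition.
# All numeric inputs are hypothetical placeholders, not figures from the Egbin study.

def simple_payback_years(capital_cost, annual_energy_kwh, tariff_per_kwh, annual_om_cost):
    """Years needed to recover the capital outlay from net annual revenue."""
    net_annual_revenue = annual_energy_kwh * tariff_per_kwh - annual_om_cost
    return capital_cost / net_annual_revenue

def levelized_cost_of_energy(capital_cost, annual_om_cost, annual_energy_kwh,
                             discount_rate=0.10, lifetime_years=20):
    """LCOE: discounted lifetime cost divided by discounted lifetime energy."""
    cost, energy = capital_cost, 0.0
    for year in range(1, lifetime_years + 1):
        factor = (1.0 + discount_rate) ** year
        cost += annual_om_cost / factor
        energy += annual_energy_kwh / factor
    return cost / energy

if __name__ == "__main__":
    # Hypothetical 1 MW hybrid addition at roughly 30% capacity factor.
    capital = 1_500_000.0                # USD
    energy = 1_000.0 * 0.30 * 8760.0     # kWh per year
    print(f"Payback: {simple_payback_years(capital, energy, 0.15, 20_000.0):.1f} years")
    print(f"LCOE:    {levelized_cost_of_energy(capital, 20_000.0, energy):.3f} USD/kWh")
```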
Modern Remotely Piloted Aircraft Systems (RPAS) employ a variety of sensors and multi-sensor data fusion techniques to provide advanced functionalities and trusted autonomy in a wide range of mission-essential and safety-critical tasks. In particular, Navigation and Guidance Systems (NGS) for small RPAS require a typical combination of lightweight, compact and inexpensive sensors to satisfy the Required Navigation Performance (RNP) in all flight phases. In this paper, the synergies attainable by the combination of Global Navigation Satellite System (GNSS), Micro-Electromechanical System based Inertial Measurement Unit (MEMS-IMU) and Vision-Based Navigation (VBN) sensors are explored. In the case of VBN, an appearance-based navigation technique is adopted and feature extraction/optical flow methods are employed to estimate the navigation parameters during precision approach and landing phases. A key novelty of the proposed approach is the employment of Aircraft Dynamics Model (ADM) augmentation to compensate for the shortcomings of VBN and MEMS-IMU sensors in high-dynamics attitude determination tasks. To obtain the best estimates of Position, Velocity and Attitude (PVA), different sensor combinations are analysed and dynamic Boolean Decision Logics (BDL) are implemented for data selection before the centralised data fusion is accomplished. Various alternatives for data fusion are investigated, including a traditional Extended Kalman Filter (EKF) and a more advanced Unscented Kalman Filter (UKF). A novel hybrid controller employing fuzzy logic and Proportional-Integral-Derivative (PID) techniques is implemented to provide effective stabilization and control of pitch and roll angles. After introducing the key mathematical models describing the three NGS architectures, namely the EKF-based VBN-IMU-GNSS (VIG) and VBN-IMU-GNSS-ADM (VIGA) architectures and the UKF-based Enhanced VIGA (EVIGA) architecture, the system performances are compared in a small RPAS integration scheme (i.e., the AEROSONDE RPAS platform) exploring a representative cross-section of the aircraft operational flight envelope. A dedicated ADM processor (i.e., a local pre-filter) is adopted in the EVIGA architecture to account for the RPAS manoeuvring envelope in different flight phases (assisted by a manoeuvre identification algorithm), in order to extend the ADM validity time across all segments of the RPAS trajectory. Simulation results show that the VIG, VIGA and EVIGA systems are compliant with ICAO requirements for precision approach down to CAT-II. In all other flight phases, the VIGA system shows improvement in PVA data output with respect to the VIG system. The EVIGA system shows the best performance in terms of attitude data accuracy, and a significant extension of the ADM validity time is achieved in this configuration.
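Of the techniques listed above, the PID portion of the pitch/roll controller is the simplest to illustrate. The sketch below is a plain discrete PID attitude-hold loop acting on a toy first-order pitch response; the gains, limits and dynamics are invented for illustration, and the fuzzy supervision layer and the EKF/UKF fusion described in the paper are not represented.

```python
import numpy as np

class PID:
    """Discrete PID controller of the textbook form, used here for attitude hold.
    Gains and limits are illustrative values, not ones from the paper."""

    def __init__(self, kp, ki, kd, dt, output_limit=np.deg2rad(25.0)):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.limit = output_limit
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        command = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Saturate the command to a plausible actuator deflection range.
        return float(np.clip(command, -self.limit, self.limit))

# Example: hold a 5-degree pitch attitude on a hypothetical first-order plant.
pitch_pid = PID(kp=2.0, ki=0.3, kd=0.5, dt=0.02)
pitch = 0.0
for _ in range(500):                               # 10 s at 50 Hz
    cmd = pitch_pid.update(np.deg2rad(5.0), pitch)
    pitch += 0.02 * (cmd - 0.5 * pitch)            # toy pitch dynamics, illustration only
print(f"Pitch after 10 s: {np.degrees(pitch):.2f} deg")
```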
The Autonomy Levels for Unmanned Systems (ALFUS) Ad Hoc Workgroup started as a National Institute of Standards and Technology (NIST) sponsored effort with participation from government laboratories, developers, users, and contractors of various Unmanned Systems (UMS) programs. The participants formed close collaborative relationships, including with the U.S. Army Future Combat System (FCS) user community and the Lead System Integrator (LSI) and its contractors. The Workgroup later migrated into SAE International as a standards subcommittee, AS4D, under the Aerospace Avionic Systems Group, AS-4 Unmanned Systems Steering Committee.
ALFUS aims to formulate, through a consensus-based approach, a logical framework for characterizing UMS autonomy, covering levels of autonomy, mission complexity, and environmental complexity. The Framework is to provide standard definitions, metrics, and processes for the specification, evaluation, and development of the autonomous capabilities of UMSs. The Framework is also intended to facilitate communication among practitioners.
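As an illustration of this multi-axis characterization, a minimal data-structure sketch follows. The mission-complexity and environmental-complexity axes are named in the abstract; the human-independence axis, the 0-10 scales and the simple averaging rule are assumptions made for illustration, not the metrics standardized by the workgroup.

```python
from dataclasses import dataclass

@dataclass
class AutonomyAssessment:
    """Multi-axis characterization in the spirit of the ALFUS framework.
    The 0-10 scales and the averaging rule below are illustrative placeholders,
    not the standardized ALFUS metrics."""
    mission_complexity: float        # 0 (simple) .. 10 (highly complex)
    environmental_complexity: float  # 0 (benign) .. 10 (extreme)
    human_independence: float        # 0 (teleoperated) .. 10 (fully autonomous)

    def contextual_autonomy_level(self) -> float:
        """A single summary level; here simply the mean of the three axes."""
        return (self.mission_complexity
                + self.environmental_complexity
                + self.human_independence) / 3.0

# Example: a UAS flying waypoint surveillance in moderate weather under
# supervisory (not continuous) human control.
survey_uas = AutonomyAssessment(mission_complexity=4.0,
                                environmental_complexity=5.0,
                                human_independence=7.0)
print(f"Contextual autonomy level: {survey_uas.contextual_autonomy_level():.1f}")
```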
UAVs are part of the common lexicon and will very soon be part of daily life. Until the recent past, they flew in segregated airspace under military control and for military tasks. As they proliferate in the civilian domain, this essay looks at the immense challenges of integrating them into national airspace for unhindered everyday operations, like commercial aviation. As UCAVs mature in the coming years, integrating them will be an even bigger issue.
The introduction of dedicated software functions for separation assurance and collision avoidance in Next Generation Flight Management Systems (NG-FMS) has the potential to enable significant advances in the Unmanned Aircraft System (UAS) Traffic Management (UTM) operational context. In this paper, key elements of the NG-FMS architecture are presented that allow planning and optimisation of 4-dimensional trajectories. The NG-FMS is designed to be fully interoperable with a future ground-based 4DT Planning, Negotiation and Validation (4-PNV) system, enabling automated Trajectory/Intent-Based Operations (TBO/IBO). This paper addresses one of the key technological challenges for integrating UAS in non-segregated airspace by implementing suitable hardware and software (data fusion) techniques for cooperative and non-cooperative separation assurance and collision avoidance tasks. The sensor/system providing the most reliable separation maintenance and collision avoidance solution is automatically selected, and this approach provides robustness in all flight phases, supporting all-weather and trusted autonomous operations. The mathematical algorithms employed in the unified approach to cooperative and non-cooperative separation assurance and collision avoidance scenarios are presented. In this method, navigation and tracking errors affecting the host platform and intruder sensor measurements are translated to unified range and bearing uncertainty descriptors. Simulation case studies are presented, including UTM elements such as dynamic geo-fencing, and the results corroborate the validity of the separation assurance and collision avoidance algorithms for the considered mission- and safety-critical tasks.
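A common building block for the kind of non-cooperative separation-assurance logic described above is a closest-point-of-approach test against a protected range. The sketch below shows only that constant-velocity geometry; the protected range and look-ahead thresholds are placeholders, and the unified range/bearing uncertainty treatment of the paper is not reproduced.

```python
import numpy as np

def closest_point_of_approach(p_own, v_own, p_intruder, v_intruder):
    """Time and distance of closest approach for two constant-velocity tracks.
    Positions in metres, velocities in m/s, 2D horizontal plane."""
    dp = np.asarray(p_intruder, float) - np.asarray(p_own, float)
    dv = np.asarray(v_intruder, float) - np.asarray(v_own, float)
    dv2 = float(np.dot(dv, dv))
    t_cpa = 0.0 if dv2 < 1e-9 else max(0.0, -float(np.dot(dp, dv)) / dv2)
    d_cpa = float(np.linalg.norm(dp + dv * t_cpa))
    return t_cpa, d_cpa

def conflict_alert(p_own, v_own, p_intruder, v_intruder,
                   protected_range_m=500.0, lookahead_s=120.0):
    """Flag a predicted separation loss within the look-ahead horizon.
    The 500 m / 120 s thresholds are placeholders, not certified minima."""
    t_cpa, d_cpa = closest_point_of_approach(p_own, v_own, p_intruder, v_intruder)
    return (t_cpa <= lookahead_s) and (d_cpa <= protected_range_m)

# Example: an intruder 4 km ahead, converging at ~50 m/s closing speed.
print(conflict_alert(p_own=[0, 0], v_own=[30, 0],
                     p_intruder=[4000, 300], v_intruder=[-20, 0]))  # -> True
```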
Unmanned Aircraft System (UAS) navigation in urban environments using Global Navigation Satellite System (GNSS) as a primary sensor is limited in terms of accuracy and integrity due to the presence of antenna masking and signal multipath effects. In this paper, a GNSS Aircraft-Based Integrity Augmentation (ABIA) system is presented. This system relies on detailed modeling of signal propagation and multipath effects to produce predictive and reactive alerts (cautions and warnings) in urban environments. The model predictive capability is then used to augment path-planning functionalities in the UAS Traffic Management (UTM) context. The models of the presented system are corroborated by performing simulation case studies in typical urban canyons, wherein positioning integrity is degraded by multipath and masking.
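At its simplest, the predictive part of such an integrity-augmentation scheme checks how many satellites remain usable once masking is accounted for and raises a caution when too few survive. The sketch below uses a single uniform "urban canyon" mask angle and a placeholder minimum-satellite threshold, which is a drastic simplification of the detailed multipath and masking modelling in the paper.

```python
import numpy as np

def visible_satellites(sat_elevations_deg, mask_elevation_deg=35.0,
                       receiver_cutoff_deg=5.0):
    """Count satellites whose elevation clears both the receiver cutoff and a
    uniform 'urban canyon' mask angle. A real ABIA model would use an
    azimuth-dependent skyline plus multipath terms; this is a toy stand-in."""
    elevations = np.asarray(sat_elevations_deg, dtype=float)
    cutoff = max(mask_elevation_deg, receiver_cutoff_deg)
    return int(np.sum(elevations >= cutoff))

def integrity_flag(sat_elevations_deg, min_satellites=4):
    """Return 'caution' when the predicted number of usable satellites drops
    below the minimum needed for a position fix; thresholds are placeholders."""
    n = visible_satellites(sat_elevations_deg)
    return "caution" if n < min_satellites else "nominal"

# Example: eight tracked satellites, only three clear a 35-degree canyon mask.
elevations = [12, 18, 25, 33, 34, 52, 64, 78]
print(visible_satellites(elevations))   # -> 3
print(integrity_flag(elevations))       # -> caution
```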
We would like to invite you to join this exciting new project as a chapter contributor on one of the topics listed below. Since this is a textbook, a great deal of each chapter entails a survey of the topic under the paradigm of cyber-physical systems: what can be done onboard and remotely, the distributed nature of the system, and some exercises in futurology (anticipating trends can shed some light on upcoming designs). The IET will bring great visibility to your work. Each chapter should be around 20-25 pages and can be submitted as a Word or LaTeX file. The IET will send you additional information (formatting, permission form, etc.) with the contributor's agreement once you have decided to contribute to the book. Visit http://www.theiet.org/resources/author-hub/books/index.cfm for all contributor information for an IET research-level book. Each book is expected to have a total of 500 printed pages (with approximately 550 words per page and a 20% allowance for figures and tables). We have included a tentative schedule and list of topics below. If this is something you would consider, please send us the title of your chapter, a short description/abstract of the chapter content, and your full contact details. We expect original content and new insights for this book. You can, of course, reuse published material, but the proportion of reused material in a chapter should be less than 40%. The IET will run plagiarism-detection software on the full manuscript to verify that you are including original material and will reject chapters that contain a large amount of already-published material, so please take this into consideration. We would appreciate your feedback by December 31, 2017. Please do not hesitate to contact us if you have any queries. We look forward to working with you towards a successful publication.
IET indexes its books and journals in SCOPUS and IEEE Xplore.
Computer Vision (CV) and sensors play a decisive role in the operation of Unmanned Aerial Vehicles (UAVs), but there is a void when it comes to analysing the extent of their impact on the entire UAV system. In general, the fact that a UAV is a Cyber-Physical System (CPS) is not taken into account. This proposal expands on earlier books covering the use of CV and sensing in UAVs. Among other things, an entirely autonomous UAV can (i) obtain information about the environment, (ii) work for an extended period of time without human interference, (iii) move all or part of itself throughout its operating location without human help, and (iv) stay away from situations that are dangerous to people and their possessions. A Vision System (VS) entails the way CV data will be utilized, the appropriate architecture for full avionics integration, the control interfaces, and the UAV operation. Since the core of a VS is its sensors and cameras, multi-sensor fusion, navigation, hazard detection, and ground correlation in real time are important operational aspects that can benefit from CV knowledge and technology. This book aims to collect and shed light on the existing information on CV software and hardware for UAVs, as well as pinpoint aspects that need additional thinking. It will list standards and a set of prerequisites (or the lack thereof) for CV deployment in UAVs. Data fusion takes centre stage as the book explores ways to deal with sensor data and images as well as their integration and display. Best practices for fusing image and sensor information to enhance UAV performance by means of CV can greatly improve all aspects of the corresponding CPS. The CPS viewpoint can improve the way UAVs interact with the Internet of Things (IoT), use cloud computing, meet communications requirements, implement the hardware/software paradigms necessary to handle video streaming, incorporate satellite data, and combine CV with Virtual/Augmented Reality.
VOLUME 2 - DEPLOYMENT AND APPLICATIONS: This volume introduces procedures, standards, and prerequisites for the deployment of Computer Vision (CV) in UAVs from the application point of view. It discusses existing and desirable open-source software tools, image banks, benchmarks, Quality of Experience (QoE), Quality of Service (QoS), and how CV can benefit from a Robot Operating System (ROS) in surveillance, remote sensing, inspection, and maintenance and repair, among other uses, while offering an assessment of current bottlenecks and trends. It will pave the road towards better studies of the necessity and viability of implementing collaborative environments for visualization, knowledge management, and teleoperation of UAVs. This is planned to be the companion volume of Estrela, Hemanth, Saotome (Eds.), Imaging and Sensing for Unmanned Aerial Vehicles: Volume 1 - Control and Performance.
Editor(s):
Dr. Vania V. Estrela, https://www.linkedin.com/in/vania-v-estrela-96b9bb29/
Universidade Federal Fluminense (UFF), RJ, Brazil
vania.estrela.phd@ieee.org
Dr. Jude Hemanth, https://www.karunya.edu/ece/drjude.html
Karunya University, Coimbatore, India
jude_hemanth@rediffmail.com
Dr. Osamu Saotome, https://www.linkedin.com/in/osamu-saotome-83935818
Instituto Tecnológico de Aeronáutica, CTA-ITA-IEEA, São José dos Campos, SP, Brazil
osaotome@gmail.com
CONTENTS:
1. Image Acquisition and Restoration in UAVs
2. Image Fusion in UAVs
3. Super-Resolution Imaging in UAVs
4. 2D/3D/4D Imaging in UAVs
5. Multi-view Image and ToF Sensor Fusion in UAVs
6. Range Imaging in UAVs
7. Multispectral and Hyperspectral Imaging in UAVs
8. Imaging Standards and UAVs
9. Virtual/Augmented Reality in UAVs
10. Collaborative Environments in UAVs
11. Archiving, Storage, and Compression in UAVs
12. Analysis, Indexing, Retrieval in UAVs
13. Multicast/Broadcast/Streaming in UAVs
14. Modelling, Simulation and UAVs
15. Image-Oriented Estimation and Identification in UAVs
16. Open Source Software in UAVs
17. Image Banks and Benchmarks in UAVs
18. Quality of Experience (QoE) and Quality of Service (QoS) in UAVs
19. Robot Operating System (ROS) in UAVs
20. Cloud Computing in UAVs
Specification and Schedule
July 1st, 2017: Call for Chapter Abstracts
September 1, 2017: One-Page Chapter Abstract (up to 1000 words) Submission Deadline. Free style.
A proposal must address one of the topics from the list above (mention its number, for instance 1, and reference PBCE120B).
November 30, 2017: Last Day for Notification of Acceptance
Jan 30, 2018: Full Chapter Submissions
March 30, 2018: Review Chapter Submissions and send comments to authors
May 31, 2018: Receive revised Chapter Submissions
June 30, 2018: Notification of Final Acceptance
July 31, 2018: Gather all material, figure files and copyrights permission forms
Aug 30, 2018: Book editors to finalize introduction and conclusion chapters
Sept 15, 2018: Delivery of full manuscript to the IET
Scheduled publication: Feb/March 2019
Readership: Graduate students and Researchers in the fields of Electrical and Computer Engineering, Computer Science, Mechanical Engineering, Civil Engineering, Humanitarian Engineering, Control Systems, Geoscience and Remote Sensing, Instrumentation and Measurement, Intelligent Transportation Systems, Oceanic Engineering, Safety Engineering, Reliability, Robotics and Automation, Signal Processing, Technology and Engineering Management, Environmental Engineering, Public Health Management, Non-Invasive Testing/Monitoring and Vehicular Technology.
Additional Information: Dr. Vania V. Estrela, vania.estrela.phd@ieee.org
Dr. Jude Hemanth, jude_hemanth@rediffmail.com
Bommer SC and Fendley M. (2015). Assessing the effects of multimodal communications on mental workload during the supervision of multiple unmanned aerial vehicles. International Journal of Unmanned Systems Engineering. 3(1): 38-50. Human supervisory control systems support force multiplication, as they enable a single pilot to control multiple unmanned aerial vehicles (UAVs). As the ratio of UAVs per operator increases, there is a clear need to understand the workload demands placed on the pilot to complete mission requirements. One critical element of the pilot's task is to successfully manage multiple channels (modes) of communication. This research explores the impact of changes in mission communications on the mental workload (MWL) of pilots. MWL was evaluated using two subjective measures, the Crew Awareness Rating Scale (CARS) and the National Aeronautics and Space Administration Task Load Index (NASA-TLX), and two physiological measures, fixation duration and pupil dilation.
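For context on the NASA-TLX measure cited above, the overall workload score is conventionally computed as a weighted mean of six subscale ratings, with weights taken from 15 pairwise comparisons. The sketch below implements that standard weighting; the ratings and weights shown are illustrative and are not data from the Bommer and Fendley study.

```python
# NASA-TLX weighted workload score: six subscales rated 0-100, weighted by the
# number of times each dimension is chosen across 15 pairwise comparisons.
# The ratings and weights below are illustrative, not data from the study.

SUBSCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def tlx_weighted_score(ratings, weights):
    """Overall workload = sum(rating_i * weight_i) / 15."""
    if set(ratings) != set(SUBSCALES) or set(weights) != set(SUBSCALES):
        raise ValueError("ratings and weights must cover all six subscales")
    if sum(weights.values()) != 15:
        raise ValueError("pairwise-comparison weights must sum to 15")
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15.0

ratings = {"mental": 80, "physical": 20, "temporal": 70,
           "performance": 40, "effort": 65, "frustration": 55}
weights = {"mental": 5, "physical": 0, "temporal": 4,
           "performance": 2, "effort": 3, "frustration": 1}
print(f"Weighted NASA-TLX score: {tlx_weighted_score(ratings, weights):.1f}")  # -> 67.3
```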