
A Modular Software Architecture for UAVs

Taygun Kekec, Baris Can Ustundag, Mehmet Ali Guney, Alper Yildirim, Mustafa Unel
Faculty of Engineering and Natural Sciences
Sabanci University
Orhanli-Tuzla 34956, Istanbul, Turkey
Email: {ikekec,bcanustundag,maliguney,alperyildirim,munel}@sabanciuniv.edu

Abstract—There have been several attempts to create scalable and hardware-independent software architectures for Unmanned Aerial Vehicles (UAVs). In this work, we propose an onboard architecture for UAVs in which hardware abstraction, data storage and communication between modules are efficiently maintained. All processing and software development is done on the UAV, while the state and mission status of the UAV are monitored from a ground station. The architecture also allows rapid development of mission-specific third-party applications on the vehicle with the help of the core module.

I. INTRODUCTION

The availability of inexpensive, lightweight and compact sensors, low-cost and compact computational platforms, and the maturity of control system design capabilities have paved the way for the extensive use of Unmanned Aerial Vehicles (UAVs). The applications of UAVs are becoming more and more apparent. Missions performed by UAVs include surveillance [1], [2], reconnaissance [3], border security [4] and remote sensing of the environment [5]. Researchers try to optimize cost, scale and flight endurance to come up with efficient solutions [6].

A UAV is governed by a flight computer system. The system reads and analyzes data from a wide variety of sensors and produces a mission flight plan. For observation purposes, the UAV carries a payload for acquiring a visual overview of the flight environment. While some of the preliminary works on the topic consisted of gathering visual data and processing it off-line, real-time processing is essential and indispensable for missions like threat detection and object tracking.

In addition to the physical constraints of developing a UAV system [7], one needs a reliable, flexible and scalable software component on the flight computer. The software component is responsible for providing hardware abstraction, triggering security checks, handling unexpected conditions, and monitoring the data and mission progress of the system. Moreover, the system must create a software infrastructure for newly added tasks so that new applications can communicate with the onboard software. Finally, the system must provide an interface for communication between different sources. In Jones' work [8], the authors proposed a software architecture for the design and simulation of UAV-based setups. Their work mostly focused on developing a ground station module that can act as a simulator and command controller, providing hardware-in-the-loop capability, simulation of sensory inputs, routing of shared data and generation of command requests. Using a graphical configuration system, the user can create artificial events that trigger new actions at specified times, in order to test the behavior and responses of a multiple-UAV system.

In the Pixhawk project [9], an aerial middleware called MAVCONN is proposed. The capabilities of the architecture are compared with ROS (Robot Operating System) and LCM (Lightweight Communications and Marshalling). ROS is an open-source robotics operating system [10] developed by Willow Garage, and it is widely preferred due to its variety of software components. LCM [11] consists of libraries and software components for communication in soft real-time systems. MAVCONN acts as a bridge between the ground operator and low-level system components, and also provides hardware-level synchronization of visual and inertial data. It exploits the upper layer of the ROS architecture and uses LCM as the communication layer due to its real-time message transmission capabilities. One relevant observation noted in [9] is that many robotic systems still employ polling as part of their design, which adds delay to the system and consumes a lot of processing power due to context switches. An asynchronous design is faster than a polling-based design. As asynchronous designs require threads, an onboard middleware needs to adopt a multi-threaded implementation.

In Maza's work [12], the authors proposed an architecture for UAV cooperation. Their architecture is divided into two layers. The first layer, the Executive Layer, is responsible for generating high-level decisions and task planning. The second layer, the Onboard Deliberative Layer, is responsible for the execution of the tasks. They define a task as having multiple discrete states, and tasks can be nested into subtasks. The functionality of their system is demonstrated on a load transportation application.

Lopez [13] proposed a middleware system implementing common functionalities and communication channels. In their architecture, they propose a service container which acts as a plant for subscriber/requester data services. This container is the core of their architecture and handles name, network and resource management onboard. The container automatically handles message subscription, message failure conditions and message delivery. Their proposal is focused on network-centric, low-resource embedded applications.

Honvault [14] proposed an architectural framework implementing core components for the development of fault-tolerant and real-time applications. The framework, which is built on top of a real-time operating system kernel, has two facets. The first layer provides key algorithms and services. The second layer allows the development of new application contexts for new problems.

The literature survey shows that although UAVs have different capabilities and mission complexities, they require a unified software architecture where components communicate efficiently. Moreover, the data must be analyzed during the flight.

Fig. 1: Block diagram of proposed system architecture.

In this work, we propose a generic architecture for achieving these goals. Unlike some cooperative multi-vehicle architectures, we narrow our focus to the software framework of individual UAVs. The proposed onboard software framework maintains generic hardware abstraction, data storage and communication duties. Developers are able to monitor the data and create new task blocks whose tasks communicate with the onboard core module via Inter-Process Communication.

The rest of the paper is organized as follows: In Section 2, we present our proposed architecture. In Section 3, we show the implementation details of our architecture. Experimental results of the platform are presented in Section 4. The paper is concluded with some remarks and future directions in Section 5.

II. PROPOSED ARCHITECTURE

In this section we describe the parts of our proposed architecture. The proposed architecture consists of three main modules. The first module, denoted as the core module, is the heart of the architecture, where all communication, data storage and mission management are handled. The second module is the application module, which includes mission-specific programs like object tracking and waypoint following. The third module is the ground station module, where the operator performs data analysis, visualization and remote command execution. An overview of the proposed architecture can be seen in Figure 1. In what follows, we describe the functionality of the core module's layers and blocks.

A. Core Module

Hardware Abstraction Layer

The Hardware Abstraction Layer (HAL) is responsible for managing the input/output operations of system peripherals and onboard sensors. The lower part of the layer consists of Serial, SPI, PPM and I2C communication stacks for data acquisition. The upper part of the layer is responsible for copying data to shared memory using multiple threads.

The working principle of the proposed hardware abstraction layer is time-driven: each thread has a predefined working period. Acquired data is copied to shared memory in parallel with the help of several threads, and each thread's period differs with respect to the update rate of the associated device. Using this methodology, the shared memory receives an update each time a thread cycle completes.
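The following C++ sketch illustrates this time-driven scheme with one acquisition thread per device. The device set, the rates and the SharedMemory fields are hypothetical and only show how each thread copies its reading into shared memory at its own period; the actual drivers run on the flight hardware described in Section III.

```cpp
#include <atomic>
#include <chrono>
#include <thread>
#include <vector>

// Hypothetical shared-memory snapshot; in the real system these fields
// would live in an OS-level shared memory segment (see the next block).
struct SharedMemory {
    std::atomic<float> imu_roll{0}, imu_pitch{0}, imu_yaw{0};
    std::atomic<float> sonar_altitude_m{0};
};

// One time-driven HAL thread: read a device at its own period and publish
// the result to shared memory until the HAL is stopped.
template <typename ReadFn>
std::thread makeHalThread(std::atomic<bool>& running, int period_ms, ReadFn read) {
    return std::thread([&running, period_ms, read] {
        while (running.load()) {
            read();  // acquire one sample and copy it into shared memory
            std::this_thread::sleep_for(std::chrono::milliseconds(period_ms));
        }
    });
}

int main() {
    SharedMemory shm;
    std::atomic<bool> running{true};
    std::vector<std::thread> hal;
    // Example rates: IMU near 100 Hz, sonar at 20 Hz (see Section III-B).
    hal.push_back(makeHalThread(running, 10, [&] { shm.imu_roll = 0.0f;        /* placeholder: read IMU over UART */ }));
    hal.push_back(makeHalThread(running, 50, [&] { shm.sonar_altitude_m = 1.0f; /* placeholder: read sonar */ }));
    std::this_thread::sleep_for(std::chrono::seconds(1));
    running = false;
    for (auto& t : hal) t.join();
}
```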

Shared Memory

In our proposed architecture, the onboard core module stores all numerical data in a shared memory. The memory is updated at several intervals with the help of the hardware abstraction layer threads. This area is not only accessible to all blocks of the core module, but is also accessible read-only to the application module. Because shared memory is volatile, all fields of the shared memory are copied to non-volatile disk storage by the logger block.
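The paper does not specify the shared-memory mechanism. A minimal POSIX sketch, assuming a named segment /uav_core_shm and a plain-struct layout, could give the core module read-write access and the application module the read-only mapping described above (error handling omitted for brevity):

```cpp
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

struct SharedMemory { float imu_roll, imu_pitch, imu_yaw, sonar_altitude_m; };

// Core module: create the segment and map it read-write.
SharedMemory* createCoreMapping() {
    int fd = shm_open("/uav_core_shm", O_CREAT | O_RDWR, 0644);
    ftruncate(fd, sizeof(SharedMemory));
    return static_cast<SharedMemory*>(
        mmap(nullptr, sizeof(SharedMemory), PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0));
}

// Application module: open the same segment read-only, matching the
// read-only access granted to applications.
const SharedMemory* openAppMapping() {
    int fd = shm_open("/uav_core_shm", O_RDONLY, 0);
    return static_cast<const SharedMemory*>(
        mmap(nullptr, sizeof(SharedMemory), PROT_READ, MAP_SHARED, fd, 0));
}
```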
Cooperative Memory

The proposed architecture adopts decentralized communication between UAVs. There are two reasons for using a decentralized communication method. First, when only some of the UAVs are in range of the ground station, the remaining UAVs can receive updates through their peers. Second, some missions may require exchanging information in the fastest possible way, where the delay caused by communicating over a ground station can severely affect sharing even packets of short length. Details and implementation of this block are reserved for future work.

Configuration Block

This block stores all system-specific settings for the onboard core module. These settings include network settings and device access settings for the hardware abstraction layer. The block is accessible to all blocks of the core module. Moreover, the operator can also access and modify system settings through the User Service block via a TCP/IP connection.

System Monitor and System State

The core module holds all the numerical data required for performing algorithms. A flight system also requires the detection of failures and malfunctions. In this architecture, the system monitoring block analyzes the data residing in shared memory and produces logic-level system states, such as the validity of each sensor and the battery level. The output of the system monitoring block is a set of logical values stored in the system state block. The logical data stored in the system state is also used by the mission manager, as most missions on a UAV require a certain state configuration.
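As an illustration of this logic-level reduction, the sketch below derives boolean state flags from shared-memory data. The flag names, staleness limits and battery threshold are assumptions, not values taken from the paper.

```cpp
// Logic-level system state derived from raw shared-memory data.
struct SystemState {
    bool imu_valid   = false;
    bool sonar_valid = false;
    bool battery_ok  = false;
};

// Hypothetical monitor pass: reduce numerical data to boolean flags that
// the mission manager can check against a mission's requirements.
SystemState monitorPass(float battery_voltage, float sonar_altitude_m,
                        double imu_age_s, double sonar_age_s) {
    SystemState s;
    s.imu_valid   = imu_age_s < 0.05;                                // fresh IMU sample (< 50 ms old)
    s.sonar_valid = sonar_age_s < 0.10 && sonar_altitude_m < 6.45f;  // within the sonar's range
    s.battery_ok  = battery_voltage > 10.5f;                         // assumed cutoff voltage
    return s;
}
```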
Communication Layer

The onboard software architecture must transmit and receive various data. When the UAV is operational and the core module is online, the communication requirements can be grouped into three parts.

First, the core module must communicate with other onboard task-specific applications. When an application goes online, it needs to provide packages about itself to the Mission Manager block, such as Mission Definition, Mission Start and Mission Status. The Mission Manager responds at each interval, indicating whether it is suitable to continue the mission or notifying the application that the mission is terminated by the core module. This communication is done using an IPC (Inter-Process Communication) schema.
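The concrete IPC transport is not detailed in the paper. The sketch below assumes plain structs sent over a POSIX message queue as one possible realization of the Mission Definition / Start / Status exchange; the queue name and field names are hypothetical.

```cpp
#include <cstdint>
#include <cstring>
#include <fcntl.h>
#include <mqueue.h>

// Hypothetical message types exchanged between an application and the
// Mission Manager block over IPC.
enum class MsgType : uint8_t { MissionDefinition, MissionStart, MissionStatus };

struct MissionMsg {
    MsgType  type;
    uint32_t mission_id;
    uint8_t  priority;     // built-in missions would use higher values
    char     payload[64];  // e.g. mission name or progress description
};

// Application side: announce a new mission definition to the core module.
bool announceMission(const char* queue_name, uint32_t id, const char* name) {
    mqd_t q = mq_open(queue_name, O_WRONLY);
    if (q == (mqd_t)-1) return false;
    MissionMsg m{MsgType::MissionDefinition, id, 0, {}};
    std::strncpy(m.payload, name, sizeof(m.payload) - 1);
    bool ok = mq_send(q, reinterpret_cast<const char*>(&m), sizeof(m), 0) == 0;
    mq_close(q);
    return ok;
}
```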

Second, the core module must communicate with the ground station. An operator may also wish to modify configuration parameters or even trigger mission termination through the mission manager. This type of communication is handled by the User Service block over TCP/IP. Furthermore, the shared memory and system state are broadcast to the ground station over TCP/IP by the Broadcaster block. Due to the high bandwidth requirements, the Broadcaster block must transmit visual data to the ground station over UDP/IP.
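As a small illustration of this TCP/UDP split, the Broadcaster sketch below pushes an encoded video chunk as a UDP datagram, while state and telemetry would go over a persistent TCP connection. The ground station address, port and chunk framing are assumptions.

```cpp
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdint>
#include <vector>

// Hypothetical Broadcaster path for visual data: UDP tolerates loss
// without blocking the rest of the core module.
bool sendVideoChunk(const std::vector<uint8_t>& chunk,
                    const char* gs_ip = "192.168.1.10", uint16_t port = 14551) {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) return false;
    sockaddr_in dst{};
    dst.sin_family = AF_INET;
    dst.sin_port   = htons(port);
    inet_pton(AF_INET, gs_ip, &dst.sin_addr);
    ssize_t n = sendto(sock, chunk.data(), chunk.size(), 0,
                       reinterpret_cast<sockaddr*>(&dst), sizeof(dst));
    close(sock);
    return n == static_cast<ssize_t>(chunk.size());
}
```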
Third, the core module must communicate and share system data with other UAVs. For this purpose, we reserve a UAV P2P (Peer-to-Peer) Service block. Adopting an event-driven schema, the block updates the tables in the cooperative memory of the UAV when a message is triggered. It can also serve as a pipeline for transmitting high-priority messages of a UAV to the ground station through a nearby UAV.
Logger Block

The architecture implements failure detection in the system monitor block. However, for simulation and playback purposes, volatile data storage must also be complemented by non-volatile storage on the system. For this purpose, the logger block is responsible for storing three types of data: shared memory, system state and mission progress data. One can expect the first two to be recorded during the whole flight, while missions are recorded from mission startup to mission completion.

In our logging implementation, we adopted thread-based logging. The logging thread periodically writes the shared memory and system state into the flight logging directory. Moreover, the block receives periodic announcements of mission-based data from the mission manager block, and each mission has its own logging directory. The block can also be configured to broadcast all logs to the network, which is beneficial for small-scale platforms that may not have sufficient storage space.
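A minimal sketch of such a logging thread is shown below, reusing the plain-struct shared memory and system state from the earlier sketches. The log path and the binary record layout are illustrative assumptions.

```cpp
#include <atomic>
#include <chrono>
#include <fstream>
#include <thread>

struct SharedMemory { float imu_roll, imu_pitch, imu_yaw, sonar_altitude_m; };
struct SystemState  { bool imu_valid, sonar_valid, battery_ok; };

// Periodically snapshot shared memory and system state into the flight
// logging directory (path is an assumption, not from the paper).
void loggerThread(const SharedMemory& shm, const SystemState& state,
                  std::atomic<bool>& running, int period_ms = 100) {
    std::ofstream log("/var/log/uav/flight.bin", std::ios::binary | std::ios::app);
    while (running.load()) {
        auto t = std::chrono::steady_clock::now().time_since_epoch().count();
        log.write(reinterpret_cast<const char*>(&t), sizeof(t));       // timestamp
        log.write(reinterpret_cast<const char*>(&shm), sizeof(shm));   // shared memory snapshot
        log.write(reinterpret_cast<const char*>(&state), sizeof(state)); // system state snapshot
        log.flush();
        std::this_thread::sleep_for(std::chrono::milliseconds(period_ms));
    }
}
```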
Mission Manager

This block is responsible for registering new missions and monitoring ongoing ones. On a new mission execution request, the mission creator analyzes the incoming request and validates it after checking the mission requirements against the system state block. The creator then places the new mission into the mission queue, as shown in Fig. 2.

Fig. 2: Internal blocks of mission manager.

The mission executor keeps track of the missions in the queue. Each mission must report its progress and state to the mission executor periodically. If no report is obtained from a running mission, or if the mission criteria fail to be satisfied, the mission is immediately removed from the mission queue and a mission abort message is generated. Missions also have priorities. The core module has a few built-in missions (e.g. emergency landing) which have higher priority than developed missions. Automatic triggering of such a mission causes the Mission Executor to put a new mission into the Mission Queue, resulting in immediate execution of the prioritized mission while the current mission is postponed.
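The sketch below captures the queueing and heartbeat-timeout behavior described above. The Mission fields, the one-second timeout and the priority scheme are assumptions used only to make the mechanism concrete; the real mission manager also generates abort messages over IPC.

```cpp
#include <chrono>
#include <queue>
#include <string>
#include <vector>

using Clock = std::chrono::steady_clock;

struct Mission {
    std::string name;
    int priority = 0;  // built-in missions (e.g. emergency land) use higher values
    Clock::time_point last_report = Clock::now();
};

struct ByPriority {
    bool operator()(const Mission& a, const Mission& b) const { return a.priority < b.priority; }
};

class MissionExecutor {
public:
    void submit(Mission m) { queue_.push(std::move(m)); }

    // Called periodically: abort the running mission if its progress report
    // is stale, then promote the highest-priority queued mission.
    void tick(Mission& running, std::vector<std::string>& abort_log,
              std::chrono::milliseconds timeout = std::chrono::milliseconds(1000)) {
        if (Clock::now() - running.last_report > timeout) {
            abort_log.push_back("mission abort: " + running.name);
            if (!queue_.empty()) { running = queue_.top(); queue_.pop(); }
        }
    }

private:
    std::priority_queue<Mission, std::vector<Mission>, ByPriority> queue_;
};
```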
B. Application Module

The application module is the space where mission-based applications are stored. As operations like mission logging, mission preemption and peripheral access are already maintained by the core module, one can focus on the task and its algorithms while receiving essential services from the core module.

Each mission start is triggered by the user from the ground station module. The mission manager checks whether the mission requirements are fulfilled and triggers execution of the corresponding mission from the application module. These requirements are periodically checked by the Mission Executor sub-block. All mission progress is logged and can be transferred through the User Service to the ground station module. The mission must send an acknowledgment to the core module periodically, or it will be terminated for security purposes.
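From the application side, this periodic acknowledgment can be as simple as the loop sketched below; the send callback would wrap the MissionStatus IPC message from the Communication Layer sketch, and the 500 ms period is an assumption.

```cpp
#include <atomic>
#include <chrono>
#include <functional>
#include <thread>

// Application-side heartbeat: report progress/status to the core module
// periodically so the Mission Executor does not abort the mission.
void missionHeartbeat(std::atomic<bool>& mission_running,
                      const std::function<void()>& send_status,
                      std::chrono::milliseconds period = std::chrono::milliseconds(500)) {
    while (mission_running.load()) {
        send_status();
        std::this_thread::sleep_for(period);
    }
}
```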

C. Ground Station Module


The operator can view real-time video from the UAV and send mission execution commands, such as vehicle landing and vehicle take-off, to the core module. The core module's shared memory and system states are fully observable using this module. Due to the high data transmission bandwidth, it operates over the 802.11x wireless protocol.

III. IMPLEMENTATION

The proposed architecture is implemented on our UAV (Fig. 3). It employs a Gumstix Overo microcomputer and a Texas Instruments TMS320F28335 microcontroller for flight control. We implemented the lower part of the hardware abstraction layer on the microcontroller. The rest of the blocks of the core module and the application module are implemented on the Gumstix microcomputer. The ground station module is implemented on a laptop.

Fig. 3: Implementation platform SUQUAD.

A. Low and High Level Controllers


We employ the TMS320F28335 microcontroller for low-level control tasks as well as for interfacing sensors. The microcontroller has a 150 MHz processor, 68 KB of RAM and 512 KB of Flash memory. It is highly capable, supporting 6 capture channels, 16 PWM channels and 16 ADC channels. It provides 96 interrupts, of which 58 are reserved for input/output units. Unlike many microprocessors, the TMS320F28335 requires zero clock cycles when switching between interrupts. We implemented 100 Hz PID control for the low-level control tasks on the microcontroller. For tuning purposes, the control gains can be modified during flight from the ground station module, from where they are directed to the hardware abstraction layer with the help of the core module. Calibration of the electronic speed controllers and other hardware follows the same procedure.
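A minimal form of such a fixed-rate PID update is sketched below in C++. The gains, output limits and loop wiring are placeholders, not the values used on the flight controller.

```cpp
// One axis of a 100 Hz PID loop as run on the low-level controller.
// Gains and output limits are illustrative placeholders.
struct Pid {
    float kp, ki, kd;
    float integral = 0.0f, prev_error = 0.0f;

    float update(float setpoint, float measurement, float dt) {
        float error = setpoint - measurement;
        integral += error * dt;
        float derivative = (error - prev_error) / dt;
        prev_error = error;
        float u = kp * error + ki * integral + kd * derivative;
        if (u >  1.0f) u =  1.0f;  // clamp to a normalized actuator range
        if (u < -1.0f) u = -1.0f;
        return u;
    }
};

// Example use at 100 Hz (dt = 0.01 s), e.g. for the roll axis:
//   Pid roll{0.8f, 0.05f, 0.2f};
//   float u_roll = roll.update(desired_roll, measured_roll, 0.01f);
```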
The Gumstix microcomputer is utilized as the high-level controller of our system. The microcomputer is small and lightweight, weighing only 6 g. It has a 600 MHz OMAP 3503 microprocessor, a C64+ Digital Signal Processor and 256 MB of DDR RAM. The microcomputer also has 4 hardware PWM channels and fine input/output capabilities, and all voltage regulation is performed by its expansion board. The device runs the Angstrom Linux distribution, on which we made slight modifications to improve efficiency.

B. Sensors

We use an OmniVision OV3640 camera from E-Con Systems. The camera has a resolution of 3.2 Megapixels and runs under a V4L2 driver. It supports resolutions from 320x240 to 2048x1536 and Raw RGB, RGB565, YUV and YCbCr image formats. Most of the time, image acquisition is done by the microprocessor. However, utilizing the Gumstix's DSP speeds up the acquisition process significantly. In this work, we use Texas Instruments' DSPLink library for image acquisition so that the microcomputer can allocate more processing time to other tasks. On user prompt, frames are transferred to the ground station in H.264 encoded format.

We use a CHR-6dm inertial measurement unit, which combines 3D rate gyros, accelerometers and magnetic sensors. The IMU has a 32-bit ARM Cortex processor and comes with an onboard Extended Kalman Filter implementation reporting roll, pitch and yaw angles at up to 300 Hz over a TTL (3.3 V) UART interface. In order to measure distances and avoid obstacles, we use MaxBotix EZ4 ultrasonic sensors, which give a resolution of 1 inch at a 20 Hz reading rate. The sensor can measure up to 6.45 meters. For high-level control tasks, an EM-406A GPS module with a sensitivity of -159 dBm is used. Peer-to-peer radio frequency communication is done via a Digi Zigbee OEM RF module. This device has a communication range of 100 m indoors and 1.6 km outdoors, and supports point-to-point and peer-to-peer communication topologies.

                          Proposed       MAVCONN       ROS
  Scale                   lightweight    lightweight   middleweight
  Availability            Open Source    Open Source   Open Source
  Ground Station Module   +              +             -

TABLE I: Comparison of similar middlewares

              Proposed        PIXHAWK    Asct. Pelican
  Autopilot   TMS320F28335    ARM7       ARM7
  AP MHz      150 MHz         60 MHz     60 MHz
  AP RAM      68 KB           32 KB      32 KB
  Open HW     -               +          -

TABLE II: Comparison of autopilot systems

Fig. 6: Altitude Control Performance

C. Ground Station

The ground station module is implemented on a computer with a 2.0 GHz Intel i5 Core2Duo processor and 4 GB of RAM. The ground station is based on the open-source QGroundControl framework, which we modified for our needs. The software also supports real-time plotting of data (Fig. 4). The operator can also track the UAV's position on earth with the built-in Google Earth plugin, as well as in a simulated 3D environment (Fig. 5). Although the ground station supports a wide range of functionality, the implementation is noticeably slow. We plan to replace the whole ground station system with a new one in future work.

Fig. 4: Real-time data analysis on operator interface.

Fig. 5: UAV's position on earth and surveillance video on operator interface.

IV. EXPERIMENTAL RESULTS

We demonstrate an experimental flight of our platform running the proposed architecture. A comparison with similar available middlewares [9], [10] and autopilot platforms is shown in Tables I and II. The key concept of our architecture is preserving minimality while providing essential functionality. Moreover, the proposed architecture is open source and available.

We implemented 100 Hz PID control for the attitude and altitude control tasks on the microcontroller. The attitude control keeps the orientation of the quadrotor at the desired value. The initial orientation of the quadrotor before takeoff was −0.1° in φ and 0.4° in θ due to the non-flat surface. The results of the attitude control during hover are shown in Fig. 8. It can be seen that the attitude angles oscillate within the bounded interval (−2°, 2°) and rarely exceed ±1°. The altitude control is done using a cheap sonar sensor. As the sonar measurement is extremely noisy and hard to model, we applied median filtering to the Z position measurements. Altitude control using the filtered measurements is shown in Fig. 6.
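The median filter used here can be realized as a short sliding-window filter over the raw sonar readings, as in the C++ sketch below; the window length of 5 samples is an assumption for illustration, not the value used in flight.

```cpp
#include <algorithm>
#include <cstddef>
#include <deque>
#include <vector>

// Sliding-window median filter for noisy sonar Z measurements.
class MedianFilter {
public:
    explicit MedianFilter(std::size_t window = 5) : window_(window) {}

    float filter(float z_raw) {
        samples_.push_back(z_raw);
        if (samples_.size() > window_) samples_.pop_front();
        std::vector<float> sorted(samples_.begin(), samples_.end());
        std::nth_element(sorted.begin(), sorted.begin() + sorted.size() / 2, sorted.end());
        return sorted[sorted.size() / 2];  // median of the current window
    }

private:
    std::size_t window_;
    std::deque<float> samples_;
};
```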
Fig. 7: Control Inputs

The control efforts of the flight can be seen in Fig. 7. There are very few momentary oscillations in the U1 control input, which shows that the quality of the hover is very good. Additional images from various flights are shown in Fig. 9.

Fig. 8: Attitude Control Performance
Fig. 9: SUQUAD during flight experiments.

V. CONCLUSION AND FUTURE WORKS

In this work, we have proposed an onboard software architecture for handling common UAV tasks. The architecture is responsible for memory management, distribution of sensory data, and communication with other processes and the ground station. It also simplifies mission-specific application development by providing a user-friendly interface.

The proposed system is implemented on our UAV using the C++ language and continues to be developed. In future work, we plan to add an extensive P2P communication layout and to convert the mission manager into a cooperative mission scheduler. We also plan to develop a ground station module that is faster than the current one. Moreover, we will investigate whether the real-time capabilities of the Linux platform are powerful enough to implement the whole HAL on the high-level computer.

REFERENCES

[1] M. Quigley, M. Goodrich, S. Griffiths, A. Eldredge, and R. Beard, "Target acquisition, localization, and surveillance using a fixed-wing mini-UAV and gimbaled camera," in Robotics and Automation, 2005. ICRA 2005. Proceedings of the 2005 IEEE International Conference on, 2005, pp. 2600–2605.
[2] R. Beard, T. McLain, D. Nelson, D. Kingston, and D. Johanson, "Decentralized cooperative aerial surveillance using fixed-wing miniature UAVs," Proceedings of the IEEE, vol. 94, no. 7, pp. 1306–1324, 2006.
[3] P. Iscold, G. A. S. Pereira, and L. A. B. Torres, "Development of a hand-launched small UAV for ground reconnaissance," Aerospace and Electronic Systems, IEEE Transactions on, vol. 46, no. 1, pp. 335–348, 2010.
[4] W. R. Dufrene, Jr., "Mobile military security with concentration on unmanned aerial vehicles," in Digital Avionics Systems Conference, 2005. DASC 2005. The 24th, vol. 2, 2005, 8 pp.
[5] Y. Lin, J. Hyyppa, and A. Jaakkola, "Mini-UAV-borne lidar for fine-scale mapping," Geoscience and Remote Sensing Letters, IEEE, vol. 8, no. 3, pp. 426–430, 2011.
[6] H. Lim, J. Park, D. Lee, and H. J. Kim, "Build your own quadrotor: Open-source projects on unmanned aerial vehicles," Robotics Automation Magazine, IEEE, vol. 19, no. 3, pp. 33–45, 2012.
[7] E. Cetinsoy, S. Dikyar, C. Hancer, K. Oner, E. Sirimoglu, M. Unel, and M. Aksit, "Design and construction of a novel quad tilt-wing UAV," Mechatronics, vol. 22, no. 6, pp. 723–745, 2012.
[8] E. D. Jones, R. S. Roberts, and T. C. S. Hsia, "STOMP: a software architecture for the design and simulation of UAV-based sensor networks," in ICRA'03, 2003, pp. 3321–3326.
[9] L. Meier, P. Tanskanen, L. Heng, G. Lee, F. Fraundorfer, and M. Pollefeys, "Pixhawk: A micro aerial vehicle design for autonomous flight using onboard computer vision," Autonomous Robots, vol. 33, pp. 21–39, 2012.
[10] M. Quigley, K. Conley, B. P. Gerkey, J. Faust, T. Foote, J. Leibs, R. Wheeler, and A. Y. Ng, "ROS: an open-source robot operating system," in ICRA Workshop on Open Source Software, 2009.
[11] A. Huang, E. Olson, and D. Moore, "LCM: Lightweight communications and marshalling," in Intelligent Robots and Systems (IROS), 2010 IEEE/RSJ International Conference on, 2010, pp. 4057–4062.
[12] I. Maza, K. Kondak, M. Bernard, and A. Ollero, "Multi-UAV cooperation and control for load transportation and deployment," J. Intell. Robotics Syst., vol. 57, no. 1-4, pp. 417–449, Jan. 2010.
[13] J. López, P. Royo, E. Pastor, C. Barrado, and E. Santamaria, "A middleware architecture for unmanned aircraft avionics," in Proceedings of the 2007 ACM/IFIP/USENIX International Conference on Middleware Companion, ser. MC '07. New York, NY, USA: ACM, 2007, pp. 24:1–24:6.
[14] C. Honvault, M. Le Roy, P. Gula, J. C. Fabre, G. Le Lann, and E. Bornschlegl, "Novel generic middleware building blocks for dependable modular avionics systems," in Proceedings of the 5th European Conference on Dependable Computing, ser. EDCC'05. Berlin, Heidelberg: Springer-Verlag, 2005, pp. 140–153.
