This article has been accepted for inclusion in a future issue of this journal. Content is final as presented, with the exception of pagination.
IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING
Automating High-Precision X-Ray and Neutron
Imaging Applications With Robotics
Joseph A. Hashem, Mitch Pryor, Sheldon Landsberger, James Hunter, and David R. Janecky
Abstract— Los Alamos National Laboratory and the University
of Texas at Austin recently implemented a robotically controlled
nondestructive testing (NDT) system for X-ray and neutron
imaging. This system is intended to address the need for accurate
measurements for a variety of parts and, be able to track
measurement geometry at every imaging location, and is designed
for high-throughput applications. This system was deployed in a
beam port at a nuclear research reactor and in an operational
inspection X-ray bay. The nuclear research reactor system consisted of a precision industrial seven-axis robot, a 1.1-MW TRIGA
research reactor, and a scintillator-mirror-camera-based imaging
system. The X-ray bay system incorporated the same robot,
a 225-keV microfocus X-ray source, and a custom flat panel
digital detector. The robotic positioning arm is programmable
and allows imaging in multiple configurations, including planar, cylindrical, and other user-defined geometries that provide enhanced engineering evaluation capability. The imaging
acquisition device is coupled with the robot for automated image
acquisition. The robot can achieve target positional repeatability
within 17 µm in 3-D space. Flexible automation with
nondestructive imaging saves costs, reduces dosage, adds imaging
techniques, and achieves better quality results in less time.
Specifics regarding the robotic system and imaging acquisition
and evaluation processes are presented. This paper reviews
the comprehensive testing and system evaluation to affirm the
feasibility of robotic NDT, presents the system configuration, and
reviews results for both X-ray and neutron radiography imaging
applications.
Note to Practitioners—While looking for ways to improve
throughput and increase efficiency in nondestructive imaging
applications, the NonDestructive Testing and Evaluation Group
at the Los Alamos National Laboratory investigated automation opportunities. Digital radiography and computed
tomography are time-consuming processes, making them ideal
candidates for robotic solutions. Radiography applications often
require several images to be acquired from different angles, and these often must be positioned very precisely so that the feature of interest is identifiable and the resulting image meets the
client’s requirements. With the robot acting as the motion control
system, the imaged part can be placed directly in the beam path
and oriented in six degrees of freedom. The robot can achieve
significantly higher levels of precision than a human and has the
Manuscript received August 25, 2016; revised December 20, 2016 and
January 31, 2017; accepted February 21, 2017. This paper was recommended
for publication by Associate Editor A. Pashkevich and Editor J. Wen upon
evaluation of the reviewers’ comments.
J. A. Hashem, J. Hunter, and D. R. Janecky are with the
Los Alamos National Laboratory, Los Alamos, NM 87545 USA (e-mail:
jhashem@lanl.gov; jhunter@lanl.gov; janecky@lanl.gov).
M. Pryor and S. Landsberger are with the Nuclear and Radiation
Teaching Lab, Department of Mechanical Engineering, University of
Texas, Austin, TX 78712 USA (e-mail: mpryor@utexas.edu; s.landsberger@
mail.utexas.edu).
Color versions of one or more of the figures in this paper are available
online at http://ieeexplore.ieee.org.
Digital Object Identifier 10.1109/TASE.2017.2675709
ability to adjust the part while the source is active. The system
also reduces levels of radiation our staff is exposed to, as the robot
is set up to handle radioactive and hazardous parts. Not only does
the robot move parts more precisely and with higher resolution
than humans, but it also adds additional flexibility in the type
and nature of images that the lab can produce. Future work will
involve using this system for advanced automated scans such as
achieving evenly spaced views around a sphere autonomously,
since this system has not yet been used for more advanced scans
beyond helical scanning. A tightly linked feedback loop between
the robot and imaging code in which the imaging code would
autonomously communicate to the robot what additional views
are needed to reduce imaging error can also be explored.
Index Terms— Autonomous system, calibration, collision avoidance, computed tomography (CT), flexible automation, helical
scanning, motion control, nondestructive testing (NDT), path
planning, precision movement, radiation damage, radiography,
software communication.
I. INTRODUCTION
NONDESTRUCTIVE testing (NDT) is a highly multidisciplinary group of techniques used throughout science
and industry aimed at evaluation of material properties and
detection of defects, both surface and internal, without causing
physical damage to the inspected components [1]. Automation
of NDT of engineering components represents one of the
key objectives of many industries, including the automotive, aerospace, petrochemical, power generation, and nuclear
industries. In contrast to manual inspection, it keeps humans
away from hazardous material and potentially dangerous work
and enables increases in accuracy, precision, and speed of
inspection while reducing production time and associated labor
costs. The use of robots can provide additional autonomy
and flexibility to automated NDT, whereas manual inspection
of numerous components or large structures is laborious and
time consuming. Automated robot inspection can be beneficial in diverse industrial scenarios ranging from integration
of NDT into manufacturing processes of complex geometry
components to periodical repair of large structures, such as
turbines and aerospace components. Additionally, unusual
environmental threats, such as those from underwater oil
spills and nuclear power plant accidents, have caused renewed
interest in fielding radiography in severe operating conditions.
These severe operating conditions pave the way for remote
handling systems, where robots are increasingly deployed in
remote and hazardous environments, such as those found in
the nuclear waste management field and other radioactive environments [2]. The Department of Energy (DOE) has in particular targeted robotic handling of hazardous waste to be an
essential element in its efforts of environmental restoration and
1545-5955 © 2017 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission.
See http://www.ieee.org/publications_standards/publications/rights/index.html for more information.
waste management [3]. Within the DOE complex, the primary
purpose of robots is to replace (or augment) human operators to increase safety without adversely impacting process
efficiency. Remote-operated robots allow access and manipulability to areas that would otherwise be inaccessible due to
radiation levels, enabling repairs, maintenance work, inspections, or other tasks [4].
A fundamental problem with NDT of manufactured components lies in the process variability. Often parts that are
designed identically may have minor to significant deviations from their intended design or even from one part to
another. This presents a challenge for precision NDT measurement, since the NDT deployment method must be flexible
to accommodate the differences in component shapes. For
this reason, NDT inspections are often performed manually
by technicians who typically have to position and move a
component or inspection probe to achieve proper alignment.
This requires trained technicians and results in long inspection
times. Achieving a high level of precision and repeatability
for a test can be difficult when part alignment is necessary
to perform the inspection [5]. Therefore, the fundamental
aims of automation within the NDT process are to minimize
downtimes due to the higher achievable speed, to minimize
variability due to human factors, and to reduce the human
exposure to hazardous or physically constrained environments.
Semiautomated inspection systems have been developed to
overcome some of the shortcomings of manual inspection
techniques, using both mobile and fixed robotic platforms.
Some NDT techniques, such as video inspection, eddy current
testing, and ultrasonic testing, have been readily automated [6].
There are also numerous applications of climbing robots [7]
and autonomous mobile robots [8] for inspecting large structures where human access is limited due to space limitations
or hazards. Mobile robots carrying inspection tools, such as
cameras, 2-D lasers, and IR sensors, have been used for sewer
pipe inspection to look for damage or abnormalities [9]. Linear
manipulators and bridge designs provide high positioning
accuracy [10] to many of these semi-autonomous systems.
Typically, these systems are specific machines that are used
to inspect identically shaped and sized parts; therefore, they
have limited flexibility in the methods they use and what
types of parts they can inspect. Automated X-ray test systems
for workpieces can be found commercially [11]; however,
they come preconfigured for use with specific software and
hardware. Despite these previous efforts, challenges remain
to be addressed before fully automated, high-precision NDT
inspection becomes commonplace. The key challenges include
flexible trajectory planning, integrated NDT data collection,
and achieving high repeatability and precision measurements.
NDT imaging inspection therefore requires a flexible, extensible approach that allows future changes in path planning to accommodate different parts and techniques.
This effort focuses on a flexible motion control system for NDT applications in radioactive and other hazardous environments; these applications are complex and require multidisciplinary knowledge for the motion system to complete its required task(s).
Flexible NDT is necessary to address the vast majority of NDT
applications which are experimental, low (or often singular)
batch, and/or require in situ modification of test parameters.
As computer vision and automated inspection techniques
improve, automated imaging inspection and interpretation can
become a reality. Research and development is, however,
ongoing into new approaches that can be used to aid X-ray
interpretation [12].
Today, any particle type can be combined with an increasingly wide range of digital detectors to image almost any
conceivable object in extreme environments. As the industrial
radiography sensitivity and resolution requirements increase,
along with the range and complexity of items to be inspected,
high-precision positioning systems become necessary. Imaging
techniques rely heavily on positioning the part in a precise
location inside the X-ray beam and in line with the imaging
detector, not just once, but many times (up to thousands
of related but distinct positions). Typical imaging techniques
include: static radiography, computed tomography (CT), and
helical scanning [13]. The NDT and Evaluation Group at the
Los Alamos National Laboratory (LANL) has implemented
various linear and rotary motion stages to position parts for
digital radiography (DR) and CT [14]. Additional techniques
such as helical scanning and in-motion radiography have been
implemented [15]. CT scan requirements are a good example
of the high precision and repeatability requirements for X-ray
imaging. CT typically requires one view per pixel of maximum
region of interest width. In tomography, a variety of artifacts
may be present in projection sets that propagate errors back
into the reconstructed image. If fewer projections of the object
are captured, the image will have more reconstruction artifacts, poorer resolution and boundary definition, and less uniform voxel (i.e., the 3-D analog of the pixel in a 2-D image) spacing.
Thus, it is necessary to have uniformly spaced view angles.
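The one-view-per-pixel rule above can be sketched in a few lines of Python; the function name and the 2000-pixel example are illustrative, not from the paper.

```python
def ct_view_angles(roi_width_px):
    """Uniformly spaced view angles (degrees) for a CT scan, using the
    rule of thumb of one view per pixel of maximum region-of-interest
    width over a full 360-degree rotation."""
    step = 360.0 / roi_width_px
    return [i * step for i in range(roi_width_px)]

# A feature spanning 2000 detector pixels calls for 2000 uniformly
# spaced views, i.e., 0.18-degree angular steps.
angles = ct_view_angles(2000)
print(len(angles), angles[1] - angles[0])
```

The uniform step is what prevents the angular-sampling artifacts described above.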
With CT scans, it is important that images be precisely located
relative to each other. A six-axis positioning system, along
with a digital detector would yield the following benefits.
1) Enhanced Worker Safety: Less frequent access to the
beam area reduces the probability of an accidental
worker exposure, and helps maintain as low as reasonably achievable (ALARA) dose by keeping distance
between parts and workers [40].
2) Positioning Flexibility: A robotic positioning system
allows the radiographers to get images from any angle,
and parts can be repositioned remotely to better capture
the desired image.
3) Real Time Remote Positioning: Coupled to a real-time
digital imaging system, radiographers can remotely position the item to better investigate newly discovered areas
of interest.
In the sections to follow, we describe the robotic system, how
it is used, the integration between the components of the entire
imaging system, and evaluate the resulting X-ray and neutron
radiographs.
Robotics allows for high mechanical rigidity and repeatability, configurable imaging motion planning, six-DoF programmable positioning of payloads with both coarse and fine
movement capability, and kinematic model visualization. For
example, the robotic arm used in this paper (see Section II-C
Fig. 1. Part positioning system installed in the open-air X-ray bay at LANL.
The setup consists of a computer controlled robot and a flat panel digital
detector that communicate with one another autonomously.
for more specifics on the robot model) is capable of moving
a 5-kg payload over a near-arbitrary path within a volume
of roughly 0.4-m radius, while achieving a repeatability
of 17 µm. Other NDT applications such as eddy current [16]
or ultrasonic inspection of aerospace components [17] have
implemented the use of a robotic arm for sample exchange
and/or part positioning. However, these applications do not
necessarily require the same level of precision as the imaging
applications required in this paper, with an example shown
in Fig. 1.
Fig. 2. Sample handling process. The first “Wait for input” block allows
the user to either choose a sample to pick up or turn the system OFF, while
the second “Wait for input” block asks the operator to enter the desired rotation
and offset amounts.
II. M ETHODS AND G ENERAL D ESIGN C ONCEPT
The application task is to autonomously image different
parts, one at a time, with high precision. The application
begins with the objects either randomly distributed or in
a predetermined position in the robot’s workspace. If the
objects are randomly distributed, an imaging system is used to
determine their location using sensor information from a depth
camera [18]. A schematic of the generalized cycle of sample
handling is shown in Fig. 2. The process shown here represents
the X-ray system. The X-ray generating device (XGD) and
digital imaging detector are also included in Fig. 2. The
robotic system deployed in the nuclear research reactor follows
a similar sample handling process, except that the XGD is
replaced by the neutron beam and the detector consists of
the scintillator-mirror-camera system. The neutron imaging
system is described in detail in Section III-B. For both
X-ray and neutron systems, the task is always performed in
simulation prior to hardware execution to make sure the robot
completes the task as expected and safely.
A risk analysis was carried out, identifying breakdown
recovery as a particular area of concern because of the
potential handling of radioactive material and the restrictions on personnel access as radiation levels increase. The
robotic system therefore includes features to either avoid
foreseeable problems or to enable recovery in the event of
failure when the robots are handling highly radioactive parts.
One example of problem avoidance is the implementation of
collision avoidance. Another example is robot joint current
monitoring during operations; if a joint current exceeds its
threshold value, the robot will stop. This monitoring allows
detection of problems before the robot generates large forces
on fixed surfaces or other equipment in the area. This ensures
recovery is not hampered by the robot having to “cut out”
after exceeding the maximum joint current limit.
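The joint-current safeguard described above can be sketched as a simple threshold check. The joint names and ampere limits below are illustrative assumptions, not the SIA5's actual limits.

```python
# Per-joint current limits (amperes). Values are hypothetical,
# chosen only to illustrate the monitoring logic.
THRESHOLDS_A = {"s": 12.0, "l": 12.0, "e": 8.0, "u": 8.0,
                "r": 4.0, "b": 4.0, "t": 4.0}

def over_threshold(currents_a):
    """Return the joints whose measured current exceeds their limit."""
    return [j for j, amps in currents_a.items() if amps > THRESHOLDS_A[j]]

# Example reading in which the "l" joint is drawing excessive current,
# e.g., because the arm is pressing against a fixed surface.
reading = {"s": 3.1, "l": 13.5, "e": 2.0, "u": 1.2,
           "r": 0.4, "b": 0.3, "t": 0.2}
tripped = over_threshold(reading)
if tripped:
    print("stopping robot, over-current on joints:", tripped)
```

In practice this check would run continuously during motion so the robot stops before large forces develop.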
The system permits the operator to easily cancel
autonomous execution at any time and revert to teleoperation, or issue new high-level motion commands, such as “stow
robot behind shielding” or “place part down,” without the need
to restart any hardware or software. The user is kept in-the-loop, so the operator can intervene in the case of an emergency.
This type of flexibility permits operation under transitional
autonomy, and is not possible under the traditional teach/do
paradigm of industrial robot programming.
A. NDT Imaging Requirements
The goals of performing flexible imaging and maintaining the required positioning accuracy require that positioning be correctable in translation (x, y, z) and in pointing angle (Rx, Ry, Rz). Additionally, multiple axes need to be
aligned (Fig. 3) and the imaged part’s axis of rotation needs
to be aligned with the system’s z-axis (i.e., the detector) and
the robot axes. According to LANL radiographers, the most precise product specifications call out 0.05-mm minimum features to be measured for radiography inspection in the LANL Plutonium Facility. These features are measured to 0.025 mm. Vibrations cannot exceed 6 µm (i.e.,
one quarter of the maximum requirement) while the part is
held in order to allow accurate measurements. Also, the tilt
of the part relative to the X-ray beam needs to be known to
within 0.1°. These requirements represent typical requirements
for NDT applications surveyed at LANL.
Fig. 4. 3-D blank beam image showing beam structure due to neutron guides.
Fig. 3. Robot, X-ray source, flat panel detector, and coordinate system
definitions. Imaging sphere is not drawn to scale.
B. Summary of Utilized Systems
The key hardware elements of the automated imaging system include the following:
1) seven-axis industrial manipulator and gripper capable of part manipulation;
2) X-ray source;
3) neutron source;
4) operational software for flexible automation capability
and peripheral integration including vision.
C. Seven-Axis Robot for Part Manipulation
To address the mechanical challenges posed by the high-precision requirements in NDT, we have implemented the use of a seven-axis serial robot with a 5-kg payload (model Yaskawa SIA5 and FS100 controller [19]) and a Robotiq gripper (both three-fingered [20] and two-fingered models [21] were used). The tip of the end effector (EEF) is also commonly referred to as the tool center point (TCP). The robot controller uses the Denavit–Hartenberg parameters [22] to define the kinematics of the robot, which allows the part to be spatially oriented.
The choice of this industrial-grade robotic arm allows for reduced inspection time, high precision, affordability, and reliability, as well as integration with open-source software, which simplified the integration of peripheral devices, vision, and other supporting components. The coordinated motion and part offsets must provide the ability to maintain positional accuracy for various parts, accounting for varying part sizes and weights. The robotic software, combined with the well-developed robot controller, has integrated timing mechanisms that provide input and output signals to coordinate the X-ray source, flat panel detector, and other measurement equipment precisely at desired times of the imaging application. Polar rotation of the part is accomplished by inputting the part offset distance and tilt relative to the robot's tooling offset and then selecting a desired part rotation. Indents in the robot's gripper provide for repeatable part positioning.
The coordinated motion of the robot allows for full spatial error correction, including position and orientation. This helps to maintain positioning accuracy and minimize errors throughout the entire radiograph exposure time. This level of correction is not possible with conventional stacked-axes positioning systems, which typically have fewer than six DoF as well as a small reachable workspace. Additionally, stacked motion stages can only move a single axis at a time, whereas a robot can be taught and/or commanded to perform a sequence of complicated moves. In both robotic and motion-stage-based systems, the volume occupied by the inspected item and that item's swept volume must also be accounted for. An analysis of the robot's workspace and collision avoidance is described in Section II-F.
D. X-Ray Source and Digital Detector
The system was installed in an open-air X-ray bay at LANL
with a 225 keV microfocus X-ray source. A custom Varian
2520 (25 cm × 20 cm) amorphous silicon flat panel digital
detector [23] that is tolerant to high-energy X-rays was used
to acquire digital radiographs. All radiographs throughout this
work were imaged with the XGD set at 135 keV and 66 mA.
The exposure time was set to 0.5 s for each image (i.e.,
2 frames/s) with 50 frames averaged for each image. For
CT and helical scans, the number of averaged frames was
reduced to 10 to decrease scan time. The focal spot size was
set to 7 µm and a 6:1 magnification was used (source to
object distance was 280 mm and source to detector distance
was 1701 mm).
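The quoted geometry can be checked with the standard point-source magnification relation M = SDD/SOD; this is a sketch, and the function name is ours.

```python
def magnification(source_to_object_mm, source_to_detector_mm):
    """Geometric magnification of a point-source projection setup:
    M = source-to-detector distance / source-to-object distance."""
    return source_to_detector_mm / source_to_object_mm

# Distances quoted in the text: SOD = 280 mm, SDD = 1701 mm.
M = magnification(280.0, 1701.0)
print(M)  # approximately 6.08, consistent with the quoted 6:1 magnification
```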
E. Neutron Source and Scintillator-Mirror-Camera System
Neutron imaging was performed at the University of
Texas (UT) Austin’s TRIGA Mark II research reactor.
In the beam port, there is a thermal neutron flux of
5.3 × 10⁶ n cm⁻² s⁻¹ and a thermal-to-epithermal ratio of 8.1 × 10⁴ ± 10% at a reactor power of 950 kW.
Neutrons are directed from the reactor core to the beam port
through a neutron guide, which reflects neutrons in a manner
that is analogous to optical reflection. The neutron guide
acts as a neutron filter; only neutrons of certain energy are
efficiently transported down the guide. Neutron guides work
best for lower energy neutrons as the reflectivity of higher
energy neutrons is low. From Fig. 4, one can see that there
is structure in the beam, which is due to the neutron guides.
When calibrating images this structure is accounted for by
Fig. 5. Neutron imaging acquisition setup.
normalizing against the blank beam measurement. The beam is a very clean neutron source and is also divergent.
A scintillator-mirror-camera system (Fig. 5) was utilized
to acquire digital radiographs. This allows the operator to
know what the radiograph looks like in real-time and allows
for adjustments in part positioning to be made online. The
neutron imaging detector’s pixel pitch, which is the physical
distance between the pixels in the imaging device, is 35 µm.
The charge-coupled device camera had a 1004 × 1002 active
pixel area on the chip, with a pixel size of 8 × 8 µm,
and a lens magnification of 1:1.14. The scintillator used
was a copper-, aluminum-, and gold-doped ⁶LiF:ZnS neutron
detection screen. The overall detector resolution of the system
is limited by the scintillator resolution (i.e., 6 lp/mm), which is equivalent to a minimum resolvable feature of roughly 80 µm. For future
work, the modulation transfer function should be calculated for
the neutron imaging system; however, the focus of this work is
to show the functionality and feasibility of the incorporation
of robotics into the imaging system. The estimated thermal
neutron capture efficiency is 42% at a scintillator thickness of 0.45 mm, a ⁶LiF:ZnS mix ratio by weight, and a ⁶Li atomic volume density of 12.9 × 10²¹ atoms/cm³ [24]. The reaction that occurs is ⁶Li + n → ⁴He + ³H + 4.8 MeV, where the ejected triton interacts with phosphor in the scintillator to create a scintillation event. The system was housed in a stainless steel enclosure, and lead bricks and lead blankets surrounded the enclosure to shield against X-ray hits and background noise. Note that CT scans
were not performed using this neutron imaging system, since
exposure times to obtain a single image ranged from five to
ten minutes.
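The equivalence between the scintillator's 6 lp/mm limit and the roughly 80 µm minimum feature quoted above follows from the usual line-pair conversion, sketched here with an illustrative function name.

```python
def min_feature_um(lp_per_mm):
    """Minimum resolvable feature (µm) for a resolution given in line
    pairs per mm: one line pair spans two features, so the feature
    size is 1 / (2 * lp_per_mm) mm."""
    return 1000.0 / (2.0 * lp_per_mm)

# 6 lp/mm works out to ~83 µm, i.e., roughly the 80 µm quoted above.
print(min_feature_um(6))
```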
F. Robot Software
The robot operating system (ROS) was used to operate the robot, integrate all peripheral devices, and perform
many of the necessary calculations, including object recognition, pose estimation, trajectory generation, and collision
detection [25], [26].
The reachable workspace of the SIA5 was calculated and
is shown in Fig. 6 with a collision object added to the
robot’s workspace. This demonstrates the collision avoidance
Fig. 6. The collision object (the “gripper holder”) is shown in purple in
the robot’s workspace (left). Isometric view of the robot’s workspace (center).
Top–down view of the workspace with the location of collision object shown
in the dashed red circle (bottom). Note that the lines in 2-D are projected
from above in the top–down view.
capability of the software and shows the workspace boundaries
of the manipulator with a Robotiq two-finger gripper attached.
From Fig. 6, it is evident that the range of motion, measured in the y-axis direction from the z-axis of rotation to the farthest point of the EEF, is roughly 0.72 m. It should
be noted that this range is only theoretical. To acquire the
SIA5’s workspace, MATLAB’s Robotics System Toolbox [43]
was utilized to record the EEF path at 10 Hz as the robot
moved to random locations. The EEF position was recorded
by obtaining the transformation between the robot’s base and
the robot’s palm. A similar method can be used to obtain
the orientation of the EEF at each location, if desired. Joint
interpolated motions were used to complete each move.
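The workspace-sampling procedure above can be illustrated without ROS or MATLAB using a toy forward-kinematics model. The two-link planar geometry and link lengths below are stand-ins for the SIA5, chosen only so that the maximum reach matches the roughly 0.72-m figure; the real study recorded the base-to-palm transform of the full seven-axis model.

```python
import math
import random

# Toy 2-link planar arm standing in for the SIA5; link lengths are
# illustrative, not the real robot's kinematic parameters.
L1, L2 = 0.36, 0.36

def fk(q1, q2):
    """Forward kinematics: EEF (x, y) for joint angles q1, q2 (rad)."""
    x = L1 * math.cos(q1) + L2 * math.cos(q1 + q2)
    y = L1 * math.sin(q1) + L2 * math.sin(q1 + q2)
    return x, y

# Sample random joint configurations and record the EEF position at
# each one, analogous to logging the base-to-palm transform at 10 Hz.
random.seed(0)
points = [fk(random.uniform(-math.pi, math.pi),
             random.uniform(-math.pi, math.pi)) for _ in range(10_000)]
reach = max(math.hypot(x, y) for x, y in points)
print(f"sampled {len(points)} EEF points, max reach ~{reach:.2f} m")
```

The recorded point cloud traces out the reachable workspace boundary, just as in Fig. 6.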
It is apparent that the robot automatically avoids the collision object, as shown by the lack of motion inside the dashed red circle in Fig. 6. A total of 400 000 moves were tested with no failures identified, corresponding to a failure rate below 1 in 400 000 (less than 2.5 failures per million moves) for the point at the EEF.
For actual experimental testing, the move can be simulated
first and then stored for a particular motion plan needed for
imaging applications.
G. Achieving Small Angular Motions for Tasks Requiring
High Precision
CT and helical scans [38] can require more than
4000 images, depending on the part geometry and dimensions,
taken at sequential angular steps. This requires the robot to
execute joint angle movements of less than 0.1°. In order to
do this, the software computes a very short trajectory for each
step with four points along each trajectory; the first and last are the starting and ending points, respectively, and the middle two are determined by linear interpolation. Without performing
this trajectory, the minimum increment a robot can move via a
simple joint motion command (i.e., arm->move()) is 0.287°,
where arm is the name of the MoveGroup class that will be
controlled and planned for, and move() is a function within
MoveIt that moves the robot to a specified joint position [42].1
A 65.2% decrease (from 0.287° to 0.1°) in the minimum
joint angle step value was achieved by using this trajectory
algorithm. The same mini trajectory used to achieve small
joint motions can be used to give micrometer level commands
to the robot for Cartesian motions in ROS.
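The four-point mini-trajectory described above can be sketched as follows; the function name is ours, and a single scalar joint value in degrees is used for simplicity.

```python
def mini_trajectory(q_start, q_end, n_points=4):
    """Short trajectory used to command joint steps below the
    controller's minimum single-move increment: the first and last
    points are the start and end, and the interior points are
    determined by linear interpolation."""
    return [q_start + (q_end - q_start) * i / (n_points - 1)
            for i in range(n_points)]

# A 0.1-degree joint step expressed as a four-point trajectory.
print(mini_trajectory(10.0, 10.1))
```

Executing the step as a short trajectory rather than a single move() call is what lets the controller realize increments smaller than 0.287°.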
H. Part Alignment
Alignment of the item center or another point of reference can be
one of the most challenging and time consuming aspects of
NDT imaging. This can be especially true for high-resolution
imaging, where digital panels and magnification can lower
the required alignment tolerances to hundreds of micrometers.
Keeping track of and maintaining the positions of the robot, gripper, and part to these tolerances is a challenging requirement.
To determine and maintain the mechanical relationships among these components, offset frames from the robot's EEF to the gripper and to the desired part pose must be calculated.
To accomplish this, the robot kinematics must be calibrated
so that the robot knows where it is and where the held
part is relative to the measurement coordinate system. This
entails teaching the robot where the part is in relationship
to its native kinematic model, which allows the robot to
be commanded in the part’s coordinate system. To do this,
transformation matrices are derived, which allow the offset
frames to be directly interpreted as a set of movements in
the robot’s native coordinate system. The part frame is the
robot’s EEF frame plus a translational and rotational offset.
The position of the part frame, $P_F(x, y, z)$, relative to the tool-center-point frame (located at the robot's wrist faceplate), $\mathrm{TCP}_F(a, b, c)$, is mathematically determined by the 4 × 4
homogeneous transformation matrix represented by
$$
{}^{\mathrm{Part}}_{\mathrm{TCP}}R =
\begin{bmatrix}
a_x & b_x & c_x & p_x \\
a_y & b_y & c_y & p_y \\
a_z & b_z & c_z & p_z \\
0 & 0 & 0 & 1
\end{bmatrix}
\tag{1}
$$
where the coordinates of vector $p = (p_x, p_y, p_z)$ represent the location of $P_F$ and the coordinates of three unit directional vectors $a$, $b$, and $c$ represent the orientation of $\mathrm{TCP}_F$. The inverse of the transformation matrix in (1) represents the position of frame $\mathrm{TCP}_F$ relative to frame $P_F$. The orientation coordinates of frame $\mathrm{TCP}_F$ in (1) can be determined by
$$
\begin{bmatrix}
a_x & b_x & c_x \\
a_y & b_y & c_y \\
a_z & b_z & c_z
\end{bmatrix}
= \mathrm{Rot}(z, \theta_z)\,\mathrm{Rot}(y, \theta_y)\,\mathrm{Rot}(x, \theta_x)
\tag{2}
$$
where $\mathrm{Rot}(x, \theta_x)$, $\mathrm{Rot}(y, \theta_y)$, and $\mathrm{Rot}(z, \theta_z)$ are pure rotations of frame $\mathrm{TCP}_F$ about the x-, y-, and z-axes of frame $P_F$ through the angles $\theta_x$ (yaw), $\theta_y$ (pitch), and $\theta_z$ (roll), respectively.
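Equations (1) and (2) can be sketched in plain Python; the function names are ours, and the convention follows the text (θx yaw, θy pitch, θz roll, composed as Rot(z)Rot(y)Rot(x)).

```python
import math

def rot_x(t):
    c, s = math.cos(t), math.sin(t)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(t):
    c, s = math.cos(t), math.sin(t)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(t):
    c, s = math.cos(t), math.sin(t)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def part_from_tcp(yaw, pitch, roll, p):
    """4x4 homogeneous transform of eq. (1): orientation composed as
    Rot(z, roll) Rot(y, pitch) Rot(x, yaw) per eq. (2), translation
    p = (px, py, pz)."""
    R = matmul(rot_z(roll), matmul(rot_y(pitch), rot_x(yaw)))
    return [R[i] + [p[i]] for i in range(3)] + [[0, 0, 0, 1]]

def invert(T):
    """Inverse of a homogeneous transform (the TCP-relative-to-part
    frame mentioned above): transpose the rotation, then rotate and
    negate the translation."""
    R = [[T[j][i] for j in range(3)] for i in range(3)]  # R transposed
    p = [-sum(R[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return [R[i] + [p[i]] for i in range(3)] + [[0, 0, 0, 1]]
```

Composing a transform with its inverse returns the identity, which is a quick self-check of the construction.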
1 This minimum joint value increment was determined experimentally using
an SIA5 manipulator, FS100 controller, ROS Hydro, and Ubuntu 12.04.
More work needs to be done to determine if this is an ROS, FS100,
SIA5, or MoveIt [42] limitation and why.
Fig. 7. X-ray system setup with integration of robot, ROS, gripper, controller,
and image acquisition device shown. For the neutron system, the flat panel is
replaced with the scintillator-mirror-camera system described in Section II-E.
After determining the EEF-to-part offset frame, the relationship between the robot and different coordinate systems can
be determined such that the part remains in the beam path
along the z-axis (Fig. 3). This alignment allows the robot to
rotate the part while maintaining the part’s alignment in the
beam and field-of-view of the detector. Knowledge of the part
volume and volume sweep is important and can be accounted
for by modeling the part in ROS a priori.
I. Coordination Between Robot, Imaging Detector, and
XGD or Neutron Beam
In order to accomplish motion planning, robot trajectory
execution, and integration with other hardware, it is necessary
to have a suitable framework. The developed software must
simplify robot integration with other NDT hardware components (the imaging detector and XGD or neutron beam). The
software developed in this paper also allows for the integration
of image acquisition devices to make the motion and image
acquisition autonomous.
Whereas the hardware system is composed of off-the-shelf
components, the software system is custom-built to provide the
flexibility required to automate integrated tasks. Separately, the
tasks themselves are not difficult, but integration into a single
system is nontrivial. Fig. 7 shows the major software and
hardware system components and interactions. ROS includes
the robot software and application code (e.g., pick, place, and
so on). The software (hardware drivers, algorithms, etc.) is
organized as a set of nodes that communicate via a standard
messaging protocol. This custom LANL automated radiography system has been developed to simplify application-level
programming.
The system architecture was developed to integrate radiographic
imaging with a robotic manipulator so that images are automatically
acquired at each step in the scan through communication between the
robot's motion control and the image acquisition software. A modular
architecture facilitates the integration of different imaging devices
and uses a Python script to communicate between the robot and the
flat-panel detector to close
the loop between the robotic motion and imaging acquisition
device. ROS allows the robot and imaging devices to communicate
in multiple languages (C++, Python, and MATLAB) on the same PC.
The code that runs the robot is written in C++, and the flat-panel
acquisition software is written in Python. ROS provides tools for
communicating effectively between the multiple processes, both
simultaneously and sequentially.
A publisher ("talker") node broadcasts a message to the flat-panel
detector node when the system is ready to acquire a radiograph.
A master node tells the flat-panel detector node, which subscribes
to an acquire_image topic, when to acquire an image. The same topic
tells the master node whether to continue with the robot application
(if the image acquisition has finished) or to wait (if the image
acquisition device has not yet completed the exposure or finished
saving the radiograph).
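The handshake above can be mimicked in a few lines of pure Python. The sketch below stands in for the ROS topics with a simple callback registry; the class and method names are illustrative, not the actual LANL code, and a real deployment would use ROS publishers and subscribers as described.

```python
# Minimal sketch of the robot/detector handshake over an acquire_image
# topic. Names are illustrative; real code would use ROS nodes.

class Topic:
    """Tiny stand-in for a ROS topic: publish() invokes every subscriber."""
    def __init__(self):
        self._subscribers = []
    def subscribe(self, callback):
        self._subscribers.append(callback)
    def publish(self, message):
        for callback in self._subscribers:
            callback(message)

class FlatPanelDetector:
    """Detector node: acquires an image when told to, then reports done."""
    def __init__(self, acquire_topic, done_topic):
        self.images = []
        self.done_topic = done_topic
        acquire_topic.subscribe(self.on_acquire)
    def on_acquire(self, pose_name):
        # Stand-in for the exposure and file-saving steps
        self.images.append(f"radiograph_at_{pose_name}")
        self.done_topic.publish(pose_name)  # unblock the robot

class RobotMaster:
    """Master node: moves, requests an image, and waits for completion."""
    def __init__(self, acquire_topic, done_topic):
        self.acquire_topic = acquire_topic
        self.completed = []
        done_topic.subscribe(self.completed.append)
    def run_scan(self, poses):
        for pose in poses:
            # (robot motion to `pose` would happen here)
            self.acquire_topic.publish(pose)  # "ready to acquire"
            assert pose in self.completed     # proceed only once imaging is done

acquire_image, image_done = Topic(), Topic()
detector = FlatPanelDetector(acquire_image, image_done)
master = RobotMaster(acquire_image, image_done)
master.run_scan(["pose_0", "pose_1", "pose_2"])
print(detector.images)  # one radiograph per scan step
```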
J. Measuring Radiation Damage
Radiation exposure can damage the robot and other components.
Energy deposited by radiation can produce mechanical and electrical
changes, such as weakening of the material structure through
vacancies and irregularities, and reduced semiconductor
effectiveness through the formation of ion pairs from the deposited
energy. Monte Carlo tools such as MCNP [27] enable the high-fidelity
calculations necessary to determine the neutron damage rate.
Sufficiently high neutron or photon levels will eventually affect
the reliability of electronic components. Thus, radiation tolerance
is critical to the reliability of the imaging process.
The sensitive components installed on advanced manipulators can be
divided into three categories: 1) the drives (usually electrical
actuators with bearings, gearboxes, and position feedback devices);
2) the sensors (distance and force sensors, and cameras); and 3) the
cables and other communication devices (including line drivers,
multiplexing circuits, analog-to-digital converters, radio links,
and even the preamplifiers needed for some sensors). For each
category, the required radiation hardening level depends on the
component's location with respect to the radiation sources (near the
EEF or near gantry tracks or walls) and on its frequency of use
(e.g., a tool used a small number of times versus protection systems
in permanent use) [28].
Papers detailing this work and the results can be found in
[29] and [30]. In summary, displacements per atom (DPA) rates in the
SIA5 robot were calculated using MCNP. For this work, the neutron
spectrum was simulated based on the same beam port in which the
robotic system was deployed at UT Austin's TRIGA Mark II research
reactor. This type of DPA rate calculation can be applied to
determine the radiation damage to robots and other objects in other
radioactive environments and applications. For comparison, the DPA
rates determined in this work were similar to those found in thermal
reactor materials [31], which is expected since the TRIGA is a
thermal reactor.
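At its core, a DPA rate estimate folds the neutron flux spectrum with an energy-dependent displacement cross section. The sketch below illustrates the arithmetic with invented two-group values; these numbers are placeholders, not the MCNP-derived results of [29], which use many energy groups and tallied spectra.

```python
# Illustrative DPA-rate estimate: fold a (made-up) two-group neutron
# flux with a (made-up) displacement cross section. Real calculations
# use MCNP tallies with many energy groups, as in the referenced work.

BARN_TO_CM2 = 1e-24  # 1 barn = 1e-24 cm^2

# Hypothetical group-wise flux [n/cm^2/s] and displacement cross
# section [barn]: (thermal group, fast group)
flux = [1.0e12, 5.0e10]
sigma_d = [10.0, 500.0]

# DPA rate [dpa/s] = sum over groups of flux * sigma_d
dpa_per_second = sum(phi * sig * BARN_TO_CM2
                     for phi, sig in zip(flux, sigma_d))

seconds_per_year = 3600 * 24 * 365
dpa_per_year = dpa_per_second * seconds_per_year
print(f"{dpa_per_year:.2e} dpa/year")
```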
III. RESULTS
The previous section (methods and general design concept) provided detail on the methods used to determine the
reliability, safety, optimization, communication, and execution
of a robotic system for NDT imaging purposes. This section
delves into the details of the system implementation and uses
the methods presented in the previous section to perform
several experiments that test and validate the advantages of
using flexible automation for nondestructive imaging purposes.
The implementation of the system is described, showing how
the methods presented are used in conjunction with other hardware and software tools to provide a functional autonomous
radiography and/or CT system. Two main application areas
are discussed that show the robot performing neutron imaging
at a TRIGA Mark II research reactor at UT Austin and
X-ray imaging at a high-energy X-ray source at LANL. These
application areas demonstrate how the work presented in this
document supports the feasibility and necessity of a robotic
system for nondestructive imaging by exploiting the flexibility
of robotics to gain efficiency and adhere to ALARA principles.
All of the grasp, motion planning, and image acquisition
communication in the following applications is performed in ROS,
demonstrating how ROS simplifies the integration of complex robotic
systems under a common operating framework and allows recent
research advances to be incorporated into deployed systems.
A. Metrology
In-depth repeatability and vibration tests have been performed on
the robotic system used in this paper [30], [32] and demonstrate
that the robot's repeatability and vibration mitigation are
sufficient for the required imaging tasks. For the mechanical
parameters assessed, the tests showed a repeatability of 17 µm with
a slight dependence on speed. Faster robot movement speeds (100%
instead of 25% of v_max, which is approximately 2500 mm/s for
linear moves [19]) result in slightly worse (less than 25 µm)
repeatability. High repeatability allows the robot to position the
part in the beam in the same desired pose each time. Yaskawa [19]
states a repeatability value of 60 µm. However, capabilities
asserted by robot manufacturers tend to be conservative, since
Yaskawa's declared repeatability is measured at maximum payload and
maximum speed [33].
For actual hardware implementation, the robot’s velocity is
limited to 20% of its maximum velocity and acceleration
limit. This limitation is defined by the implementation in
this paper. Two types of vibration are explored in
Hashem et al. [30], [32]: static and tracking. Static vibration is
the amplitude of the vibration's impact on the EEF position while
the robot is stationary; tracking vibration is the same amplitude
while the robot is moving. Static vibration was found to be within
the 6-µm requirement. Tracking, the ability to follow the exact
same EEF path, is also explored.
The repeatability of the robot can also be determined
using actual radiograph images. This test was performed
using a 225-keV microfocus X-ray source with a Varian amorphous
silicon flat-panel detector (PaxScan 2520, Varian Imaging Products,
Palo Alto, CA). The Varian panel's pixel pitch is 127 µm at 1:1
magnification. In ordinary X-ray radiography, the resolution
of the captured images is 25 µm for film and 100 µm–
1 mm for digital. The microfocus allows imaging down to
the 2–3 µm level. Therefore, it is important for the robot’s
repeatability and resolution to be on this level as well. The
distance between the source and object should be minimized
to increase magnification so that micrometer level motions can
be resolved. However, decreasing the source to object distance
increases focal spot blur. Lower energies can help reduce
image blurring while longer exposure times improve statistics.
Blurring in the captured images should be minimized so
one can distinguish the micrometer level differences between
images.
Ten radiographs were taken of a spherical ball bearing
(BB) attached to a plastic screw held by the robot. The robot
completed a repeatable motion (performing both joint and
Cartesian motions), with each image taken at the completion
of the repeated move. Images were compared in terms of
pixel intensity and location relative to other images. The
differences in pixel intensity relate to the differences in the
location of the BB. It was found that there was approximately
a 1.3-pixel difference in BB location along the x- and y-axes
between the images. For this test, the source-to-object distance
was 171.45 mm and the source-to-detector distance was 1701 mm,
so the magnification (zoom factor) was 9.92.
Therefore, the effective system pixel pitch was 12.8 µm (i.e.,
127 µm pixel pitch divided by 9.92). The corresponding
repeatability value would then be ±17 µm (i.e., 12.8 µm times
1.3 pixel value difference). This compares well to the ±17.9
µm repeatability value obtained using the dial indicator with
the robot moving at 25% maximum speed. It is important
to use the setGoalTolerance() function in ROS (MoveIt) to set
the error tolerance to a sufficiently low value. For joint-state
goals, this is the tolerance for each joint in the configuration
space (radians). For pose goals, it is the radius of a sphere (m)
that the EEF must reach.
Resolution for a robotic system is the minimal commanded step
for a joint; the resolution value is the smallest incremental move
that the robot can physically produce. To test the resolution of
the robot's EEF, the robot was commanded to move the minimal step
in Cartesian space by computing a minimal trajectory in ROS. The
same microfocus system described earlier was used to acquire the
images. An example test is shown in Fig. 8.
For the resolution test conducted, the distance from the source
to the object was 12.7 cm and the distance from the source
to the detector was 167.6 cm. Therefore, the magnification
was 13.2 and the effective system pixel pitch is 9.62 µm (i.e.,
127 µm divided by 13.2). The differences between the images
are shown in Fig. 9. The x-direction is facing up in the images.
There was approximately a 3.5 pixel difference between the
two images. This value needs to be divided in half to get the
actual distance traveled because the differences in both images
are highlighted. Therefore, the actual distance traveled was
1.75 pixel, which relates to a 16.8 µm resolution (i.e., 9.62 µm
times 1.75 pixels). The joint resolution of the robot will be
lower than this value since the resolution possible for the EEF
is a function of the joint resolution and the configuration of the
robot.
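The pixel-based metrology in both tests follows the same arithmetic: divide the detector pixel pitch by the geometric magnification to get the effective object-plane pixel size, then scale the measured pixel shift. A sketch using the numbers reported above (the helper name is ours; distances are in mm):

```python
def effective_pixel_pitch(detector_pitch_um, source_to_object,
                          source_to_detector):
    """Object-plane pixel size: detector pitch divided by the
    magnification (distances in consistent units, mm here)."""
    magnification = source_to_detector / source_to_object
    return detector_pitch_um / magnification

# Repeatability test: 1.3-pixel shift at magnification 1701/171.45
pitch_rep = effective_pixel_pitch(127.0, 171.45, 1701.0)  # ~12.8 um/px
repeatability_um = pitch_rep * 1.3                        # ~17 um

# Resolution test: 3.5-pixel image difference; both positions are
# highlighted, so the actual travel is half that (1.75 pixels)
pitch_res = effective_pixel_pitch(127.0, 127.0, 1676.0)   # ~9.62 um/px
resolution_um = pitch_res * (3.5 / 2.0)                   # ~16.8 um

print(round(repeatability_um, 1), round(resolution_um, 1))
```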
Fig. 8. Resolution test. The robot was commanded to move a minimal
amount in Cartesian space along the positive X-direction while
holding an object with two BBs attached. Initial position (left)
and final position (right). A scale bar is shown in the top-left
corner.
Fig. 9. Differences between BB locations on two resolution test images. The
dashed rectangle (not to scale) (left) shows the zoomed-in section with the
difference in pixels measured (right). Scale bars are shown for both images.
B. Neutron Imaging Measurements
To illustrate the advantages of using a robotic manipulator
with neutron imaging, mock-up depleted uranium fuel rods,
each consisting of five pellets prepared from urania (UO2 )
powder, were characterized by thermal neutron radiography.
To simulate cracks and voids resulting from irradiation and
burn-up in a fuel pin, tungsten and gadolinium inclusions were
embedded in the mock-up pellets. These rodlets contained
defects similar to those seen in irradiated fuel rodlets. They
can be used to establish sensitivity for density, visualization
of voids/cracks, and inclusions of different materials. The
SIA5 manipulator handled the fuel rods and provided advanced
and flexible motion capabilities that would be difficult to
achieve with linear and rotary motion stages. The goal is
to characterize irradiated fuel pellets, as well as to offer
better guidance for more expensive destructive examination.
By imaging fuel rods, one can observe the development of
irradiation and burn-up damage in nuclear fuel over time. Since
the technique is nondestructive, the time evolution of fuel rod
damage can readily be measured this way. The ability to predict how
the composition and structural integrity of fuel pellets evolve
during their residence in a reactor can improve the performance of
nuclear modeling codes such as MARMOT [34]. This will accelerate
the understanding of processes occurring during irradiation and
ultimately improve nuclear fuel.
Fig. 10. Neutron transmission images of three mock-up urania fuel rodlets
with engineered flaws, including gadolinium and tungsten inclusions. The
black ovals indicate gadolinium inclusions, and the red oval indicates a
tungsten inclusion. A steel container encloses each of the fuel rods, and a
spring is seen in the top of each assembly. A scale bar is shown in the
top-left corner.
Neutron imaging consisted of a radiograph of three fuel rods
with the primary focus on the fuel pellet region. Exposure
times ranged between five and ten minutes for a single image.
Multiple experiments were conducted including the following.
1) Rotating each fuel rod to various orientations.
2) A vibration analysis (i.e., comparison of the resulting
radiograph with the robot holding and not holding a fuel
rod).
3) A repeatability test. This test consisted of taking an
image with the robot holding the fuel rod in a specified
location in the beam, moving the robot away from the
beam, bringing the robot back to the same specified
location, taking another image, and then comparing
the two images using MATLAB’s Image Processing
Toolbox [35]. From these comparisons, it was shown
that the repeatability in the fuel rod’s final location was
∼0.025 mm. More information on this test can be found
in Hashem [30].
4) A CT scan of one of the fuel rods. Neutron CT is a
process by which the 3-D neutron attenuation values
throughout the object are obtained. This process requires
taking 2-D neutron radiographs of the object while it is
rotated around 360°, with images taken at certain points
within the scan. Recon [36], a CPU-based reconstruction
algorithm that uses a standard Feldkamp filtered back
projection method, was used to reconstruct the 3-D map.
5) A helical scan, which is explained in Section III-C.
6) Radiographs of the fuel rods at various orientations.
Example results of the neutron radiographs of the three fuel
rods are shown in Fig. 10. Darker regions represent materials or
areas with higher neutron attenuation coefficients than lighter
regions. Gadolinium and tungsten inclusions appear darker than the
depleted uranium (d-UO2) powder. The gray regions indicate uniform
d-UO2. Thermal neutron tomography identifies flaws in the composite
pellets, whereas areal density fits
Fig. 11. Subset of a helical scan of a Maglite completed using the
robotic system coupled with the image acquisition system. The
robot's gripper is visible in the first three radiographs; however,
it does not interfere with the resulting radiographs since only the
bulb (top) portion of the part is of interest.
to the 238U demonstrate density uniformity. The gaps between
rodlets are visible in the radiographic images, and voids or chips
on the outside of the pellets appear black. These
radiographs clearly show the capability of the robot to perform
neutron radiography tasks. The integrated system has successfully demonstrated imaging of the mock-up uranium fuel rods
with the necessary precision and repeatability. These demonstrations serve as a proof-of-concept that flexible automation
and robotic technologies developed in research laboratories
can be valuable for advanced nondestructive imaging abilities
and applications.
C. X-Ray Imaging Measurements
A seven-axis robot arm can be configured for any number of imaging
scan types, including CT, helical scans, and in-motion
radiography [13]. Thus, the robotic system is an all-in-one system
for performing different X-ray imaging scan types.
An example of this is shown in Fig. 11, where X-ray images
were taken of a helical scan of a Maglite [37] performed
with the robotic system. In order to perform the helical scan,
the robot simultaneously translated the part vertically and
rotated the part at each step in the scan. Helical scans are
implemented to decrease scan time when the imaged object
is longer than the detector's field-of-view. A helical scan is also
useful with a point source, where the beam intensity is weaker near
the top and bottom of the image. The capabilities
and application process for helical scans are well documented
by Silverman et al. [38]. The six images in Fig. 11 (left to right)
show the progression of a helical scan of the Maglite. These images
are a subset of the entire helical scan from beginning to end
(rotated from 0° to 360° and translated 7 cm in total). Each
successive radiograph shows the part rotated 51° and translated
1.17 cm. The two-fingered Robotiq gripper is visible in the first
five images but does not obscure the critical features of the
Maglite.
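Generating the waypoints for such a helical scan is straightforward: each step couples a fixed rotation increment with a proportional vertical translation, giving a uniform pitch of 7 cm over 360°. The sketch below is illustrative (the function name is ours, and the uniform-pitch assumption is a simplification of the actual scan):

```python
def helical_waypoints(total_angle_deg, total_travel_cm, step_angle_deg):
    """Yield (angle, z) pairs coupling rotation with vertical
    translation so the part traces a helix relative to the beam."""
    pitch = total_travel_cm / total_angle_deg  # cm of travel per degree
    angle = 0.0
    while angle <= total_angle_deg:
        yield angle, angle * pitch
        angle += step_angle_deg

# Scan parameters from the Maglite example: 0-360 degrees, 7 cm of
# total travel; a 51-degree step matches the rotation increment
# between successive radiographs in Fig. 11.
waypoints = list(helical_waypoints(360.0, 7.0, 51.0))
for angle, z in waypoints[:3]:
    print(f"rotate to {angle:.0f} deg, translate to {z:.2f} cm")
```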
Fig. 12. Reconstructed results of a CT scan of a Maglite obtained
using the robot and a rotary stage to rotate the part. A
cross-sectional slice from each method is shown below the respective
radiograph. A scale bar is shown in the top-left corner.
Fig. 13. Heat source radiographed in the normal configuration (left)
and in the “pole shot” configuration (right).
A CT scan of the Maglite was also performed using the
robotic system as well as a standard rotary stage [39] for
comparison. Fig. 12 shows the reconstructed CT images and
cross-sectional slices of the part that were acquired using
both systems. The scan performed using the robotic system
consisted of a 360° scan with 600 projections. A similar scan
was acquired using the rotary stage to rotate the part; however,
1000 total projections were acquired. The same number of
projections could have been acquired using the robotic system,
but this was unnecessary to show that the results are comparable.
A mock heat source, approximately 1 in in diameter, was
also radiographed using the 225 keV microfocus X-ray source
to demonstrate the system’s high precision capability. There
are many instances in radiography where specific views of
a part are needed. One common view is a pole-shot of an
object, where the beam goes through the part from the top
to bottom. For example, a pole-shot radiograph of the heat
source is seen in Fig. 13. This radiograph was acquired using
the robotic system as shown in Fig. 14. With this system, one
can acquire various views and angles of a part, and even place
the part back down and pick it up in a different orientation
if a specific view such as a pole-shot is required. Without
the robotic system, the operator would have to shut down the
radiation source and manually reposition the part. For some
specific shots, such as pole-shots, a fixture may even need to
be machined to hold a part in place so that it does not move
while imaged.
Fig. 14. Pole-shot demonstration. (a) Robot picking up the heat
source from the top and (b) placing it in the X-ray beam in a
regular orientation. (c) Robot then places the part back down,
(d) approaches new pickup position, and (e) picks it up from the
side. (f) Heat source can then be positioned in the beam to get a
pole-shot radiograph.
Fig. 15. Precision alignment capability of the robot when a part's
weld needs to be aligned with the detector. The radiograph on the
left shows the weld not aligned; the radiograph on the right shows
the weld aligned with the detector. The intensity transmission
plots are shown to the left of their respective radiographs.
This work focuses on asking industrial robots to perform flexible
tasks on a small scale with the precision required for nontrivial
NDT tasks in the DOE complex. A demonstration of this capability is
shown in Fig. 15, which presents the X-ray radiographs and intensity
transmission plots of the heat source. The weld in the heat source
must be perfectly aligned with the detector, a task
that, when performed manually, is a time-consuming and
frustrating exercise in trial-and-error. The intensity transmission histograms are used to align the part autonomously.
On the left of Fig. 15, the two X-ray intensity transmission
peaks marked are produced as X-rays penetrate the weld
area of the heat source: one for the front side of the weld
and one for the back. The peaks arise because the X-rays are
attenuated to a lesser extent where the heat source is welded
together; the weld appears in the radiograph as a material of
slightly different density. To properly align the heat source
so that the imaging system sees only a single peak of X-ray
intensity transmission, the robotic system must align the part
as shown on the right of Fig. 15. To accomplish this task,
a feedback loop is formed between the robotic system and the
imaging system, which continuously takes digital radiographs in
real time. The robotic system adjusts the part until a single
X-ray intensity transmission peak is observed in the region of
interest. When the weld is perfectly aligned, the two peaks
converge into a single peak. A real-time positioning system with
digital feedback allows this alignment to be achieved quickly
without manual repositioning of the part.
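The alignment logic amounts to a simple closed loop: acquire a transmission profile, count the peaks in the weld region, and step the part's orientation until the two peaks merge. The sketch below simulates that loop with a toy two-Gaussian profile model; all names, step sizes, and the peak-separation model are illustrative, not the LANL implementation.

```python
from math import exp

def count_peaks(profile):
    """Count local maxima in a 1-D intensity transmission profile."""
    return sum(
        1 for i in range(1, len(profile) - 1)
        if profile[i - 1] < profile[i] > profile[i + 1]
    )

def simulated_profile(misalignment_deg, n=200):
    """Toy model: two Gaussian weld peaks whose separation grows
    with misalignment (stand-in for an actual radiograph profile)."""
    sep = 30.0 * abs(misalignment_deg)         # peak separation, samples
    c1, c2 = n / 2 - sep / 2, n / 2 + sep / 2  # front/back weld peaks
    return [exp(-((i - c1) / 8) ** 2) + exp(-((i - c2) / 8) ** 2)
            for i in range(n)]

def align_weld(start_deg, step_deg=0.25, max_iters=100):
    """Feedback loop: step toward zero misalignment until the
    profile shows a single peak."""
    angle = start_deg
    for _ in range(max_iters):
        profile = simulated_profile(angle)     # acquire "radiograph"
        if count_peaks(profile) == 1:
            return angle                       # weld aligned with detector
        angle -= step_deg if angle > 0 else -step_deg
    raise RuntimeError("alignment did not converge")

final_angle = align_weld(start_deg=2.0)
print(f"aligned at {final_angle:.2f} deg")
```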
IV. CONCLUSION
We have demonstrated the utility of using a robot for
positioning of radiography samples to reduce worker exposure
and achieve real-time, arbitrary six-axis sample positioning.
The performance gains achieved by the robot enable advanced
radiography techniques such as helical scanning, CT, and
alignment of welded components. These were demonstrated using an
industrial robot deployed in both a nuclear reactor beam port and a
high-energy X-ray bay. This effort is a collaboration
between AET-6 at LANL and UT Austin. The use of robotics
and automation reduces personnel dose, improves precision
alignment, and improves throughput. Automation has the additional
potential benefit of obviating the need for human personnel to move
or exchange the parts to be imaged and allowing flexible orientation
of the imaged object with respect to the X-ray or neutron beam. We
have
presented a new approach for X-ray imaging using robotics
that opens the door for new possibilities for NDT imaging.
The ability to have six DoF control while keeping alignment
repeatability tolerances to less than 20 µm means that configurable imaging scans are not only definable but maintainable.
The configurability allows for an automated imaging system
that has the same functionality as multiple separate imaging
systems and one that can accommodate a large variation of part
form factors. Furthermore, the linear and rotation positioning
capabilities needed for magnification changes and part tilt can
also be achieved with this approach. The use of commercially
available robotics allows for scaling of this system relatively
easily as there is consistency in robot controllers for a large
variety of robot models that support ROS [25]. We have shown
that the stable positioning for X-ray imaging can be performed
to at least the required 25-µm absolute repeatability level
defined by LANL engineers. This positional error may be
further reduced by use of a laser tracker that monitors and
records positions for fine robot position correction as shown
by Gordon et al. [41].
The system can run autonomously as well as in teleoperated
mode, and it gives technicians the ability to position samples
in real time, in situ, and arbitrarily in six axes without entry
into potentially high radiation areas. The ability to arbitrarily
position a sample in space ensures that the radiographer can
obtain exactly the image they need without the need to design
and fabricate new fixtures or tools. The estimated time saved
over manual operations is at least 3 min/view. For example,
if 1000 parts are imaged, more than 50 h of labor is eliminated.
This estimate accounts for the time required for reentry into
radiation areas and the inevitable small motion adjustments. The
robotic system also extends operation to multiple shifts.
Future work will focus on improving the feedback loop
between the robot and CT processing algorithms, which allows
for additional imaging techniques, applications, improved radiograph quality, and reduced throughput time. For example,
this system can achieve evenly spaced views around a sphere
autonomously. Also, after an initial radiograph is acquired,
the CT processing algorithm (or operator) could request
additional views to reduce error and achieve more optimal
radiographs. Sample-holder trays to hold and organize parts will be
printed via additive manufacturing, and fixture-free grasping
systems under development in a parallel effort can further reduce
integration effort. Gripper interlocks and grasp validation
techniques can prevent the release of a part except in designated
process locations for additional safety, if needed.
REFERENCES
[1] L. Cartz, “Nondestructive testing,” ASM International, Materials Park,
OH, USA, Tech. Rep., 1995.
[2] G. S. Sundar et al., “Design and developments of inspection robots in
nuclear environment: A review,” Int. J. Mech. Eng. Robot. Res., vol. 1,
no. 3, pp. 400–409, 2012.
[3] U.S. Dept. Energy, “Environmental restoration and waste management
robotics technology development program, robotics 5-year plan,” DOE,
Washington, DC, USA, Tech. Rep. DOE/CE-0007T, 1990, vol. 3.
[4] K. Nagatani et al., “Emergency response to the nuclear accident at the
Fukushima Daiichi Nuclear Power Plants using mobile rescue robots,”
J. Field Robot., vol. 30, no. 1, pp. 44–63, 2013.
[5] T. Sattar, “Robotic non-destructive testing,” Ind. Robot, Int. J., vol. 37,
no. 5, pp. 102–116, 2010.
[6] R. Bogue, “The role of robotics in non-destructive testing,” Ind. Robot,
Int. J., vol. 27, no. 5, pp. 421–426, 2011.
[7] J. Shang et al., “Design of a climbing robot for inspecting aircraft wings
and fuselage,” Ind. Robot, Int. J., vol. 34, no. 6, p. 495, 2007.
[8] G. Dobie, R. Summan, S. G. Pierce, W. Galbraith, and G. Hayward,
“A noncontact ultrasonic platform for structural inspection,” IEEE
Sensors J., vol. 11, no. 10, pp. 2458–2468, Oct. 2011.
[9] A. Ahrary et al., “A study of an autonomous mobile robot for a sewer
inspection system,” Artif. Life Robot., vol. 11, no. 1, pp. 23–27, 2007.
[10] P. Louviot, A. Tachattahte, and D. Garnier, “Robotised UT transmission
NDT of composite complex shaped parts,” in Proc. 4th Int. Symp. NDT
Aerospace, Berlin, Germany, 2012, pp. 1–8.
[11] Erhardt+Abt. (2016). HeiDetect FlexCT, Heitec. [Online]. Available:
http://www.erhardt-abt.de/en/heidetect-flexct.html
[12] D. Mery, Computer Vision for X-Ray Testing. Cham, Switzerland:
Springer, 2015.
[13] R. Prakash, Nondestructive Testing Techniques. Tunbridge Wells, U.K.:
New Age Science, 2009.
[14] A. W. Davis, T. N. Claytor, and M. J. Sheats, “High-speed data acquisition for three-dimensional X-ray and neutron computed tomography,”
in Proc. 44th Annu. SPIE Meeting, Denver, CO, USA, 1999, p. 246.
[15] D. R. Janecky et al., “WANTO LANL site report,” WANTO, Livermore,
CA, USA, Tech. Rep. LA-UR-11-006214, 2011.
[16] UniWest. (2011). Uni-Versal Robotic Eddy Current Test Machine.
[Online]. Available: http://www.uniwest.com/Uni-Versal-Robotic-EddyCurrent-Test-Machine-P14.aspx
[17] D. W. Bosserman. (Aug. 2007). The Future of NDT: Radiography Meets Robotics Quality Magazine. [Online]. Available:
http://www.qualitymag.com/articles/84931-the-future-of-ndtradiography-meets-robotics
[18] S. Niekum et al., “Learning grounded finite-state representations from
unstructured demonstrations,” Int. J. Robot. Res., vol. 34, no. 2,
pp. 131–157, 2015.
[19] Yaskawa. (2012). SIA5D. [Online]. Available: http://www.motoman.
com/datasheets/SIA5D.pdf
[20] Robotiq. (2012). Adaptive Robot Gripper 3-Finger. [Online]. Available:
http://robotiq.com/products/industrial-robot-hand/
[21] Robotiq. (2015). Adaptive Robot Gripper 2-Finger 85. [Online]. Available: http://robotiq.com/products/industrial-robot-gripper/
[22] J. J. Craig, Introduction to Robotics: Mechanics and Control, 3rd ed.
Englewood Cliffs, NJ, USA: Prentice-Hall, 2004.
[23] PaxScan 2520D/CL Amorphous Silicon Digital X-Ray Imager,
V. I. Products, Palo Alto, CA, USA, 2011.
[24] I. I. Gnezdilov et al., “Optimization of the neutron detector design
based on the 6LiF/ZnS(Ag) scintillation screens for the GAMMA400 space observatory,” in Proc. Conf. Fund. Res. Part. Phys., 2015,
pp. 199–205.
[25] ROS. (2015). Robots Using ROS. [Online]. Available: http://wiki.ros.
org/Robots
[26] M. Quigley et al., “ROS: An open-source robot operating system,” ICRA
Workshop Open Source Softw., vol. 3, nos. 2–3, pp. 1–5, 2009.
[27] D. B. Pelowitz, “MCNPX user’s manual Ver. 2.7.0,” Los Alamos Nat.
Lab., Los Alamos, NM, USA, Tech. Rep. LA-CP-11-00438, 2011.
[28] L. P. Houssay, “Robotics and radiation hardening in the nuclear industry,” Ph.D. dissertation, Dept. Nuclear Eng., Univ. Florida, Gainesville,
FL, USA, 2000.
[29] J. Hashem et al., “Theoretical neutron damage calculations in industrial
robotic manipulators used for non-destructive imaging applications,”
Prog. Nucl. Energy, vol. 94, pp. 71–79, Sep. 2017.
[30] J. Hashem, “Automating x-ray and neutron imaging applications with
flexible automation,” Ph.D. dissertation, Dept. Mech. Eng., Univ. Texas
Austin, Austin, TX, USA, 2015.
[31] H. Heinisch, L. R. Greenwood, W. J. Weber, and R. E. Williford,
“Displacement damage in silicon carbide irradiated in fission reactors,”
J. Nucl. Mater., vol. 327, nos. 2–3, pp. 175–181, May 2004.
[32] J. Hashem, J. Hunter, and M. Pryor, “Automating X-ray and neutron nondestructive testing applications,” in Proc. ANS Winter Meeting, vol. 109.
Washington, DC, USA, 2013, pp. 517–520.
[33] Manipulating Industrial Robots—Performance Criteria and Related Test
Methods, document ISO 9283, International Standards Organization,
1998.
[34] Idaho National Laboratory. (Nov. 23, 2015). MOOSE Simulation Environment, INL Fact Sheets. [Online]. Available: http://www4vip.inl.gov/
research/moose-applications/
[35] MATLAB, Image Processing Toolbox User’s Guide, MathWorks, Natick,
MA, USA, 2015.
[36] E. S. Jimenez, L. J. Orr, and K. R. Thompson, “High performance
graphics processor based computed tomography reconstruction algorithms for nuclear and other large scale applications,” Sandia Nat. Lab.,
Albuquerque, NM, USA, Sandia Rep. SAND2013-8059, 2013.
[37] Maglite. (2014). Maglite Solitaire LED 1-Cell AAA Flashlight, Maglite.
[Online]. Available: http://robotiq.com/products/industrial-robot-hand/
[38] P. M. Silverman, C. J. Cooper, D. I. Weltman, and R. K. Zeman, “Helical CT: Practical considerations and potential pitfalls,” Radiographics,
vol. 15, no. 1, pp. 25–36, 1995.
[39] Newport Corporation. (2015). Newport, Compact Rotation Stage.
[Online]. Available: http://search.newport.com/?x2=sku&q2=PR50CC
[40] Radiation Safety Manual for Use of Radioactive Materials, ALARA
Program, Washington Univ. St. Louis, St. Louis, MO, USA, 2014.
[41] J. A. Gordon et al., “Millimeter-wave near-field measurements using
coordinated robotics,” IEEE Trans. Antennas Propag., vol. 63, no. 12,
pp. 5351–5362, Dec. 2015.
[42] S. Chitta, I. Sucan, and S. Cousins, “MoveIt! [ROS topics],” IEEE Robot. Autom. Mag., vol. 19, no. 1, pp. 18–19, Mar. 2012.
[43] MATLAB, Robotics System Toolbox, MathWorks, Natick, MA, USA,
2016.
Joseph A. Hashem received the Ph.D. degree in
mechanical engineering from the University of Texas
at Austin, Austin, TX, USA, in 2015, where he
focused on robotics and nuclear engineering.
From 2010 to 2015, he was a Research Assistant with the Nuclear Robotics Group, The University of Texas at Austin, where he designed dynamically deployed robotic systems for use in confined, hazardous environments and also taught and coordinated the Neutron Shielding Laboratory for undergraduate and graduate students in a health physics laboratory course. He is currently an R&D Engineer with the Nondestructive Testing and Evaluation Group, Los Alamos National Laboratory, Los Alamos, NM, USA.
Dr. Hashem received the Sally Blum Memorial Prize for Excellence in Design in Mechanical Engineering and was a Hamilton Undergraduate Research Scholar at Southern Methodist University. He was a recipient of the Best Paper Award at the Third International Topical Meeting on Robotics and Remote Systems.
Mitch Pryor is a Research Scientist with The University of Texas at Austin, Austin, TX, USA, where he co-founded the Nuclear Robotics Group (NRG). The NRG develops assistive and automated technologies for use in hazardous environments, including decontamination and decommissioning (D&D) sites, nuclear material storage facilities, and glovebox manufacturing.
Sheldon Landsberger is a Professor in the Nuclear and Radiation Engineering technical area of the Mechanical Engineering Department, The University of Texas at Austin, Austin, TX, USA. He has served on the faculty of the Cockrell School of Engineering since 1997. He has published more than 220 peer-reviewed papers and more than 160 conference proceedings, mainly in nuclear analytical measurements and their applications in nuclear forensics, natural radioactivity, environmental monitoring of trace and heavy metals, and the use of radiation detectors on robots and other mobile systems.
James Hunter is a Research and Development Scientist at Los Alamos National Laboratory, Los Alamos, NM, USA, where he is a member of the Non-Destructive Testing Group and is the team leader for the special engineering team. Since starting as a student at LANL 17 years ago, he has written much of LANL’s tomography reconstruction code and has worked with a range of X-ray systems, from small microfocus cabinets to a 6–20-MeV microtron-based system. His current work involves both production and R&D imaging on a wide range of components with lab-based X-ray systems as well as synchrotron beamlines and neutron imaging facilities.
David R. Janecky received the A.B. degree in geology from the University of California at Berkeley,
Berkeley, CA, USA, in 1975, and the Ph.D. degree
in geoscience from the University of Minnesota,
Minneapolis, MN, USA, in 1982.
Since 1984, he has been a Scientist and Manager at Los Alamos National Laboratory, with research interests in dynamic geoscience, industrial, and engineered systems, including nondestructive testing and evaluation, environmental remediation, global security, and support for the nuclear weapons stockpile.