Module - 3 - Computer Considerations for Robotic Systems
7.0 OBJECTIVES
7.1 MOTIVATION
The use of computers and computational elements in robotic systems is essential, just as the presence of the brain in an intelligent animal is essential. In this chapter we examine these computational considerations.
7.2 ARCHITECTURAL CONSIDERATIONS
Path planning is a method whereby the path or trajectory of the end effector is computed from information about its current position, where it is supposed to go, how it is supposed to get there (e.g., in a straight line), its speed, and other criteria defined by either the user or external sensors.
A descriptive discussion of operating systems, multitasking, distributed processing, multiprocessors, bus structures, and robotic considerations will be presented in this section.
7.2.1 Operating Systems
MS-DOS: developed for the IBM family of personal computers and compatible products; provides a development and execution environment for a single user; provides support for numerous languages, compilers, and assemblers; provides numerous utilities for file manipulation, directory manipulation, and networking.
UNIX: initially developed by AT&T primarily for use within its corporate structure, it was soon made available for a wide variety of computing environments, from very large computer systems to very small microcomputers. The system supports single- and multiuser environments with a very wide range of system utilities, languages, and communications support. UNIX has been adapted to execute on a very large number of different computers manufactured by many companies. It also provides facilities for transporting software among many different computer environments. Generally, UNIX is not suited to real-time applications, but there are variants, so-called real-time UNIX, that are suitable for real-time operation.
Operating systems provide the "hooks" for both programmers and programs. They can speed up the development process and reduce the learning curve of potential users, since features such as file management, batch file generation, and on-line debugging tools are available.
Initially, since special microprocessor architectures were designed for use with each particular application, it was difficult to introduce the neophyte to the advantages of robotic technology. Recently, robots that utilize personal computers as their master controllers have become commercially available. These systems facilitate the implementation of the robot in the industrial or educational environment, since the time required for learning may be greatly reduced due to the general familiarity with the personal computer operating system and hardware. An example of this type of robot is the RTX, a SCARA-type robot arm. This robot can be interfaced directly with an IBM PC/XT or compatible. Installation is simple, requiring only a cable connection from an RS-232-C port of the PC to the control port of the robot. Programming of the robot is equally simple, since it may be manually driven using the cursor control keys as a teach pendant, or it may be programmed through software using standard programming languages. A library interface exists for the high-level language PASCAL.
There are other proprietary operating systems available, too numerous to list, but in their own environments they have many, if not more, of the features discussed above.
7.2.2 Multitasking
Multitasking is an attribute of operating systems that permits the execution and management of multiple tasks.
7.2.4 Multiprocessors
Type            Originator
IBM PC bus      IBM
Multibus        Intel
Multibus II     Intel
VME bus         Motorola
STD bus
IEEE 488 bus    Hewlett-Packard
Q-bus           Digital Equipment Corp.
Unibus          Digital Equipment Corp.
In addition to these attributes, there are physical and mechanical issues, such as the size of the bus, the type of connector, the electromagnetic radiation properties, ruggedness, ability to withstand shock, vibration, thermal shock, radiation, and so on. The variety of considerations is too large to cover fully in this text, but it should be understood that the issue of bus structure is as dynamic as the evolution of computer architecture itself, and the lifetime of a bus structure affects choices of components as well as basic system design considerations.
Currently, there are no standardized bus structures for robots. For this reason, it is virtually impossible to interface one manufacturer's hardware with another's. This should be contrasted to modern computer manufacturing, where interface standards exist and are used to interconnect different vendors' hardware and software. The lack of a standard has hampered the growth of the robotics industry. Standards have been proposed by the SME and the IEEE, which, if adopted, will begin to rectify the situation.
Virtually every robot manufactured today has at least one computer within it. The simplest robot relies on the ability to control data flow and formats (protocols) to some degree and therefore has some sort of logical processing unit. This is generally considered to be a computer, and when so integrated may be a special-purpose hardware computational or control device. Nevertheless, it is still considered a computer, even though it may have no programmable features. Despite the fact that a computer was used to implement the functions required, its programming remains essentially fixed and consequently cannot be changed. Although certain options for its operation may be selected by the user by setting some switches or from a terminal, the sequences of operations remain as the designer originally chose them. The flexibility will not be compromised, however, within the design envelope, as one can program a computer to perform complex calculations even though the programmer cannot change the basic set of machine instructions.
More powerful robots must have the ability to perform coordinate transformations and/or straight-line coordinated motions. As a consequence, the computational tasks become significant and the computational power required increases correspondingly. Also, as sensor inputs and real-time signal processing are required of the robot in the future, the computational burden will increase even more drastically, and the addition of more computational elements may well become necessary.
Sec. 7.4 Computational Elements in Robotic Applications 517
of binary correlation. This is a process that is time consuming and requires several nested loops of high order. Performing these using computer language instructions, even with very fast 32-bit microprocessors, still requires at least a second for parts having a moderate area. A specialized ALU may require only a few hundred milliseconds and may cost only a few thousand dollars.
discussion implies that each element has the ability to arbitrarily (in time) assert
its data.
[Figure 7.4.1: ACK/NACK pairing of REQ states (signals a, a', b, b' between elements A and B).]
[Figure 7.4.2: Example of secure handshaking; REQ and ACK waveforms over system states 1 through 8, where the columns in the table indicate the system state.]
in fact the outgoing mail was picked up and mail deposited in the box at the same time.
Clearly, the use of a red and a blue flag in a so-called ACK/NACK paired scheme would eliminate this confusion (as is illustrated in Figure 7.4.2), since the carrier would raise the blue flag if mail was deposited and would drop the red flag to indicate that the mail carrier had been there.
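The mailbox analogy maps directly onto a four-phase (fully interlocked) handshake. A minimal sketch follows; all names are illustrative, not from the text:

```python
# Minimal sketch of a four-phase handshake, in the spirit of the
# ACK/NACK flag pairing described above. REQ plays the role of the
# "mail deposited" flag and ACK the "mail picked up" flag.

class Channel:
    def __init__(self):
        self.req = False   # sender's flag
        self.ack = False   # receiver's flag
        self.data = None

def send(ch, value):
    assert not ch.req and not ch.ack      # both flags down: idle
    ch.data = value
    ch.req = True                         # 1. sender raises REQ

def receive(ch):
    assert ch.req and not ch.ack
    value = ch.data
    ch.ack = True                         # 2. receiver raises ACK
    return value

def complete(ch):
    assert ch.req and ch.ack
    ch.req = False                        # 3. sender drops REQ
    ch.ack = False                        # 4. receiver drops ACK; idle again

ch = Channel()
send(ch, "part_present")
print(receive(ch))    # -> part_present
complete(ch)
```

Because each side waits for the other's flag transition, neither party can misread "picked up at the same time as deposited," which is exactly the ambiguity the single-flag mailbox suffers from.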
This type of handshaking is simple and secure, but it also requires a great deal of overhead, since every communication must utilize the concept of flags. Techniques for exchanging information packets reduce some of this overhead, at the expense of not guaranteeing that every bit of information is sent and/or received only when the receiver/transmitter is certain to be ready. Packeted information transfer allows for error checking on a relatively infrequent basis and detects certain types of errors in transmission so that, at the very least, errors can be logged and appropriate action taken. Such action may be a request for retransmission, or the data could even be ignored.
Figure 7.4.3 illustrates the packeting of information using a simple scheme
to transmit a message of N bytes of information. This is known by many different
terms, but the SECS1 (Semiconductor Equipment Communication Standard) pro-
tocol designates this technique as the "Data Link Protocol." The idea is simple
in that first, the receiver is informed as to the length of the message (i.e., how
many bytes will be transmitted). This is then followed by the message itself, and
finally, a quantity called the checksum is transmitted. With such a format, the
integrity of the message is preserved and errors in transmission detected.
Clearly, the key to the protocol is the checksum (which is also known as
longitudinal redundancy check, or LRC). Normally, the checksum is the negative
of the sum of the binary-coded numeric values of the message and is usually
truncated to one or two bytes. As long as the structure of the message is known
by the receiver, an error in transmission can be detected by computing a local
checksum and then comparing it to the transmitted value. This technique will
detect the occurrence of single-byte transmission errors.
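The checksum scheme just described can be sketched as follows. The one-byte LRC and the simplified length-byte framing are assumptions for illustration (a real SECS-I frame carries additional header fields):

```python
def lrc(payload: bytes) -> int:
    # Negative of the byte sum, truncated to one byte (two's complement),
    # as described in the text.
    return (-sum(payload)) & 0xFF

def frame(payload: bytes) -> bytes:
    # Simplified framing: length byte + message + one-byte checksum.
    return bytes([len(payload)]) + payload + bytes([lrc(payload)])

def check(framed: bytes) -> bytes:
    # Receiver side: recompute a local checksum and compare.
    n = framed[0]
    payload, received = framed[1:1 + n], framed[1 + n]
    if lrc(payload) != received:
        raise ValueError("checksum mismatch - request retransmission")
    return payload

msg = frame(b"MOVE J1 90")
assert check(msg) == b"MOVE J1 90"

# A single corrupted byte is detected:
bad = bytearray(msg)
bad[2] ^= 0x10
try:
    check(bytes(bad))
except ValueError as e:
    print("detected:", e)
```

Note that the payload sum plus the LRC is always 0 modulo 256, which is what makes the receiver's comparison a simple additive check.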
It is important to understand that it is possible, although unlikely, to have
multiple byte errors that will still produce the same checksum. If one wants to
prevent this situation from occurring, more complicated error-checking techniques must be used, such as CRCs (cyclic redundancy checks), which can detect multiple-byte errors. However, these require the evaluation of polynomial error formulas (as compared to the simple linear sum of the LRC) and are more time consuming. Moreover, such techniques are usually not necessary for most applications, except where there is a distinct possibility of noise-corrupted transmission.
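For comparison, a CRC treats the message as a polynomial rather than a simple sum. Below is a sketch of a bitwise CRC-16 (the reflected 0xA001 polynomial is our choice, not from the text), together with a two-byte error that fools the additive checksum but not the CRC:

```python
def crc16(data: bytes, poly: int = 0xA001) -> int:
    # Bit-serial CRC-16 (reflected form). Because the CRC divides the
    # whole message by a generator polynomial, it is position-sensitive,
    # unlike the additive LRC.
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ poly if crc & 1 else crc >> 1
    return crc

a, b = b"\x01\x02", b"\x02\x01"   # two bytes swapped: same byte sum...
assert sum(a) == sum(b)           # ...so an LRC cannot tell them apart,
assert crc16(a) != crc16(b)       # but the CRC can.
```

The inner loop is the polynomial division the text refers to; it is roughly eight times the work per byte of a single addition, which is the speed penalty mentioned above.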
In robotics applications, communication between slave and master processors would probably use LRCs to ensure accurate information up and down the information chain. In this respect, robotic communications parallel other multiprocessing applications, requiring secure communications but no extraordinary techniques for implementation.
[Figure 7.4.3: Flowchart of the Data Link Protocol, with receive, idle-loop, and send branches. The receiver listens for a length byte N, receives the N message bytes plus upper and lower checksum bytes, and replies with ACK or NAK; the sender retries on failure or contention. Read and Send are flags; T3 is set when a read is required; t is a time counter. An accompanying table lists parameter ranges and resolutions (e.g., T3, a timeout, ranges from 1 to 120 sec; RTY, the retry count, up to 31).]
In addition to the roles just described, more classical calculation roles may be attributed or assigned to computer components in a robotics system. One is that of performing a variety of coordinate transformations, as will be developed mathematically in Chapter 8. Such transformations are necessary to develop drive signals for the control portions of the robot. For example, moving a gripper or manipulator from one point to another typically starts with specification of motion in a rectilinear or Cartesian coordinate system. However, to achieve the desired motion, these coordinates must be transformed into the specific joint space of the robot (e.g., Cartesian, cylindrical, spherical).
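As a concrete, deliberately simplified instance, the Cartesian-to-joint transformation for a hypothetical two-link planar arm can be sketched as follows; the link lengths and target point are made up for illustration:

```python
import math

def inverse_kinematics(x, y, L1=0.4, L2=0.3):
    # Cartesian goal (x, y) -> joint angles (theta1, theta2) for a
    # hypothetical two-link planar arm; L1 and L2 are made-up link lengths.
    d2 = x * x + y * y
    c2 = (d2 - L1 * L1 - L2 * L2) / (2 * L1 * L2)   # law of cosines
    if abs(c2) > 1:
        raise ValueError("target outside the workspace")
    theta2 = math.acos(c2)                          # elbow-down branch
    theta1 = math.atan2(y, x) - math.atan2(L2 * math.sin(theta2),
                                           L1 + L2 * math.cos(theta2))
    return theta1, theta2

# Sanity check: the forward transformation must reproduce the goal.
t1, t2 = inverse_kinematics(0.5, 0.2)
fx = 0.4 * math.cos(t1) + 0.3 * math.cos(t1 + t2)
fy = 0.4 * math.sin(t1) + 0.3 * math.sin(t1 + t2)
print(round(fx, 6), round(fy, 6))
```

Even this toy case shows why the text emphasizes transcendental function evaluations: one point requires an acos, two atan2 calls, and several sin/cos evaluations.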
Usually, these transformations are mathematically complicated and require transcendental function evaluations. Consequently, some type of relatively sophisticated mathematical processing is required. This can be accomplished in a number of ways, including the use of software routines, hardware evaluation utilizing a floating-point processor, or employing software lookup tables. In a robot, the decision as to which technique to use is tied to the final system cost, speed, implementation, expandability, and generality.
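The software lookup-table option mentioned above can be sketched for a single transcendental function; the table size and the use of linear interpolation are illustrative choices:

```python
import math

# Precompute sin() at fixed steps once, then answer queries with a table
# read and a linear interpolation - trading memory for the cost of a
# transcendental evaluation. N = 1024 is an arbitrary illustrative size.
N = 1024
STEP = 2 * math.pi / N
TABLE = [math.sin(i * STEP) for i in range(N + 1)]

def fast_sin(theta: float) -> float:
    theta %= 2 * math.pi
    i, frac = divmod(theta / STEP, 1.0)
    i = int(i)
    # Linear interpolation between adjacent table entries.
    return TABLE[i] + frac * (TABLE[i + 1] - TABLE[i])
```

With this step size the worst-case interpolation error is on the order of STEP²/8, i.e., a few parts in a million, which is adequate for many joint-space computations.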
Another major area where classical computer calculation-type functions are
involved in a robot is in signal processing, e.g., noise removal from a distorted
signal is a common requirement in sensor data analysis. Signal processing may
be accomplished in both the analog and the digital worlds, and may be multidi-
mensional. That is, there may be multiple lines of data coming in from the outside
world in the form of binary input/output or in the form of analog or continuous
signal input and output. An example of this is the processing of an ultrasonic
acoustic signal from the outside world to determine position information, or perhaps
to monitor acoustic emissions from a variety of electromechanical components such
as motors, gears, and so on. In addition, metal surfaces scraping against other
metal surfaces may produce acoustic emissions that are detectable and may be
useful to the robot controller for preventative maintenance scheduling.
Another example of complex computational needs is in the field of vision, whereby one may be inspecting an outside-world environment relative to the robot for the purpose of alignment: for example, in palletizing objects, in the alignment of integrated-circuit chips, or in the alignment of surface-mount components on printed circuit boards. These tasks are relatively computationally heavy and will generally require a dedicated processor for implementing these functions in a timely and efficient manner. For example, one may need to direct the robot to position a camera so that it may "see" the environment. This information may then be passed to a vision processor, which calculates the alignment offsets and passes that information over to the robot controller, perhaps through the informational pathways or internal routines of the robotic computer. This permits the information to be translated into specific velocity and acceleration control signals.
Another possibility for using a vision system to augment the robot's sense of the environment would be to accept or reject parts, or to characterize or grade them. An example of this might be in a microchip dicing system, whereby one is inspecting a matrix of semiconductor components on a diced wafer. These wafers are often marked with ink dots to indicate rejection, and the robot may simply pick and place the good components into acceptance bins or packages. The poorer-quality components, or those that have been rejected, may be either left on the wafer carrier or may, in fact, be taken off the carrier and deposited into a reject bin. A refined classification of this application would permit multiple ink dots or multiple coding of the surface of the chip so that one might grade parts into a variety of different categories. In this manner, one could fabricate variable-quality assemblies by inspection, ranging from those of the highest quality down to those of the lowest.
In addition to the vision and coordinate transformation tasks, calculation is required in the area of direct axis (or joint) control. For example, if one is using a servomotor to drive a robotic axis, there are a variety of ways to accomplish this task. As discussed in Chapter 4, a digital-to-analog (D/A) converter could be used to drive the servo amplifier directly. In general, the calculation of the required drive signal is not trivial, and in fact the output will usually have to be shaped rather precisely in order to produce the desired robot performance (i.e., smooth, vibration-free motion). The so-called "on/off" or "bang-bang" control scheme, whereby the input to a servomotor is a step of a known value, is a relatively straightforward control procedure. However, step signals, as explained in Chapters 3 and 4, will introduce high values of derivatives of position, velocity, and acceleration, creating untoward effects in the output (e.g., excessive mechanical vibration). The obvious need to profile motions throughout space and to control axes simultaneously makes the problems associated with axis control computationally intensive.
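One common way to shape the drive signal, sketched here under assumed limits, is a trapezoidal velocity profile (accelerate, cruise, decelerate); the distance, speed, and acceleration values are made up:

```python
def velocity(t, distance=1.0, v_max=0.5, a_max=2.0):
    # Piecewise velocity of a trapezoidal motion profile: ramp up at
    # a_max, cruise at v_max, ramp down at a_max. All numbers are
    # illustrative. Unlike a step command, velocity stays continuous
    # and acceleration stays bounded.
    t_acc = v_max / a_max                    # time to reach cruise speed
    d_acc = 0.5 * a_max * t_acc ** 2         # distance covered while ramping
    if 2 * d_acc > distance:                 # short move: triangular profile
        t_acc = (distance / a_max) ** 0.5
        v_max = a_max * t_acc
        d_acc = distance / 2
    t_flat = (distance - 2 * d_acc) / v_max
    t_total = 2 * t_acc + t_flat
    if t < 0 or t >= t_total:
        return 0.0
    if t < t_acc:
        return a_max * t                     # acceleration phase
    if t < t_acc + t_flat:
        return v_max                         # cruise phase
    return a_max * (t_total - t)             # deceleration phase
```

Sampling this function at the servo update rate, for every joint, at every control cycle, is one concrete source of the computational load described above.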
In addition to the direct output requirements, position and/or velocity feedback information must be acquired and utilized. The acquisition of real-time information may be computer resource intensive, since some of those signals may have to be filtered digitally. Additionally, making use of the feedback signals to compute new positions, velocities, and accelerations may also present a large computational burden. Although not currently done, in the not too distant future external sensory data will be fed back to the master processor and will be used to modify the set points sent out to the joint processors. In effect, the system will then be recomputing the axis transformations to produce the desired manipulator motions. The role of the computer in this environment is therefore more or less traditional. Appendix C shows the computational algorithms necessary to accomplish these tasks.
Additional computational complexity is introduced by requiring coordinated motion control, whereby all of the robot's joints must start and stop at the same time. A further level of computational difficulty results when the required motion must be in a straight line in three-dimensional space. There will invariably be
they may be graded and placed into bins. This application requires a fairly complicated cell controller with use of a common protocol and common languages throughout the system (see Figure 7.4.4).
In addition to these types of coordination, the robot computer or computing elements may need to communicate with CAD, CAM, or CAE data bases. That is, one may have designed and simulated a specific assembly process on a host
[Figure 7.4.4: Schematic of a hybrid circuit assembly system that utilizes a robotic manipulator, with component feeders (10), a tool changer (7 tools), a CCTV monitor, a substrate feed system, and a control panel. The system utilizes a common protocol and language and requires a complicated cell controller.]
computer. This assembly program may have been downloaded to a variety of computers, including the cell controller, for example, which may then be required to direct the assembly operation, including all the robotic manipulators, so that the product is assembled. Note that in this case the robots were never taught directly, but obtained their "programs" electronically. Although this is not yet a widespread practice with robotic systems, it is one area in which to expect developments to be made.
In addition to the roles of the computers described above, there is the concept of coordinated path planning for the robot motions. For example, path planning might require that the work be moved in a prescribed manner through a specific set of workstations. This might require the integration of a number of robotic manipulators, perhaps incorporating information from a vision system as well as from other sensors.
As an example, we may look at the production of a wiring harness, which requires the stringing of wires of various lengths throughout a specific geometric pattern in space. Normally, wires will have to be routed around pins located on the harness board. Thus one must coordinate the stringing of a specific wire based on previous ones that the robot has installed. Furthermore, the work that the robot has completed may not be stable after installation. For example, when
stringing a wire, there may be a curl or misposition of the wire after the robot
releases the end of the wire. Then, when the robot goes back to place the next
one, it must have some way to guarantee or to measure the placement of these
previously laid wires so that the harness is produced in a reliable fashion.
7.5 REAL-TIME CONSIDERATIONS
In this section we discuss two important topics: real-time, event-driven processes and sensor information handling. The concept of "real time" is best thought of as "needed now." This needed-now concept gives the idea of urgency to the topics in this section, since either type of processing is so time-critical that if either process cannot be served as soon as possible, the robot and its environment may subsequently be uncontrollable, probably with catastrophic consequences (e.g., the robot may become unstable).
In many software applications, a computer must respond to input from the "outside
world." Two methods of achieving this are:
• Program driven
• Event driven
Program-driven response implies that the input occurrence is expected in some
sense. Entry from a keyboard is usually of this type, whereby a program is waiting
for an input via a keystroke. Although the program does not know what the
response will be, it does know that if there is a response, it will occur at a specific
location in the program. Another example of such a response is that caused by
a switch closing, indicating that a robotic gripper has successfully acquired a part.
Here, the robot is expecting the closure of the gripper at a specific point in its
program sequence. This is similar to a program waiting for an input of data prior
to executing a calculation.
The above should be contrasted to an event-driven process, where the timing
of the response as well as the type may be totally unpredictable. As an example,
consider a pedestrian walking up to a busy street that has a pushbutton-activated
street light. If the button is pushed, the traffic light controller will respond in time
by changing the light to a yellow-then-red condition for traffic, and eventually to
green for the pedestrian. If the pedestrian never pushes the button, the light will
never change, and the internal "brain" will perform its normal tasks, keeping the light green for traffic, checking whether all the lamps are functional (by checking current through the filaments), and calling the traffic department to replace a bulb if necessary.
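The two response styles can be contrasted in a small sketch; the queue-based "program-driven" read and the callback-based "event-driven" handler below are illustrative, not from the text:

```python
import queue

# Program-driven: the program blocks at a known point, expecting input
# (like a keyboard read, or waiting for the gripper switch to close).
def wait_for_input(events: "queue.Queue[str]") -> str:
    return events.get()          # execution waits right here

# Event-driven: a handler runs whenever its event fires, at a time the
# main program cannot predict (like the pedestrian pushing the button).
handlers = {}

def on(event, handler):
    handlers[event] = handler

def fire(event):
    if event in handlers:
        handlers[event]()

actions = []
on("crosswalk_button", lambda: actions.append("cycle light for pedestrian"))

q = queue.Queue()
q.put("gripper_closed")          # input arrives where the program expects it
print(wait_for_input(q))
fire("crosswalk_button")         # input arrives whenever the event occurs
print(actions)
```

In real controllers the event-driven path is usually realized with hardware interrupts rather than an in-process dispatch table, but the structural difference is the same: the program-driven read has a fixed place in the program, while the handler does not.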
EXAMPLE 7.5.1
To illustrate the ideas above, consider a simple filtering operation for noise reduction. The following equation represents a simple single-pole low-pass filter:

G(n) = AG(n - 1) + (1 - A)F(n)

where F(n) = input sequence from an A/D converter
      G(n) = output sequence
      A    = filter weight (0 ≤ A ≤ 1)

If A = 0, the input will pass directly to the output, and thus the filter will behave as an all-pass device. As A approaches unity, the filter properties will approach those of an ideal integrator. For intermediate values of A, low-pass filter characteristics result.
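The difference equation of Example 7.5.1 translates directly into code; the noisy test signal and the weight A = 0.95 below are made-up illustrations:

```python
import random

def lowpass(samples, a):
    # Implements G(n) = A*G(n-1) + (1-A)*F(n) from Example 7.5.1.
    # a (the weight A) near 0 passes the input through; near 1 it
    # smooths heavily.
    g, out = 0.0, []
    for f in samples:
        g = a * g + (1 - a) * f
        out.append(g)
    return out

# Made-up test signal: a constant level of 1.0 plus uniform noise.
random.seed(1)
noisy = [1.0 + random.uniform(-0.2, 0.2) for _ in range(500)]
filtered = lowpass(noisy, a=0.95)
# After the start-up transient dies out, the output hovers near 1.0
# with the noise strongly attenuated.
print(round(filtered[-1], 2))
```

The per-sample work here (one multiply-accumulate pair) is precisely what the timing analysis that follows counts up instruction by instruction.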
Now assume that the following times are valid for a hypothetical high-speed microprocessor (e.g., a Motorola 68000 with a 12.5-MHz clock rate):

data conversion = 10.0 µs
add time = 1.0 µs
multiply = 6.0 µs
memory access = 0.5 µs

Assuming that A and 1 - A are precomputed and stored in memory, then for each new computation, one data conversion, five memory accesses, two multiplies, and one add will be required to filter the data. This corresponds to 25.5 µs, or a data rate of 39,215 Hz. Assuming a 10-input sensor base, one can process these data at a rate of less than 4,000 Hz per sensor. Further, assuming a 5-sample/cycle sampling rate, one can handle signals with frequency content up to 784 Hz with this high-speed processor.* It should be noted that slower-rate processors would be at least proportionally poorer in performance. For example, a Motorola 6800 operating at 1 MHz would be approximately 12.5 times slower, which would yield a per-sensor rate of about 62 Hz. This 62-Hz rate may well be marginal in high-speed applications, especially when other interfering processes, such as linear distortions, exist which require digital control system compensation.

*The theoretical sampling rate is two samples per cycle, but in practice one needs to sample at least five times per cycle.
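The arithmetic of this example can be reproduced directly from the stated instruction times:

```python
# Instruction times (microseconds) as stated in the example.
CONVERT, ADD, MULTIPLY, ACCESS = 10.0, 1.0, 6.0, 0.5

# One data conversion, five memory accesses, two multiplies, one add
# per filtered sample.
per_sample_us = CONVERT + 5 * ACCESS + 2 * MULTIPLY + ADD

data_rate_hz = 1e6 / per_sample_us     # samples/second overall
per_sensor_hz = data_rate_hz / 10      # spread over a 10-sensor base
max_signal_hz = per_sensor_hz / 5      # 5 samples per signal cycle

slow_factor = 12.5                     # 1-MHz Motorola 6800 vs. 68000
print(per_sample_us, round(max_signal_hz), max_signal_hz / slow_factor)
```

Running these few lines confirms the chain of figures in the text: 25.5 µs per sample, a 784-Hz per-sensor signal bandwidth for the fast processor, and roughly 62 Hz for the processor that is 12.5 times slower.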
the application, and what may be acceptable in one circumstance may be unacceptable in another.
A robot language is usually the medium by which exception handling is accomplished. For example, once the application is programmed, a subroutine can be added to test for the part being present once the gripper has been commanded to close. If the part is missing, the action dictated by the subroutine (signal for an operator, retry, continue without the part, etc.) can be carried out.
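A sketch of such a subroutine follows; every device call in it (close_gripper, part_present, signal_operator) is a hypothetical stand-in for a real controller's primitives:

```python
# Hypothetical device functions standing in for a real robot
# controller's primitives.
def close_gripper():
    pass                           # command the gripper to close

def part_present() -> bool:
    return True                    # e.g., read a gripper microswitch

def signal_operator(message: str):
    print(message)

def acquire_part(max_retries=3) -> bool:
    # Exception-handling subroutine: command the close, test for the
    # part, and choose among the actions described in the text
    # (retry, then signal for an operator).
    for attempt in range(max_retries):
        close_gripper()
        if part_present():
            return True            # nominal case: continue the program
        signal_operator(f"part missing (attempt {attempt + 1}); retrying")
    signal_operator("part missing after retries; requesting operator")
    return False

assert acquire_part()
```

The choice of action on failure (retry count, continuing without the part, halting) is application policy, which is exactly why the text leaves it to the programmed subroutine rather than the system.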
There are also some errors that are sensed by the system in all cases. For
example, if the manipulator is commanded to move, a check could be made to
ensure that the motion is occurring (e.g., by monitoring the error signals in each
of the joint servos). If no motion occurs, it is possible that the robot arm has
collided with another object or that one or more of the error signals may have
exceeded a predetermined band. In the event that this type of error is detected,
the arm could be stopped and the servo gains reduced so that the arm becomes
"mushy." (This procedure prevents possible damage to motors and mechanical
components of the robot.) It is also possible to monitor the control signal for each
servo when the arm is not moving. In the event that this signal is too large, one
may be able to conclude that the payload is too big and take appropriate action.
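The two checks described above can be sketched as a single monitoring routine; the thresholds and the returned actions are assumptions for illustration:

```python
def check_joint(moving, position_error, control_signal,
                err_band=0.05, effort_limit=0.8):
    # While a move is commanded, a large persistent joint error
    # suggests a collision or a blocked axis; while holding still,
    # a large control signal suggests an excessive payload.
    # Thresholds and responses are illustrative.
    if moving and abs(position_error) > err_band:
        return "stop and reduce servo gains"      # probable collision
    if not moving and abs(control_signal) > effort_limit:
        return "payload too large - take action"
    return "ok"

print(check_joint(moving=True,  position_error=0.2, control_signal=0.3))
print(check_joint(moving=False, position_error=0.0, control_signal=0.9))
```

In practice such a check would run inside each joint's servo loop, which is another small but recurring contribution to the real-time computational load.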
In addition to these considerations, there is the concept of self-adaptation to
the environment. For instance, many robots are designed so that they may respond
to a variety of inertial loads. For example, different payload weights should not
affect the path that a robot takes in general. Often this is accomplished by changing
servo gains to compensate for variations in the axes loads that would create un-
desirable deviations from the proper path. This would be equivalent to a young
child picking up a lightweight toy and moving it from point A to point B in space.
If, however, that lightweight toy was filled with lead shot, and the child attempted
to follow the original path, difficulty might be experienced in overcoming this
additional inertial load even if more muscle power was employed. That is, the
child might have only limited ability to compensate. Although the computational
algorithm might be there in the child's brain, the ability to handle that level of
load might not exist.
The idea of inertial compensation can be built into the algorithmic control
processes so that when the weight or inertia of the load changes (within limits) ,
the robot may still move the load over the same path if it is instructed to do so.
This idea of self-adaptation can be mathematically modeled and included in the
robot's internal program.
Another situation that can be detected by proper monitoring of position and/
or current sensors (in each joint) occurs when the robot strikes an object that was
previously not known to be there. In this instance, there will be an increased
amount of resistance to the arm's motion, which results in an unusual increase in
motor current and/or a large position error. It is possible to program the computer
to sense these conditions and take corrective action, such as stopping the motion
or reducing servo gains. For example, if one visualizes a robot picking up the
lead-weighted toy and moving the load from A to B, and one puts a chair in the way, it would be clearly desirable to have the sensory ability to detect that something out of the ordinary had occurred and take appropriate action. This self-adaptation concept more or less fits in well with the previous nominal versus extraordinary discussion.
As more external state sensors (see Chapter 5) are employed with robots, the information they provide will be used to modify the original program in real time. For example, tactile sensors placed on a robotic gripper provide real-time data to the robot's controller, which then commands the gripper's servo so that the right amount of force is generated.
Sensors placed in robotic grippers are also important when it is necessary to handle objects which have specific stability, rigidity, and orientation requirements. There is no reason to expect that one will always have objects of one type, and a truly versatile system should be able to handle a variety of shapes and sizes.
It is clear that as external sensors are more heavily utilized, the information
provided by them will increase the computational burden placed on the
robot's
computer systems. This has already been demonstrated in Chapter 6, where
vision
systems and the computational considerations were discussed.
also provide the user with the ability to imbue intelligence in the control program. In its simplest forms, this intelligence may check binary sensors and change a location, or make a simple decision based on sensory information to handle an exception. As the capability of the language increases, the intelligence of the algorithm controlling the robot in a specific application can also increase. Thus corrections based on sensory inputs (such as vision or tactile sensors) are possible, along with communication with other computers and data bases.
Historically, the initial applications of robots were relatively simple and, accordingly, their controllers did not require or provide sophisticated sequence control. Typically, the following sequence was all that was needed:
The first is a relatively simple method which makes use of a fixed event sequence
in each instruction. The second is based on extensions of programming languages
which add robot-specific functions (or subroutines) to the standard library, or in
which robot-specific commands have been added to the structure of the language.
The third is a language tailored specifically to the programming or training of
robots.
It should be understood that the program will wait indefinitely until the specified input line(s) are asserted. Next, if the step is a subroutine (another series of program steps), then it is executed and the following program step is obtained from memory (note that the motion and subsequent steps are not performed in this case). If no subroutine call was indicated, then the robot controller causes the manipulator to move to a point in space defined by a set of joint variables stored in memory. Once this location is reached, the remaining actions (for the current program step) are executed. These include waiting a specified delay time, opening or closing the gripper, and the final action, which is the setting of the state of the output lines to a value defined in the programming sequence. Following this, the next program instruction (step) is fetched from memory and decoded as defined previously. After all the steps of a particular program are executed, the sequence repeats from the first step. That is, the controller keeps executing the program indefinitely.
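The fixed event sequence described above can be sketched as a small interpreter; the step fields and device callbacks are illustrative, not from a real controller:

```python
import time

def run(program, read_inputs, move_to, set_gripper, set_outputs):
    # One pass over the program; a controller of this type would repeat
    # the whole program indefinitely. Each step follows the fixed order:
    # check inputs -> (subroutine or motion) -> delay -> gripper -> outputs.
    for step in program:
        while (read_inputs() & step["wait_mask"]) != step["wait_mask"]:
            pass                               # wait for input lines
        if "subroutine" in step:               # subroutine replaces the rest
            run(step["subroutine"], read_inputs, move_to,
                set_gripper, set_outputs)
            continue
        move_to(step["joints"])                # move to stored joint values
        time.sleep(step.get("delay", 0.0))     # programmed delay
        set_gripper(step.get("gripper", "hold"))
        set_outputs(step.get("outputs", 0))    # set output line states

log = []
program = [
    {"wait_mask": 0, "joints": (0, 90), "gripper": "close", "outputs": 1},
    # A motion-free step whose only purpose is its delay, letting the
    # gripper settle before the next real motion.
    {"wait_mask": 0, "joints": (0, 90), "delay": 0.05},
]
run(program, read_inputs=lambda: 0,
    move_to=lambda j: log.append(("move", j)),
    set_gripper=lambda s: log.append(("grip", s)),
    set_outputs=lambda v: log.append(("out", v)))
print(log)
```

The second step in the toy program shows the workaround the text goes on to describe: because the delay always comes after the motion within a step, a settling delay before the next motion has to be programmed as an extra step that commands no new position.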
Due to the nature of the fixed sequence of actions for each program step, it
may be necessary to program additional steps to properly sequence the manipulator.
For example, it is necessary to provide a delay to ensure gripper activation prior
to arm motion. This is due to the fact that it takes a finite time for a gripper to
reach its final state after its activating mechanism receives its control signal. There-
fore, the trainer might want to insert a delay (on the order of a few hundred
milliseconds) prior to the execution of any other manipulator motion. Since the
action sequences of a program step without a subroutine call are check inputs,
perform motion, delay, set gripper state, and set output line states, one easily sees
that it is possible for the next program step to cause a motion (if the input conditions
are satisfied immediately) before the gripper's state has stabilized. To accomplish
a delay prior to the motion of this subsequent step, it is necessary to program an
additional step in which no motion occurs but which makes use of the delay in the
sequence of actions.
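The fixed event sequence described above, including the extra no-motion step used purely for its delay slot, can be sketched in a few lines of code. This is an illustrative sketch only (Python); the step fields, state variables, and 300-ms settle delay are assumptions, not taken from any particular controller.

```python
# Hypothetical sketch of the fixed event sequence executed for each program
# step: check inputs -> subroutine OR motion -> delay -> gripper -> outputs.
# Field names ("wait_inputs", "delay_ms", etc.) are assumptions.

def run_step(step, state):
    """Execute one program step of a fixed-event-sequence controller."""
    # 1. Check the required input lines (a real controller would block
    #    indefinitely here until they are asserted).
    if not all(state["inputs"][i] for i in step.get("wait_inputs", [])):
        return False
    # 2. A subroutine call suppresses the motion/delay/gripper/output actions.
    if "subroutine" in step:
        for sub_step in step["subroutine"]:
            run_step(sub_step, state)
        return True
    # 3. Move to the stored joint position, then perform the remaining actions
    #    in their fixed order.
    state["position"] = step.get("position", state["position"])
    state["elapsed_ms"] += step.get("delay_ms", 0)   # programmed delay
    state["gripper"] = step.get("gripper", state["gripper"])
    state["outputs"] = step.get("outputs", state["outputs"])
    return True

state = {"inputs": [True], "position": None, "gripper": "open",
         "outputs": 0, "elapsed_ms": 0}
program = [
    {"position": (10, 20, 30), "gripper": "closed"},  # grasp at the point
    {"delay_ms": 300},                                # delay-only step
    {"position": (10, 20, 200)},                      # now safe to move
]
for step in program:
    run_step(step, state)
```

The second step contains no motion; its only purpose is to exploit the delay slot in the fixed sequence so that the gripper settles before the following step's motion, exactly as discussed above.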
While this type of programming may require substantial human activity, it is
still able to produce the desired results (i.e., sequencing a manipulator through a
set of motions). The key to both successful and efficient programming of this type
of controller is knowing the sequence of actions and how to take advantage of
them.
As the complexity of the tasks being performed by robots increased, the
demands for more advanced motion control and decision capability also increased,
thereby requiring more sophisticated programming methods. In some cases, the
simple sequencing controls could be expanded by adding more functionality to the
teach pendant by means of multiple levels and added control switches. Besides
increasing the complexity of the teach pendant, this approach also increased the
programming time and required skill level of the trainer.
An outgrowth of such complex sequence controllers is a "menu-driven" pro-
gramming system that permits the training of the robot using a fixed set of functions.
The menu system differs from the "fixed instruction sequence" control in that the
trainer composes the program from a menu of statements. One should also observe
that there are statements that do not cause robot motion and that the sequence of
events is chosen by the programmer or trainer. Thus it is seen that some of the
constraints imposed by the fixed event instruction are removed.
As the available technology became more sophisticated and manufacturing
requirements grew, the limited flexibility of the language extension approach be-
came obvious. This provided the impetus for the development of robot-specific
languages.
• AL
• AML
• RAIL
• RPL
• VAL
Brief descriptions of each of these are given below. This summary is adapted from
a paper by Gruver et al. [9].
7.6.2.1 AL
AL was the second-generation robot programming language produced at the
Stanford University Artificial Intelligence Laboratory, an early leader in robot
research. Based on concurrent Pascal, it provided constructs for control of multiple
arms in cooperative motion. Commercial arms were integrated into the AL system.
This language has been copied by several research groups around the world. Im-
plementation required a large mainframe computer, but a stand-alone portable
version was marketed for industrial applications. It runs on a PDP-11/45 and is
written almost entirely in OMSI Pascal [9]. In the AL system, programs are
developed and compiled on a PDP-10. The resulting p-code is downloaded into
a PDP-11/45, where it is executed at run time. High-level code is written in SAIL
(Stanford Artificial Intelligence Language). The run-time system is written in
PALX. The PDP-11/45 has a floating-point processor, no cache memory, a single
terminal, and 128 kilobytes of RAM. Two PUMA 600's and two Stanford
Scheinman arms were controlled at the same time by this language.
7.6.2.2 AML
A Manufacturing Language (AML) was designed by IBM to be a well-
structured, semantically powerful interactive language that would be well adapted
to robot programming. The central idea was to provide a powerful base lan-
guage with simple subsets for use by programmers with a wide range of expe-
rience. An interpreter implements the base language and defines the primitive
operations, such as the rules for manipulating vectors and other "aggregate"
objects that are naturally required to describe robot behavior. A major design
point of the language was that these rules should be as consistent as possible,
with no special-case exceptions. Such a structure provides a ready growth path
as programmers and applications grow more sophisticated. AML is being used
to control the RS/1 assembly robot, a Cartesian arm having linear hydraulic
motors and active force feedback from the end effector. The computer con-
troller on the RS/1 assembly robot consists of an IBM series/1 minicomputer
with a minimum of 192-kilobyte memory. Peripherals include disk and diskette
drive, matrix printer, and keyboard/display terminals. A subset of AML was
employed on the Model 7535 robot that was controlled by the IBM personal
computer. However, the features of this version are not included here since
the 7535 is no longer being marketed by IBM.
7.6.2.3 RAIL
7.6.2.4 RPL
RPL was developed at SRI International to facilitate development, testing,
and debugging of control algorithms for modest automatic manufacturing systems
that consist of a few manipulators, sensors, and pieces of auxiliary equipment. It
was designed for use by people who are not skilled programmers, such as factory
production engineers or line foremen. RPL may be viewed as LISP cast in a
FORTRAN-like syntax.
The SRI Robot Programming System (RPS) consists of a compiler that trans-
lates RPL programs into interpretable code and an interpreter for that code. RPS
is written mostly in Carnegie-Mellon's BLISS-11 and cross-compiles from a DEC
PDP-10 to a PDP-11 or LSI-11. The programs written in this language run under
RT-11 with floppy or hard disks. The RPL language is implemented as subroutine
calls. The user sets up the subroutine library and documents it for people who
must write RPL programs. Previously, SRI operated the Unimate 2000A and
2000B hydraulic arms and the SRI vision module with this language.
7.6.2.5 VAL
VAL is a robot programming language and control system originally designed
for use with Unimation robots. Its stated purpose is to provide the ability to define
robot tasks easily. The intended user of VAL will typically be the manufacturing
engineer responsible for implementing the robot in a desired application.
Eight robot programming languages are compared in Table 7.6.1. Prior
programming knowledge is helpful but not essential. VAL has the structure of
BASIC, with many new command words added for robot programming. It also
has its own operating system, called the VAL Monitor, which contains the user
interface, editor, and file manager. The central monitor contains a DEC LSI-11/
03, or more recently, the LSI-11/23. In a PUMA 550 robot, each of the joints is
controlled by a separate 6503 microprocessor. The monitor communicates with
the user terminal, the floppy disk, the teach box, a discrete I/O module, and an
optional vision system. VAL is implemented using the C language and the 6502
assembly language. It has been released for use with all PUMA robots and with
TABLE 7.6.1 LANGUAGE-COMPARISON TABLE

[The table compares eight languages (AL, AML, HELP, JARS, MCL, RAIL, RPL, and VAL) across the following feature categories:
• Language modalities: textual, menu.
• Language type: subroutines, extension, new language.
• Geometric data types: frame (pose), joint angles, vector, transformation, rotation, path.
• Control modes: position, guarded moves, bias force, stiffness/compliance, visual servoing, conveyor tracking.
• Motion types: coordinated joint between two points, straight line between two points, splined through several points, continuous path ("tape recorder" mode), implicit-geometry circles, implicit-geometry patterns.
• Signal lines: counts of binary inputs/outputs and analog inputs/outputs.
• Display and specification of rotations: rotation matrix, angle about a vector, quaternions, Euler angles, roll-pitch-yaw.
• Ability to control multiple arms.
• Control structures: statement labels, if-then, if-then-else, while-do, do-until, case, for, begin-end, cobegin-coend, procedure/function/subroutine.
• Sensor interfaces: vision, force, proximity, limit switch.
• Support modules: text editor, file system, hot editor, interpreter, compiler, simulator, MACROs, INCLUDE statement, command files, logging of sessions, error logging, help functions, tutorial dialogue.
• Debugging features: single stepping, breakpoints, trace, dump.]

Source: Reprinted courtesy of the Society of Manufacturing Engineers. Copyright 1983 from the ROBOTS 7/13th ISIR Conference Proceedings.
a. Using force control or limit-switch action.
b. Currently being implemented at Jet Propulsion Laboratory.
c. Uses visual inputs to determine set points but does not specifically perform visual servoing.
d. Relies on the VAL controller.
e. Currently being implemented at Stanford University.
f. Custom for each system.
g. AL displays rotations as a rotation matrix.
h. Normally, JARS does not display these forms; however, the user may write a routine to print them because JARS has the forms available internally.
i. AL accepts directly the specification of an orientation by three Euler angles (or by an angle about a vector).
j. AL orientations could also be specified by roll-pitch-yaw angles.
k. Since it is a language based on subroutines added to Pascal, JARS has all the structures of Pascal.
l. MCL can invoke tasks in parallel using INPAR.
m. HELP permits the simultaneous activation of several tasks.
n. Reported by the IBM T. J. Watson Research Center, Yorktown Heights, New York; not commercially available.
o. JARS and HELP use the systems support features of the RT-11 operating system.
p. Uses the support features of the PDP-10 operating system.
q. A simulator has been developed at the IBM T. J. Watson Research Center, Yorktown Heights, New York.
the Unimate 2000 and 4000 series. The languages described above, as well as three
others, HELP, JARS, and MCL, are compared in Table 7.6.1, which has been
adapted from Gruver et al. [9].
The following examples illustrate the use of two different robot programming
languages, VAL and one employed on a particular SCARA-type manipulator.
[Figure 7.6.1. Workspace for VAL programming example, showing the robot's
origin (0, 0, 0), the pickup point, and the deposit point (a four-block stack is
shown). The pickup and deposit points are on the xy-plane, offset (in z) from the
robot's origin by -448 mm.]
TABLE 7.6.2 LISTING OF A VAL TERMINAL SESSION
.LOAD STACK
.PROGRAM STACK
.LOCATIONS
OK
.LISTP STACK
.PROGRAM STACK
1. REMARK
2. REMARK THIS PROGRAM PICKS UP PARTS FROM A FIXED
3. REMARK LOCATION CALLED PICKUP, THEN DEPOSITS THEM AT A
4. REMARK LOCATION CALLED B. IT IS ASSUMED THAT 4 PARTS
5. REMARK ARE TO BE STACKED ON TOP OF ONE ANOTHER.
6. REMARK
7. OPENI
8. SET B = DEPOSIT
9. SETI COUNT = 0.
10. 10 APPROS PICKUP, 200.00
11. MOVES PICKUP
12. CLOSEI
13. DEPARTS 200.00
14. APPRO B, 200.00
15. MOVES B
16. OPENI
17. DEPARTS 200.00
18. SETI COUNT = COUNT + 1
19. TYPEI COUNT
20. REMARK COUNT INDICATES THE TOTAL NUMBER OF ITEMS STACKED
21. IF COUNT EQ 4 THEN 20
22. REMARK MOVE THE LOCATION OF B UP BY 75.00 MM.
23. SHIFT B BY 0.00, 0.00, 75.00
24. GOTO 10
25. 20 SPEED 50.00 ALWAYS
26. READY
27. TYPE *** END OF STACK PROGRAM ***
.END
.LISTL
.EXEC STACK
COUNT = 1.
COUNT = 2.
COUNT = 3.
COUNT = 4.
*** END OF STACK PROGRAM ***
PROGRAM COMPLETED: STOPPED AT STEP 28
The first command given to the robot controller, LOAD STACK, tells the
system to recall the program and any location data from the disk. The
system response is on the next three lines, indicating successful completion
of this request. The following command to the controller is LISTP STACK,
which tells VAL to list the program
which is called STACK. This particular version also delimits the program
listing by printing .PROGRAM STACK at the beginning and .END at the
end. Two more commands that are used in the table are (1) LISTL, which
commands the controller to print all the locations that the controller knows
about (in this case there are two such locations, DEPOSIT and PICKUP),
and (2) EXEC STACK, which tells the controller to execute the program
called STACK, which is stored in its memory. Following the EXEC com-
mand is the output generated by the program STACK. This output is the
value of the variable COUNT as the program is executed. Note that the
value of COUNT is used to terminate execution of the program when the
desired number of items have been stacked.
Examination of the program listing shows that each line has a number
associated with it (i.e., 1 through 27). These numbers are used to identify
a line so that the program may be edited. VAL has an editor that allows
the user to create programs and store them in the controller. Once stored,
a program may be modified by referring to its line numbers. The modifi-
cations include inserting, deleting, or modifying lines.
The operation of the robot based on the program steps will now be
described.
• Line 11 commands the manipulator to move in a straight line to the
pickup point; the position defined by PICKUP is such that when motion
ends, the object will be inside the gripper's jaws.
• Line 12 commands the system to close the gripper and wait a sufficient
amount of time for the action to occur. In some cases it may be nec-
essary to add an additional delay if that provided by the command is
insufficient.
• Line 13 tells the manipulator to move along its approach vector in the
direction opposite from which it originally came to a point 200 mm
above the pickup point.
• Line 14 tells the manipulator to move to within 200 mm of point B,
aligning its approach vector downward.
• Line 15 commands the manipulator to move in a straight line until its
tool point is coincident with location B.
• Line 16 tells the gripper to open so that the part can be deposited. This
also includes some time delay for the action to occur. As stated pre-
viously, additional delay may be necessary to compensate for the actual
valves and mechanics used to implement the gripper and to permit the
manipulator to settle to the desired location.
• Line 17 tells the manipulator to move back along the approach vector
so that it is 200 mm above location B.
• Lines 18 and 19 increment the variable COUNT and display its value.
• Line 20 is a comment.
• Line 21 is a test to see if COUNT is equal to 4. If so, go to the statement
with label 20; otherwise, go to the next line.
• Line 22 is a comment.
• Line 23 modifies the location defined by B so that its z coordinate is
increased by 75.0 mm.
• Line 24 forces the program to go to label 10.
• Line 25, which is labeled, tells the controller to reduce the speed of
motions to 50%.
• Line 26 tells the controller to move the manipulator to its ready position,
which is defined as all of the links in a straight line pointing upward.
• Line 27 tells the controller to print a message to the terminal.
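The control flow that these program steps implement can be mirrored in a short sketch (Python). The taught coordinates are made-up values, and all motion and gripper commands are reduced to recording where each part is deposited; only the loop logic, the count of 4, and the 75-mm shift come from the listing.

```python
# Sketch of the STACK program's control flow (Table 7.6.2). Motions are
# stubbed out; only the location arithmetic and loop logic are mirrored.
PICKUP = (300.0, 100.0, -448.0)    # taught pickup point (illustrative values)
DEPOSIT = (500.0, -50.0, -448.0)   # taught deposit point (illustrative values)

b = DEPOSIT                        # SET B = DEPOSIT
count = 0                          # SETI COUNT = 0
deposits = []                      # stands in for the pick/place motions
while True:
    # APPROS/MOVES/CLOSEI: approach and grasp at PICKUP (stubbed)
    # APPRO/MOVES/OPENI: place the part at B (recorded below)
    deposits.append(b)
    count += 1                     # SETI COUNT = COUNT + 1
    if count == 4:                 # IF COUNT EQ 4 THEN 20
        break
    b = (b[0], b[1], b[2] + 75.0)  # SHIFT B BY 0.00, 0.00, 75.00
print("*** END OF STACK PROGRAM ***")
```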
From the description of the program, one can easily see the power
implemented by the instructions. Commands exist to cause the manipulator
to move in a straight line and to manipulate position data. (Note that the
"S" in the statement indicates that straight-line motion is desired.) For
example, the variable B, which represents a location (i.e., a set of six joint
variables), is modified by a single statement in line 23. Similarly, the com-
mands APPROS and DEPARTS are quite interesting because they actually
define positions relative to a variable but do not make it necessary for the
user to define the actual positions for each move that the robot has to make.
This concept is quite important for robot training, since we have really defined
only two positions, PICKUP and B. However, we can move to many po-
sitions relative to them. Using this approach, if it is necessary to modify
either of the points (PICKUP or B), the changes made to them will auto-
matically be reflected in the intermediate points (selectively by the robot's
path planner), which are defined solely on these two positions.
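One plausible way an APPRO-style command could derive its target from a taught location is sketched below. This is an assumption for illustration only (Python); the Pose representation and field names are hypothetical, not VAL's internal form.

```python
# Sketch of how an APPRO-style command can derive its target from a taught
# pose: offset the tool point back along the pose's approach vector.
from dataclasses import dataclass

@dataclass
class Pose:
    position: tuple   # (x, y, z) in mm
    approach: tuple   # unit vector along which the tool approaches

def appro(pose, offset_mm):
    """Point offset_mm away from the taught point, against the approach vector."""
    px, py, pz = pose.position
    ax, ay, az = pose.approach
    return (px - ax * offset_mm, py - ay * offset_mm, pz - az * offset_mm)

# Approach vector pointing straight down (-z): APPRO B, 200 ends 200 mm above B.
b = Pose(position=(500.0, -50.0, -448.0), approach=(0.0, 0.0, -1.0))
print(appro(b, 200.0))   # (500.0, -50.0, -248.0)
```

Because the approach point is computed from the taught pose, re-teaching B automatically moves every position defined relative to it, which is the property described above.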
Assume that two points have been defined previously, P1 and P2, and that we
wish to have the manipulator move in a straight line starting from P1 and ending
at P2.
Table 7.6.3 shows a listing of the program and comments defining the
purpose of the instructions. The program in Table 7.6.3 takes the difference
between the initial and terminal points of the line and divides by the number
of intermediate points plus 1 to compute an incremental distance. It then
instructs the manipulator to move to the first point, P100. After attaining
this position, it computes intermediate points by repeatedly adding the
incremental distance to P100 and then instructs the robot to move in a
continuous-point fashion connecting the 10 points to form an approximation
to a straight line. Note that the last point is P2.
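The interpolation performed by the program of Table 7.6.3 can be expressed compactly. In this sketch (Python) joint-space points are shown as plain tuples, and the MAKER 22's point arithmetic is only approximated; the function name and the default of nine intermediate points are illustrative assumptions.

```python
# Sketch of the Table 7.6.3 interpolation: divide the P1->P2 difference by
# (number of intermediate points + 1) to get an increment, then step through
# the resulting points; the last point coincides with P2.
def straight_line_points(p1, p2, intermediate=9):
    inc = [(b - a) / (intermediate + 1) for a, b in zip(p1, p2)]
    points = []
    current = list(p1)
    for _ in range(intermediate + 1):
        current = [c + d for c, d in zip(current, inc)]
        points.append(tuple(current))
    return points

pts = straight_line_points((0.0, 0.0), (10.0, 20.0))
```

Moving through these points in a continuous-point fashion yields the piecewise approximation to a straight line described above.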
It should be apparent that the robot programming language for the
MAKER 22 does not contain as high a level of expression as indicated in the
example using VAL. This is obvious if one recognizes that a straight line is
achieved with one instruction using VAL, whereas it requires the entire pro-
gram in Table 7.6.3 to perform the identical maneuver with the MAKER 22.
However, the same functionality, that is, the ability to move in a straight
line, is provided by both languages.
After reviewing these two examples and the discussion on robot programming
languages, it is suggested that Section 2.4 on the functionality of a robot controller
be reviewed in order to relate the desired design functionality to this material.
To program a servo-controlled robot, a skilled operator often breaks down the
assigned task into a series of steps so that the manipulator/tool can be directed
through these steps to complete the task (a program). This program is played
back (and may be repeated several times, i.e., it can be used as a subroutine) until
the task cycle is completed. The robot is then ready to repeat the cycle. The
robot's actions may be coordinated with ancillary devices through special sensors
and/or limit switches. These, in conjunction with the controller, send "start work"
signals to, and receive "completion" signals from, other robots or interfacing devices
with which that robot is interacting.
A servo-controlled robot can be "taught" to follow a program which, once
stored in memory, can be replayed, causing the controller to be instructed to send
power to each joint's motor, which in turn, initiates motion. This teaching process
may require that the operator "demonstra te" points in space by causing the end
effector to move (using one of a number of possible methods) to a series of locations
within the work cell.
The robot can also be taught its assembly tasks from a CAD/CAM data base.
Here, the desired points in space are downloaded from such a data base, rather
than being taught (on the robot) by an operator. This has the advantage of not
occupying the robot for teaching of points and also permits the optimization of the
path using simulation techniques. In addition, it is also likely that within the next
few years artificial intelligence (AI) techniques will permit robot teaching to be
more generalized. For example, AI will allow the robot to place filled bottles in
a case or pallet without having to be explicitly taught a predetermined pattern
and/or having specific points actually demonstrated by an operator or downloaded
from a CAD/CAM system. Before discussing this topic, however, we will consider
more standard techniques of demonstrating points to a robot.
There are several methods currently in use. The method employed depends
on the manufacturer's specifications, control system software, and the robot's com-
puting/memory capabilities. Teaching typically involves one of the following meth-
ods: continuous path, via points, or programmed points. Each of these is now
briefly discussed.
Path planning or trajectory planning algorithms are concerned with the generation
of the intermediate points along a manipulator's trajectory. These are the points
(or positions) that must be fed to the control system so that the joints can be
commanded to move to the correct locations necessary to position the end effector
properly. In addition, it is often desired to start and stop all robotic axes at the
same time. This behavior is referred to as coordinated motion and will modify
the path-planning algorithm.
In a robot, the initial path position is inferred (from the current position)
while the final path position is specified. Along with the final point, some rule
defining the trajectory must be specified and may include the following options:
1. Joints of the robot are to start and stop at the same time as the end effector
moves from the initial to the final point (not exceeding physical constraints or
robot specifications). However, the actual path taken is not specified. This is
called joint interpolated motion.
2. The "tool point" is to move along a straight line. This is sometimes referred
to as world motion. Note that this implies that all axes start and stop at the
same time.
3. The tool (or end effector) is to move along a straight line defined by extending
the approach, normal, or orientation vectors associated with the tool point.
This is called tool motion (see Chapter 8 and Section 7.6.4).
4. The end effector may be told to follow a straight line as in world motion,
while the initial and final orientation of the gripper may be required to change.
5. The acceleration or velocity may be specified prior to the motion, or may be
commanded to change during the motion based on some external input.
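Joint interpolated motion with coordinated axes (option 1 above) can be sketched as follows: the joint with the largest move at its velocity limit sets a common travel time, and every other joint is slowed to match, so all axes start and stop together. This is an illustrative sketch only (Python); the 16-ms set-point period and the velocity limits are assumed values.

```python
# Sketch of coordinated joint-interpolated motion: every joint's trajectory is
# time-scaled so that all joints start and stop together, with the slowest
# joint (largest move relative to its speed limit) setting the travel time.
def coordinated_setpoints(q_start, q_goal, vmax, dt=0.016):
    """Piecewise-linear set points; vmax gives each joint's speed limit."""
    travel = max(abs(g - s) / v for s, g, v in zip(q_start, q_goal, vmax))
    steps = max(1, round(travel / dt))
    return [
        tuple(s + (g - s) * k / steps for s, g in zip(q_start, q_goal))
        for k in range(1, steps + 1)
    ]

# Joint 0 must move 90 units at 45 units/s (2 s of travel); joint 1 has only
# 10 units to move and is therefore slowed so both finish together.
traj = coordinated_setpoints((0.0, 0.0), (90.0, 10.0), vmax=(45.0, 45.0))
```

Note that this rule fixes only the timing, not the Cartesian path of the end effector, which is exactly the limitation of joint interpolated motion noted above.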
For example, the VAL statement

MOVES POINT1

allows the programmer to command the robot to move from its current location
to one defined as POINT1 in a straight line. (The motion will be joint interpolated
if the "S" is deleted.) Other languages may provide the mathematical capability
to compute the intermediate points necessary to move in a straight line but without
an explicit command.
For our hypothetical system, the following additional specification will also
be included:
[Figure: software architecture of the hypothetical system. The host side comprises
the operating system, a vision system, user controls, an editor, a robot programming
language, a library of high-level device drivers, terminal and disk device drivers,
position-training controls (such as a joystick), and application-specific code. A
bidirectional high-speed communication link (the common bus) connects the host
to joint processors #1 through #n; each joint processor contains a device driver
for common-bus communication, a device driver for binary I/O modules, a PID
algorithm, a linear interpolator, and an I/O control program.]
mechanical devices they are controlling and should not present a problem. Also,
as noted earlier, the servos are interpolating data and are performing at a faster
rate (e.g., 1-msec updates) than the rate at which the set point information is being
sent from the host (e.g., 16-msec updates).
This scheme is also advantageous in coupling job-specific hardware devices
into the system. For example, if we wished to add vision, it would be a self-
contained system and would merely send or receive position information over the
common bus for use by the host when necessary. Additionally, if we wanted a
force-controlled gripper, hardware specific to this task could be added, which could
perform such tasks as signal processing (for use with a force sensor) and servo
control of the gripper's actuator. The host would merely send a signal to the
[Figure: the Data Interchange Structure (DIS) written to the common bus. Fields
include the set point/data for each of joints 1 through N, a binary output bit
pattern, a binary input bit pattern, optional vision processor command and vision
processor data/response fields, and a CRC or check digit.]

The DIS cycle operates as follows:
• Every T seconds the host writes the DIS to the common bus, and all joint processors on the bus receive it simultaneously.
• Each processor computes the CRC as a validity check.
• Each processor interprets the global command byte and takes its appropriate data.
• Any information that the processors are to send to the host is written to the appropriate location (as a response to the last command byte).
• T/K seconds after T, the host interprets the information written into the DIS.
• The cycle repeats indefinitely.
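One exchange of this cycle might be sketched as follows. This is a hypothetical sketch (Python): the frame layout, field names, and the choice of CRC-32 are assumptions (the text specifies only a "CRC or check digit"), and a serialized frame stands in for the shared bus memory.

```python
# Sketch of one DIS exchange: the host broadcasts set points plus a check
# value; each joint processor validates the frame and extracts only its own
# entry. Field names and the JSON encoding are illustrative assumptions.
import json
import zlib

def build_dis(setpoints, global_command, binary_out=0):
    frame = {"cmd": global_command, "setpoints": setpoints,
             "binary_out": binary_out}
    payload = json.dumps(frame, sort_keys=True).encode()
    return payload + zlib.crc32(payload).to_bytes(4, "big")  # append CRC

def joint_read(dis, joint_index):
    payload, crc = dis[:-4], int.from_bytes(dis[-4:], "big")
    if zlib.crc32(payload) != crc:       # validity check on the whole frame
        raise ValueError("DIS frame corrupted")
    frame = json.loads(payload)
    # Each processor takes only its own set point from the broadcast frame.
    return frame["cmd"], frame["setpoints"][joint_index]

dis = build_dis([10.0, 20.0, 30.0], global_command="MOVE")
cmd, sp = joint_read(dis, joint_index=1)   # joint 1 takes its own set point
```

Because every processor sees the same broadcast frame, the CRC check lets each one independently reject a corrupted cycle rather than acting on bad set points.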
computer to close and exert a force of, say, 8 oz. Once the processor received the
command, it would perform its task independently of the "central control unit"
and upon completion of its task send back a signal indicating success or the reason
for failure.
To facilitate the "teaching" of points, the system may include another program
by which the trainer uses a joystick, keys, or other control to cause the manipulator
to move either in a joint-by-joint fashion or along a straight line. In either case,
set points are generated and fed to the servos, which in turn must report their
positions back to the control program. When the trainer is satisfied with the
position of the manipulator, the values of all the variables relating to the position
of the manipulator are saved and given a name such as "this_point." While the
name "this_point" may be associated with a specific position of the manipulator,
it should be understood that this may be a complex data structure and will carry
information not readily apparent to or needed by the trainer.
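A minimal sketch of such a point store follows (Python). The record fields are illustrative assumptions; the point is that the trainer sees only the name while the saved record carries additional controller state.

```python
# Sketch of a teach program's point store: when the trainer is satisfied with
# a pose, everything the controller needs to reproduce it is saved under a
# name. The extra fields (speed, gripper) are hypothetical examples of state
# the trainer never sees directly.
taught_points = {}

def save_point(name, joint_values, speed=100, gripper="open"):
    taught_points[name] = {"joints": tuple(joint_values),
                           "speed": speed,
                           "gripper": gripper}

save_point("this_point", [12.5, -40.0, 90.0, 0.0, 45.0, 0.0])
record = taught_points["this_point"]
```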
Once this type of system has been designed and debugged, it will be used by
people who may be less skilled in the art than its designers. From the user's point
of view, the functionality described in Section 2.4 is implemented primarily by
high-level programming instructions. For example, using the programming lan-
guage, the trainer may command the robot to go in a straight line from its current
location to another simply by using a function such as

move_straight_to(next_point);

The argument of the function "next_point" may have been created by the "teaching
program" discussed previously.
As pointed out earlier, the function "move_straight_to()" was defined by
the designers, and the details of its exact implementation are probably of little
interest to the user, whose primary interest is to be able to use the function along
with some data to cause the manipulator to perform a specific function. Of course,
other functions must also exist in the library. These instructions provide the ability
to perform the following actions:
7.9 SUMMARY
7.10 PROBLEMS
7.1 Design the software architecture of a robot controller that utilizes a multitasking op-
erating system to provide the following:
• Service a terminal
• Interpret terminal commands
• Perform computations
• Wait for specific events:
- joints at set points
- intrusion detection
• Execute the instructions of a robot program
Modify the design so that an editor may run and a program be entered while the
robot controller is running another program and executing its other defined tasks.
If you had to prioritize activities, which would be the most important? The least important?
7.2 Pick a specific processor (such as a 68000 or 8086) and investigate what commercially
available bus architectures, operating systems, and programming languages are available.
Based on robotic considerations as defined in Chapters 2 and 7, what combinations
provide the most support for the desired functionality at the least cost? Which provide
the most flexibility?
7.3 Based on the discussions in Chapter 8, determine the time it takes to multiply two
4 x 4 matrices using languages such as BASIC, FORTRAN, Pascal, and C; also determine
the time factor if coded in assembly language. Use a computer with support for more
than one of these languages to simulate this. Consider fixed-point and floating-point
numbers along with a general matrix multiplication routine and one that is specific to
DH matrices.
7.4 Using the fixed instruction sequence control definition of the MAKER 100 Mark 1
controller (see Section 7.6.1.1), program the stacking application that was accom-
plished in VAL (see Table 7.6.2) and described in Section 7.6.3. Assume that there
are eight binary outputs and that straight-line motion is possible. Use mnemonics to
define positions (remember that each position must be taught previously) and other
operations such as gripper state and the states of output lines. Since no terminal device
is available, use a subset of the binary outputs to indicate which object has been stacked,
and a special one to indicate that the program has completed its task. Comment on
the fixed instruction sequence versus the VAL implementation.
7.5 Obtain descriptions of at least two commercial programming languages from their man-
ufacturers. Use these languages to reprogram the stacking application (see Table 7.6.2
and Section 7.6.3). Compare and comment on the similarity of the functionality of the
different languages (for instance, straight-line motion, operations on positions in space,
terminal display, and so on). Are any of the languages better for performing this
object-stacking application? Next consider the programming skill required for each
language and comment on whether people with explicit computer programming skills
could easily understand the construction of the language and effectively use it. For the
languages you have chosen, are any applications better suited for implementation by
one versus the others?
7.6 Using a personal computer, program a robot simulator using two-dimensional graphics
and a fixed instruction sequence control scheme. For example, assume that we graph-
ically show the x-y plane of an r-θ manipulator and the annulus of its workspace. Also,
the display will be used to trace the trajectory of the tool tip of our simulated robot.
Moreover, the state of each of the output lines will be shown along with any other
pertinent information.
Define a fixed instruction sequence set (similar to that of the MAKER 100 de-
scribed in Section 7.6.1.1). One possible suggestion is to input the robot program into
a file which is read by your simulation program. As each instruction is read from the
file, update the display to show the manipulator position, output line state, gripper
state, and so forth. Note that it may be necessary to query for the input states prior
to updating the display.
7.7 Find commercial examples of robotic extensions to computer languages. Comment on
the similarity of the extensions in terms of functionality.
Inc., 1980.
2. Bonner, Susan, and Kang G. Shin, "A Comparative Study of Robot Languages," Computer
(IEEE Computer Society), Vol. 15, No. 12, December 1982, pp. 82-96.