Korn Shell (KSH) Programming
Tutorial
Philip Brown
6.2. A trivial function
6.3. Debugging your functions
6.4. CRITICAL ISSUE: exit vs return
6.5. CRITICAL ISSUE: "scope" for function variables!
6.6. Write Comments!
7. Ksh built-in functions
7.1. Read and Set
7.2. The test function
7.3. Built-in math
8. Redirection and Pipes
8.1. Redirection
8.2. Inline redirection
9. Pipes
9.1. Combining pipes and redirection
9.2. Indirect redirection (Inline files)
10. Other Stuff
10.1. eval
10.2. Backticks
10.3. Text positioning/color games
10.4. Number-based menus
10.5. Raw TCP access
10.6. Graphics and ksh
11. Paranoia, and good programming practices
11.1. Comment your code
11.2. INDENT!
11.3. Error checking
11.4. cron job paranoia
12. Example of script development
12.1. The good, the bad, and the ugly
12.2. The newbie programmer version
12.3. The sysadmin-in-training version
12.4. The Senior Admin version
12.5. The Senior Systems Programmer version
13. Summary of positive features
1. Introduction
This is the top level of my "Intro to Korn shell programming" tree. Korn shell is a 'shell-scripting' language, as well as a user-level login shell. It is also a superset of the POSIX shell specification, which is great for ensuring portability. Why bother with scripting at all?
1. Scripting commands tend to be more readable than low-level code (with the exception of perl).
2. Scripting languages tend to come with powerful tools attached.
3. There is no "compile" phase, so "tweaking" can be done rapidly.
UNIX tends to take #2 to extremes, since it comes standard with "powerful tools" that can be
strung together with pipes or other mechanisms, to get the result you want, with a short
development time. It may not be as efficient as a fully compiled solution, but quite often it can
"get the job done" in a few seconds of run time, compared to 1 second or less for a compiled
program.
A quick scripting solution can be used as a prototype. Then, when you have been using the
prototype happily for a while, and you have evolved the behaviour your end users are happiest
with, you can go back and code a faster, more robust solution in a lower-level language.
That is not to say that scripts cannot be robust! It is possible to do a great deal of error
checking in scripts. Unfortunately, it is not common practice to do so.
You have access to the full range of UNIX utilities, plus some nifty built-in resources.
Generally speaking, UNIX scripting is a matter of using the various command line utilities as
appropriate, with the shell as a means of facilitating communication between each step.
Unfortunately, running all these separate 'external' programs can sometimes result in things
working rather slowly. Which is why ksh has a few more things "built in" to it than the older
'sh'.
1.3. Why ksh, not XYZsh for programming?
Bourne shell has been THE "standard" UNIX shellscript language for many years. However,
it does lack some things that would be very useful, at a basic level. Some of those things were
added to C-shell (csh) later on. However, csh is undesirable to use in programming, for
various reasons.
Happily, ksh adds most of the things that people say csh has, but sh does not. So much so, that
ksh became the basis for the "POSIX shell". Which means that all properly POSIX-compliant
systems MUST HAVE something compatible, even though it is now named "sh" like the old
Bourne shell. (For example, /usr/xpg4/bin/sh is a LINK to /bin/ksh on Solaris!) More
precisely, the behaviour of a POSIX-compliant shell is specified in "IEEE POSIX 1003.2".
BTW: "Korn shell" was written by "David Korn", around 1982 at AT&T labs. You can now
freely download the full source from AT&T if you're not lucky enough to use an OS that
comes with ksh already. It's "open source", even!
2. Ksh preparedness
Howdy, and welcome to the intro to Korn shell programming, AKA the POSIX shell.
This is the first part of the larger tutorial. It assumes you want to learn how to really be a
programmer in ksh, as opposed to someone who just quickly throws something together in a
file and stops as soon as it works.
This particular chapter assumes you have done minimal or no sh programming before, so has
a lot more general stuff. Here are the most important things to know and do, before really
getting serious about shellscripting.
You will have to be very comfortable with your choice of text editor, because that's how you
make shellscripts. All examples given should be put into some file. You can then run it with
"ksh file".
Or, do it the more official way: put the directions below, exactly as-is, into a file, and follow
the directions in it.
#!/bin/ksh
# the above must always be the first line. But generally, lines
# starting with '#' are comments. They don't do anything.
# This is the only time I will put in the '#!/bin/ksh' bit. But
# EVERY EXAMPLE NEEDS IT, unless you want to run the examples with
# 'ksh filename' every time.
#
# If for some odd reason, you don't have ksh in /bin/ksh, change
# the path above, as appropriate.
#
# Then do 'chmod 0755 name-of-this-file'. After that,
# you will be able to use the filename directly like a command
In shellscripts, a variable can contain a collection of letters and/or numbers (aka a 'string'), as
well as pure numbers.
# Okay, this script doesn't do anything useful, it is just for demo purposes,
# and normally I would put in more safety checks, but this is a quickie.
INPUTFILE="$1"
USERLIST="$2"
OUTPUTFILE="$3"
count=0
while read line ; do
grep "$line" "$USERLIST" >> "$OUTPUTFILE"
count=$(($count+1))
done < "$INPUTFILE"
echo $count
While the script may not be totally readable to you yet, I think you'll agree it is a LOT clearer
than the following:
i=0
while read line ; do
grep $line $2 >> $3
i=$(($i+1))
done <$1
echo $i
echo "$PWD"
prints out your current directory
echo '$PWD'
prints out the string $PWD
echo $PWDplusthis
prints out NOTHING. There is no such variable "PWDplusthis".
echo "$PWD"plusthis
prints out your current directory, and the string "plusthis" immediately following it. You
could also accomplish this with the alternate form of using variables,
echo ${PWD}plusthis
There is also what is sometimes called the `back quote`, or `backtick`. This is not used to
quote things, but actually to evaluate and run commands.
3. Ksh basics
This is a quickie page to run through basic "program flow control" commands, if you are
completely new to shell programming. The basic ways to shape a program, are loops, and
conditionals. Conditionals say "run this command, IF some condition is true". Loops say
"repeat these commands" (usually, until some condition is met, and then you stop repeating).
3.1. Conditionals
3.1.1. IF
The final 'fi' is required. This is to allow you to group multiple things together. You can have
multiple things between if and else, or between else and fi, or both.
You can even skip the 'else' altogether, if you don't need an alternate case.
if [ $? -eq 0 ] ; then
print we are okay
print We can do as much as we like here
fi
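When you do want an alternate case, the same structure extends naturally. Here is a minimal sketch (the file being tested is just an arbitrary example):

```shell
# pick a message depending on whether a file exists
if [ -f /etc/hosts ] ; then
    msg="we have a hosts file"
else
    msg="no hosts file on this system"
fi
echo "$msg"
```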
3.1.2. CASE
The case statement functions like 'switch' in some other languages. Given a particular variable,
jump to a particular set of commands, based on the value of that variable.
While the syntax is similar to C on the surface, there are some major differences:
read answer
case $answer in
y*|Y*)
echo got a 'yes'
;;
n*|N*)
echo got a 'no'
;;
q*|Q*)
#assume the user wants to quit
exit
;;
*)
echo This is the default clause. we are not sure why or
echo what someone would be typing, but we could take
echo action on it here
;;
esac
3.2. Loops
3.2.1. WHILE
The basic loop is the 'while' loop; "while" something is true, keep looping.
There are two ways to stop the loop. The obvious way is when the 'something' is no longer
true. The other way is with a 'break' command.
keeplooping=1;
while [[ $keeplooping -eq 1 ]] ; do
read quitnow
if [[ "$quitnow" = "yes" ]] ; then
keeplooping=0
fi
if [[ "$quitnow" = "q" ]] ; then
break;
fi
done
3.2.2. UNTIL
The other kind of loop in ksh, is 'until'. The difference between them is that 'while' implies
looping while something remains true.
'until' implies looping until something that is false becomes true.
until [[ $stopnow -eq 1 ]] ; do
echo just run this once
stopnow=1;
echo we should not be here again.
done
3.2.3. FOR
A "for loop", is a "limited loop". It loops a specific number of times, to match a specific
number of items. Once you start the loop, the number of times you will repeat is fixed.
for var in one two three ; do
echo $var
done
Whatever name you put in place of 'var', will be updated by each value following "in". So the
above loop will print out
one
two
three
But you can also have variables defining the item list. They will be checked ONLY ONCE,
when you start the loop.
list="one two three"
for var in $list ; do
echo $var
# Note: Changing this does NOT affect the loop items
list="nolist"
done
The two things to note are: changing the list variable inside the loop does not affect the items
being looped over, and if you used "$list" (with quotes) in the 'for' line, it would print out a
SINGLE LINE, "one two three".
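You can see the difference for yourself with a sketch like this:

```shell
list="one two three"

count=0
for var in $list ; do          # unquoted: three separate words
    count=$((count + 1))
done
echo "unquoted gave $count loop iterations"

qcount=0
for var in "$list" ; do        # quoted: one single string
    qcount=$((qcount + 1))
done
echo "quoted gave $qcount loop iteration"
```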
4.2. Arrays
Yes, you CAN have arrays in ksh, unlike old bourne shell. The syntax is as follows:
# This is an OPTIONAL way to quickly null out prior values
set -A array
#
array[1]="one"
array[2]="two"
array[3]="three"
three=3
print ${array[1]}
print ${array[2]}
print ${array[3]}
print ${array[three]}
To give a default value if and ONLY if a variable is not already set, use this construct:
APP_DIR=${APP_DIR:-/usr/local/bin}
(KSH only)
You can also get funky, by running an actual command to generate the value. For example
DATESTRING=${DATESTRING:-$(date)}
(KSH only)
To count the number of characters contained in a variable string, use ${#varname}.
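For example:

```shell
varname="hello world"
echo ${#varname}     # prints 11: the string is eleven characters long
```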
5. Ksh and POSIX utilities
POSIX.2 compliant systems (eg: most current versions of UNIX) come
with certain incredibly useful utilities. The short list is:
cut, join, comm, fmt, grep, egrep, sed, awk
Any of these commands (and many others) can be used within your shellscripts to manipulate
data.
Some of these are programming languages themselves. Sed is fairly complex, and AWK is
actually its own mini-programming language. So I'll just skim over some basic hints and
tricks.
5.1. cut
"cut" is a small, lean version of what most people use awk for. It will "cut" a file up into
columns, and particularly, only the columns you specify. Its drawbacks are:
1. It is picky about argument order. You MUST use the -d argument before the -f
argument
2. It defaults to a tab, SPECIFICALLY, as its delimiter of columns.
The first one is just irritating. The second one is a major drawback, if you want to be flexible
about files. This is the reason why AWK is used more, even for this trivial type of operation:
awk defaults to letting ANY whitespace define columns.
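A quick sketch of cut in action, on an /etc/passwd-style line (the sample data here is made up):

```shell
# pull out fields 1 and 5, colon-delimited.
# Note the -d argument comes BEFORE the -f argument.
echo "john_s:x:100:100:John Smith:/home/john_s:/bin/ksh" |
    cut -d: -f1,5
```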
5.2. join
join is similar to a "database join" command, except it works with files. If you have two files,
both with information sorted by username, you can "join" them in one file, IF and ONLY IF
they are also sorted by that same field. For example
john_s John Smith
in one file, and
john_s 1234 marlebone rd
will be joined to make a single line,
john_s John Smith 1234 marlebone rd
If the files do not already have a common field, you could either use the paste utility to join
the two files, or give each file line numbers before joining them.
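Using the sample lines above, a join session might be sketched like this (the temporary file names are made up):

```shell
# two files, both already sorted on the username field
echo "john_s John Smith" > /tmp/names.$$
echo "john_s 1234 marlebone rd" > /tmp/addrs.$$

joined=$(join /tmp/names.$$ /tmp/addrs.$$)
echo "$joined"    # john_s John Smith 1234 marlebone rd

rm -f /tmp/names.$$ /tmp/addrs.$$
```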
5.3. comm
I think of "comm" as being short for "compare", in a way. But technically, it stands for
"common lines". First, run any two files through "sort". Then you can run 'comm file1 file2' to
tell you which lines are ONLY in file1, or ONLY in file2, or both. Or any combination.
For example
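Here is a small sketch (the file contents are made up):

```shell
# two pre-sorted files
printf "apple\nbanana\n" > /tmp/f1.$$
printf "banana\ncherry\n" > /tmp/f2.$$

# three columns of output: only-in-f1, only-in-f2, in-both
comm /tmp/f1.$$ /tmp/f2.$$

# -23 suppresses columns 2 and 3, leaving lines unique to the first file
only_in_first=$(comm -23 /tmp/f1.$$ /tmp/f2.$$)
echo "$only_in_first"    # apple

rm -f /tmp/f1.$$ /tmp/f2.$$
```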
5.4. fmt
fmt is a simple command that takes some informational text file, and word-wraps it nicely to
fit within the confines of a fixed-width terminal. Okay, it isn't so useful in shellscripts, but it's
cool enough I just wanted to mention it :-)
pr is similarly useful. But where fmt is more oriented towards paragraphs, pr is more
specifically aimed at page-by-page formatting.
5.5. grep
(Note: this is just an example: often, awk is more suitable than grep, for /etc/passwd fiddling)
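A minimal sketch of the sort of thing meant here: pulling out lines that end in a particular login shell (the sample line is made up):

```shell
# print only lines whose last colon-separated field is /bin/ksh
echo "john_s:x:100:100:John Smith:/home/john_s:/bin/ksh" |
    grep ':/bin/ksh$'
```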
5.6. sed
Sed actually has multiple uses, but its simplest use is "substitute this string, where you see that
string". The syntax for this is
sed 's/oldstring/newstring/'
This will look at every line of input, and change the FIRST instance of "oldstring" to
"newstring".
If you want it to change EVERY instance on a line, you must use the 'global' modifier at the
end:
sed 's/oldstring/newstring/g'
If you want to substitute either an oldstring or a newstring that has slashes in it, you can use a
different separator character:
sed 's:/old/path:/new/path:'
5.7. awk
Awk really deserves its own tutorial, since it is its own mini-language. And, it has one!
But if you don't have time to look through it, the most common use for AWK is to print out
specific columns of a file. You can specify what character separates columns. The default is
'whitespace' (space, or TAB). But the canonical example is, "How do I print out the first and
fifth columns/fields of the password file?"
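The canonical answer presumably looks something like this (shown here in pipe form, with a made-up sample line):

```shell
# -F: tells awk that fields are separated by colons, not whitespace
echo "john_s:x:100:100:John Smith:/home/john_s:/bin/ksh" |
    awk -F: '{print $1, $5}'
```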
The bit between single-quotes is a mini-program that awk interprets. You can tell awk
filename(s), after you tell it what program to run. OR you can use it in a pipe.
You must use single-quotes for the mini-program, to avoid $1 being expanded by the shell
itself. In this case, you want awk to literally see '$1'.
If you are interested in learning more about AWK, read my AWK tutorial.
6. Ksh Functions
Functions are the key to writing just about ANY program that is longer than a page or so of
text. Other languages may call functions something else. But essentially, it's all a matter of
breaking up a large program into smaller, manageable chunks. Ideally, functions are sort of
like 'objects' for program flow. You pick a part of your program that is pretty much self-
contained, and make it into its own 'function'.
When your program isn't working properly (WHEN, not if), you can then put in little debug
notes to yourself in the approximate section you think is broken. If you suspect a function is
not working, then all you have to verify is that the function gives the correct output, for a
given set of inputs.
Once you have done that, you then know the entire function is correct, for that particular set
of input(s), and you can look for errors elsewhere.
printmessage() {
echo "Hello, this is the printmessage function"
}
printmessage
The first part, from the first "printmessage()" all the way through the final '}', is the function
definition. It only defines what the function does, when you decide to call it. It does not DO
anything, until you actually say "I want to call this function now".
You call a function in ksh by pretending it is a regular command, as shown above. Just have
the function name as the first part of your line. Or put it any other place commands go.
Remember: a function acts just like its own separate shellscript. Which means if you access
"$1" in a function, it is the first argument passed in to the function, not to the shellscript.
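A tiny sketch of that behaviour (the function name is made up):

```shell
showargs(){
    # $1 here is the first argument to the FUNCTION
    echo "function arg 1 is: $1"
}
showargs hello goodbye    # prints "function arg 1 is: hello"
```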
This same type of modularity can be achieved by making separate script files, instead of
functions. In some ways, that is almost preferable, because it is then easier to test each part by
itself. But functions run much faster than separate shellscripts.
A nice way to start a large project is to start with multiple, separate shellscripts, but then
encapsulate them into functions in your main script, once you are happy with how they work.
'exit' will exit the entire script, whether it is in a function or not.
'return' will just quit the function. Like 'exit', it can return the default "success" value
of 0, or any number from 1-255 that you specify. You can then check the return value of a
function just the same way you can check the return value of an external program, with the
$? variable.
fatal(){
echo FATAL ERROR
# This will quit the 'fatal' function, and the entire script that
# it is in!
exit
}
lessthanfour(){
if [[ "$1" = "" ]] ; then echo "hey, give me an argument" ; return 1; fi
# return "success" (0) only if the argument is less than four
if [[ $1 -lt 4 ]] ; then return 0 ; fi
return 1
}
echo note that the above functions are not even called. They are just
echo defined
# You must use a modern sh like /bin/ksh, or /bin/bash for this
subfunc(){
typeset var
echo sub: var starts as $var '(empty)'
var=2
echo sub: var is now $var
}
var=1
echo var starts as $var, before calling function '"subfunc"'
subfunc # calls the function
echo var after function is now $var
Another exception to this is if you call a function in the 'background', or as part of a pipe (like
echo val | function )
This makes the function be called in a separate ksh process, which cannot dynamically share
variables back to the parent shell. Another way that this happens, is if you use backticks to
call the function. This treats the function like an external call, and forks a new shell. This
means the variable from the parent will not be updated. Eg:
func() {
newval=$(($1 + 1))
echo $newval
echo in func: newval ends as $newval
}
newval=1
echo newval in main is $newval
output=`func $newval`
func $newval
echo output is : $output
echo newval finishes in main as $newval
See the manpages for 'test' and 'typeset', if you want full info on those beasties.
set $varname
This sets the argument variables $1, $2, etc to be set as if the program were called with
$varname as the argument string to the shellscript. So, if the value of varname is "first second
third", then $1="first", $2="second", and $3="third".
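For example:

```shell
varname="first second third"
set $varname
echo $1    # first
echo $2    # second
echo $3    # third
```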
Note that if you want to access "double-digit" arguments, you cannot use "$10". It will get
interpreted as "$1" followed by "0". To access argument #10 and higher you must explicitly
define the limits of the variable string, with braces:
echo ${10}
This is also good to know, if you wish to follow a variable immediately followed by a string.
Compare the output from the following lines:
a="A "
echo $astring
echo ${a}string
Please note that [[]] is a special built-in version of test, that is almost, but not 100%, like the
standard []. The main difference being that wildcard expansion does not work within [[]].
four=$((2 + 2))
eight=$(($four + 4))
print $(($four * $eight))
Warning: Some versions of ksh allow you to use floating point with $(()). Most do NOT.
Also, be wary of assumptions. Being "built in" is not always faster than an external program.
For example, it is trivial to write a shell-only equivalent of the trivial awk usage, "awk '{print
$2}'", to print the second column. However, compare them on a long file:
# function to emulate awk '{print $2}'
sh_awk(){
while read one two three ; do
print $two
done
}
The awk version will be much much faster. This is because ksh scripts are interpreted, each
and every time it executes a line. AWK, however, loads up its programming in one go, and
figures out what it is doing ONE TIME. Once that overhead has been put aside, it then can
repeat its instructions very fast.
ls > /tmp/listing
But bourne-shell derivatives give you even more power than that.
If you know which of the categories your utilities fall into, you can do interesting things.
8.1. Redirection
An uncommon program to use for this example is the "fuser" program under Solaris. It gives
you a long listing of what processes are using a particular file. For example:
$ fuser /bin/sh
/bin/sh: 13067tm 21262tm
If you wanted to see just the processes using that file, you might initially groan and wonder
how best to parse it with awk or something. However, fuser actually splits up the data for you
already. It puts the stuff you may not care about on stderr, and the meaty 'data' on stdout. So if
you throw away stderr, with the '2>' special redirect, you get
$ fuser /bin/sh 2>/dev/null
13067 21262
Unfortunately, not all programs are that straightforward :-) However, it is good to be aware of
these things, and also of status returns. The 'grep' command actually returns a status based on
whether it found a line. The status of the last command is stored in the '$?' variable. So if all
you care about is, "is 'biggles' in /etc/hosts?" you can do the following:
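Something along these lines ('biggles' is the hostname from the text above; whether it is actually present will vary):

```shell
grep biggles /etc/hosts > /dev/null 2>&1
# $? holds the status of the grep: 0 means "found a match"
if [ $? -eq 0 ] ; then
    found=yes
else
    found=no
fi
echo "biggles in /etc/hosts: $found"
```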
Additionally, if there are some fixed lines you want to use, and you do not want to bother
making a temporary file, you can pretend part of your script is a separate file! This is done
with the special '<<' redirect operator.
EOF is the traditional string. But you can actually use any unique string you want.
Additionally, you can use variable expansion in this section!
DATE=`date`
HOST=`uname -n`
mailx -s 'long warning' root << EOF
Something went horribly wrong with system $HOST
at $DATE
EOF
9. Pipes
In case you missed it before, pipes take the output of one command, and put it on the input of
another command. You can actually string these together, as seen here;
grep hostspec /etc/hosts | awk '{print $1}' | grep '^10\.1\.' | wc -l
This is a fairly easy way to find which entries in /etc/hosts both match a particular pattern in
their name, AND have a particular IP address range.
The "disadvantage" to this, is that it is very wasteful. Whenever you use more than one pipe at
a time, you should wonder if there is a better way to do it. And indeed for this case, there most
certainly IS a better way:
There is actually a way to do this with a single awk command. But this is not a lesson on how
to use AWK!
wc will report that it saw two files, "/dev/fd/4", and "/dev/fd/5", and each "file" had 1 line
each. From its own perspective, wc was called simply as
wc -l /dev/fd/4 /dev/fd/5
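The shell syntax that produces that sort of call is <(command), which turns a command's output into a /dev/fd/NN "file". A sketch (the exact fd numbers will vary):

```shell
# wc sees two "files", each containing one line
wc -l <(echo "one line") <(echo "another line")
```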
eval
Backticks
Text positioning/color/curses stuff
Number-based menus
Raw TCP access
Graphics and ksh
10.1. eval
The eval command is a way to pretend you type something directly. This is a very dangerous
command. Think carefully before using it.
One way of using eval is to use an external command to set variables that you do not know
the name of beforehand. Or a GROUP of variables. A common use of this is to set
terminal-size variables on login:
eval `resize`
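Here is a sketch of the dynamic-variable-name trick on its own (the variable names are made up):

```shell
name=PROJECT_DIR
value=/usr/local/project

# this makes the shell run the line "PROJECT_DIR=/usr/local/project"
eval $name=$value

echo $PROJECT_DIR    # /usr/local/project
```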
10.2. Backticks
There are ways to put the output of one command as the command line of another one. There
are two methods of doing this that are basically equivalent:
echo This is the uptime: `uptime`
echo This is the uptime: $(uptime)
Technically, the second one is the POSIX-preferred one.
In addition to creating dynamic output, this is also very useful for setting variables:
datestring=`date`
echo "The current date is: $datestring"
10.3. Text positioning/color games
This is actually a huge topic, and almost deserves its own tutorial. But I'm just going to
mention it briefly.
Some people may be familiar with the "curses" library. It is a way to manipulate and move
around text on a screen, regardless of what kind of "terminal" the user is using.
As mentioned, this is a potentially huge topic. So, I'm just going to give you a trivial example,
and say "Go read the man-page on tput". Well, okay, actually, you have to read the "tput"
manpage, AND either the "terminfo" or "termcap" manpage to figure out what magical 3-5
letter name to use. For example, it should tell you that "cup" is short for the "cursor_address"
command. But you must use "cup", NOT "cursor_address", with tput.
tput init
tput clear
tput cup 3 2
print -n "Here is a clean screen, with these words near the top"
endline=`tput lines`
tput cup $(($endline - 2)) 0
print "and now, back to you"
sleep 2
The above example clears the screen, prints the given line at a SPECIFIC place on the screen,
then puts the cursor back down near the bottom of the screen for you.
PS: If you've been doing a lot of funky things with the screen, you might want to do a
tput reset
as the last thing before your shellscript exits.
Note that this will loop between "do ... done" until you trigger a break somehow! (or until the
user control-c's, or whatever). So don't forget an exit condition!
Here is a trivial example that just opens up a connection to an SMTP server. Note that the
connection is half-duplex: You do NOT see data that you send to the other side.
#!/bin/ksh -p
MAILHOST=127.0.0.1
exec 3<>/dev/tcp/$MAILHOST/25 || exit 1
Note that we use the "-r" flag to read. In this particular example, it is not necessary. But in
the general case, it will give you the data "raw". Be warned that if the shell cannot open the
port, it will kill your entire script, with status 1, automatically.
You can also dump the rest of the data waiting on the socket, to wherever you like.
11.1. Comment your code.
You really should at MINIMUM have some comment about every page (that's every 24 lines).
Ideally, you should always comment all your functions. One-line functions can probably stand
by themselves, but otherwise, a quick single line comment is a good thing to have for small
functions.
For longer functions, you should really use a formal comment spec. Something like:
# Function xyz
# Usage: xyz arg1 arg2 arg3 arg4
# arg1 is the device
# arg2 is a file
# arg3 is how badly you want to mangle it
# arg4 is an optional output file
# Result: the file will be transferred to the device, with the
# appropriate ioctls. Any err messages will be saved in the output
# file, or to stderr otherwise
xyz(){
...
}
Note that shellscripts are themselves one large "function". So don't forget basic comments on
your shellscript's functionality at the top of the file!
11.2. INDENT!
Every time you start a function, indent.
This makes it easier to see at a glance what level you are at.
# top level
print this is the top
somefunction(){
# indent in functions
print we are in the function now
print leaving somefunction now
}
# And now we can clearly see that all this stuff is outside any function.
# This makes it easier to find the "main line" of the script
print original shellscript name is $0, args are $*
print lets try somefunction
somefunction heres some args
exit 1
# (but it would be nice to print out SOME message before exiting!)
Nice programs will notice that your script exited with a non-zero status. [Remember, the
status of the last command is in '$?']
Ideally, they will complain.
On the other hand, sometimes your own scripts are the ones that are doing the calling!
In that type of situation, it may be suitable to have a top-level script that keeps an eye on
things. A simple example is:
fatal(){
# Something went horribly wrong.
# print out an errormessage if provided, then exit with an
# "error" status
print "FATAL ERROR: ${1:-unknown error}" 1>&2
exit 1
}
check_on_modems
if [[ $? -ne 0 ]] ; then fatal modems ; fi
check_on_network
if [[ $? -ne 0 ]] ; then fatal network ; fi
check_on_servers
if [[ $? -ne 0 ]] ; then fatal servers ; fi
Note that even my paranoid 'fatal' function IS PARANOID!
Normally, it is assumed you will call it with "fatal what-failed". But if you somehow don't, it
notices, and provides a default.
Sometimes, making the assumption that $1 contains valid data can completely screw up the
rest of your function or script. So if it is an important function, assume nothing!
This is particularly true of CGI scripts. [Gasp]. Yes, Virginia, it IS possible to write CGI in
something other than perl.
cron by default saves anything that gets sent to 'stderr', and MAILS IT to the owner of the
cron job. So, sometimes, if you just want a minor error logged somewhere, it is sufficient to
just write the message to stderr.
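Something as simple as this (the message text is made up):

```shell
# anything a cron job writes to stderr gets mailed to the job owner
echo "WARNING: backup took longer than expected" 1>&2
```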
If you do not regularly read email for the user in question, you can either set up an alias for
that user, to forward all its email to you, or do
export MAILTO=my@address.here
The MAILTO trick does not work on all cron daemons, however.
12. Example of script development
12.1. The good, the bad, and the ugly
Hopefully, you have read through all the other chapters by this point. This page will show you
the "big picture" of shellscript writing.
Here are four versions of essentially the same program: a wrapper to edit a file under SCCS
version control.
The basic task is to use the sccs command to "check out" a file under version control, and then
automatically edit the file. The script will then be used by "users", aka coders, who may not
be particularly advanced UNIX users. Hence, the need for a wrapper script.
While the basic functionality is the same across all versions, the differences in safety and
usability between the first version and the last version are staggering.
The first one is extremely bad: it would be written by someone who has just picked up a book
on shellscripting, and has decided, "I'm a programmer now".
The second one is an improvement, showing some consideration to potential users by having
safety checks.
The third one is a good, solid version. It's a positive role model for your scripts.
The final one is a full-blown, paranoid, commented program unto itself, with whiz-bang
features. This is the way a professional programmer would write it. Don't let that put you off:
it's the way you can and should write it too! You might start with a version like the initial
dumb one, as the initial step in your code development, just to make sure you have the basic
functionality for the task. But after it is shown to work, you should upgrade it to a more
reasonable one immediately.
Note: there is a summary of good practices shown at the bottom of this page.
sccs edit $1
if [ "$EDITOR" = "" ] ; then
        EDITOR=vi
fi
$EDITOR $1
This version makes somewhat of an attempt to be user friendly, by having a check for a
user-specified EDITOR setting, and using it if available. However, there are no comments, no error
checking, and no help for the user whatsoever!
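As an aside, the EDITOR fallback in that version can also be written with a parameter-expansion default, which many consider more idiomatic; a sketch:

```shell
# ${EDITOR:-vi} expands to $EDITOR if it is set and non-empty,
# and to "vi" otherwise, without changing EDITOR itself.
# (${EDITOR:=vi} would additionally assign the default to EDITOR.)
EDITOR=${EDITOR:-vi}
```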
12.3. The sysadmin-in-training version
#!/bin/ksh
if [ $# -lt 1 ] ; then
        print "This program will check out a file, or files, with sccs"
        exit 1
fi
sccs edit $@
$EDITOR $@
This is somewhat of a step above the prior version. It accepts multiple files as potential
arguments. It's always nice to be flexible about the number of files your scripts can handle. It
also has a usage message, if the script is called without arguments. Plus, it's always a good
idea to put your name in it, unless you're working for the company of "Me, Myself and I,
Inc."
Unfortunately, there is still quite a bit lacking, as you can tell by comparing it to the next
version.
usage(){
        print sedit - a wrapper to edit files under SCCS
        print "usage: sedit file {file2 ...}"
}
PATH=$SCCSBIN:$PATH
if [ $# -lt 1 ] ; then
        usage
        print ERROR: no files specified
        exit 1
fi
# Yes, I could use "sccs edit $@" and check for a single error, but this
# approach allows for finer error reporting
for f in $@ ; do
        sccs edit $f
        if [ $? -ne 0 ] ; then
                print ERROR checking out file $f
                if [ "$filelist" != "" ] ; then
                        print "Have checked out $filelist"
                fi
                exit 1
        fi
        filelist="$filelist $f"
done
$EDITOR $filelist
if [ $? -ne 0 ] ; then
        print ERROR: $EDITOR returned error status
        exit 1
fi
This guy has been around the block a few times. He's a responsible sysadmin, who likes to be
disaster-prepared. In this case, the most likely "disaster" is 100 calls from developers asking
"Why doesn't it work for me?" So when things break, it's a good idea to provide as much
information as possible to the user.
Compare and contrast the first version of the program, to this one. Then try to make your own
scripts be more like this!
# Usage: see usage() function, below
usage(){
        print sedit - a wrapper to edit files under SCCS
        print "Usage: sedit [-c|-C] [-f] file {file2 ...}"
        print "  -c    check in file(s) after edit is complete"
        print "  -C    check in all files with single revision message"
        print "  -f    ignore errors in checkout"
}
if [ $# -lt 1 ] ; then
        usage
        print ERROR: no files specified
        exit 1
fi
# Yes, I could use "sccs edit $@" and check for a single error, but this
# approach allows for finer error reporting.
# "$@" is a special construct that catches spaces in filenames.
# Note that "$*" is NOT 100% the same thing.
for f in "$@" ; do
        sccs edit "$f"
        if [ $? -ne 0 ] ; then
                print ERROR checking out file $f
                if [ "$force" = "" ] ; then
                        if [ "$filelist" != "" ] ; then
                                print "Have checked out $filelist"
                        fi
                        exit 1
                fi
                # else, -f is in effect. Keep going
        fi
        filelist="$filelist $f"
done
This guy has been around the block a few times. Heck, he helped BUILD the block ;-)
This was originally my third and final version. It's the way I would really write the script. But
I decided it might be a bit daunting to new scripting folks, so I made a new intermediate third
step, above.
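The difference between "$@" and "$*" noted in the code comments can be demonstrated with a small sketch (the filenames are made up):

```shell
# Count how many arguments a function receives.
count_args(){
        echo $#
}

set -- "file one" file2      # two "filenames"; the first contains a space

count_args "$@"              # prints 2: each argument preserved intact
count_args "$*"              # prints 1: all arguments joined into one word
count_args $*                # prints 3: unquoted, so words are re-split
```

This is why the final version loops over "$@": a filename with an embedded space survives as a single argument.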
Provides optional extra functionality, where it makes sense (added -c, -C, and -f option
flags). This shows understanding of writing scripts, AND understanding of the area of the
task (SCCS version control).
Use of the 'getopts' standard util, rather than hand-coding a custom argument parser
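A sketch of the getopts pattern for the flags this version advertises (-c, -C, -f); the variable names are illustrative, not necessarily the ones used in the full script:

```shell
# Parse -c, -C, and -f with the getopts builtin, then report
# the resulting settings and the leftover (filename) arguments.
parse_flags(){
        checkin="" single_msg="" force=""
        OPTIND=1
        while getopts "cCf" opt ; do
                case $opt in
                c)  checkin=1 ;;
                C)  checkin=1 ; single_msg=1 ;;
                f)  force=1 ;;
                *)  echo "Usage: sedit [-c|-C] [-f] file {file2 ...}" >&2
                    return 1 ;;
                esac
        done
        shift $((OPTIND - 1))        # drop the parsed flags
        echo "checkin=$checkin force=$force files=$*"
}

parse_flags -c -f file1.c file2.c
# prints: checkin=1 force=1 files=file1.c file2.c
```

getopts handles clustered flags (-cf) and stops cleanly at the first non-option argument, which is exactly the drudgery a hand-rolled parser tends to get wrong.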