UNIX Shell Scripting With Ksh-Bash
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
1 de 56
13-09-2011 11:17
Table of Contents

 1. What is a shell script
 2. Why use shell scripts
 3. History
 4. Feature comparison
 5. Other scripting languages
 6. ksh/bash vs sh
 7. Basics
 8. Filename Wildcards
 9. Variables
10. Preset Variables
11. Arguments
12. Shell options
13. Command substitution
14. I/O redirection and pipelines
15. Input and output
16. Conditional Tests
17. Conditional Tests (contd.)
18. Flow control
19. Flow control (contd.)
20. Conditional test examples
21. Miscellaneous
22. Manipulating Variables
23. Functions
24. Advanced I/O
25. Wizard I/O
26. Coprocesses
27. Signals
28. Security
29. Style
30. Examples
31. Common external commands
32. References
1. What is a shell script
The shell itself has limited capabilities -- the power comes from using it as a "glue" language to combine the standard Unix utilities, and custom software, to produce a tool more useful than the component parts alone. Any shell can be used for writing a shell script. To allow for this, the first line of every script is: #!/path/to/shell (e.g. #!/bin/ksh).
The #! characters tell the system to locate the following pathname, start it up and feed it the rest of the file as input. Any program which can read commands from a file can be started up this way, as long as it recognizes the # comment convention. The program is started, and then the script file is given to it as an argument. Because of this, the script must be readable as well as executable. Examples are perl, awk, tcl and python.
If the file is made executable using chmod, it becomes a new command and available for use (subject to the usual $PATH search).
chmod +x myscript
A shell script can be as simple as a sequence of commands that you type regularly. By putting them into a script, you reduce them to a single command. Example (ex0):
#!/bin/sh
date
pwd
du -k
2. Why use shell scripts
    Create new commands using combinations of utilities in ways the original authors never thought of. Simple shell scripts might be written as shell aliases, but a script can be made available to all users and all processes. Shell aliases apply only to the current shell.
    Wrap programs over which you have no control inside an environment that you can control: e.g. set environment variables, switch to a special directory, create or select a configuration file, redirect output, log usage, and then run the program.
    Create customized datasets on the fly, and call applications (e.g. matlab, sas, idl, gnuplot) to work on them, or create customized application commands/procedures.
    Rapid prototyping (but avoid letting prototypes become production).
Typical uses
    System boot scripts (/etc/init.d)
    System administration: automating many aspects of computer maintenance, user account creation etc.
    Application package installation tools
Other tools may create fancier installers (e.g. tcl/tk), but cannot be assumed to be installed already. Shell scripts are used because they are very portable. Some software comes with a complete installation of the tool it wants to use (tcl/tk/python) in order to be self-contained, but this leads to software bloat.
    Application startup scripts, especially for unattended applications (e.g. started from cron or at)
    Any user needing to automate the process of setting up and running commercial applications, or their own code
    AUTOMATE, AUTOMATE, AUTOMATE
3. History of Shells
sh
aka "Bourne" shell, written by Steve Bourne at AT&T Bell Labs for Unix V7 (1979). Small, simple, and (originally) very few internal commands, so it called external programs for even the simplest of tasks. It is always available on everything that looks vaguely like Unix.
csh
The "C" shell. (Bill Joy, at Berkeley). Many things in common with the Bourne shell, but many enhancements to improve interactive use. The internal commands used only in scripts are very different from "sh", and similar (by design) to the "C" language syntax.
tcsh
The "TC" shell. Freely available and based on "csh". It has many additional features to make interactive use more convenient.
We use it as the default interactive shell for new accounts on all of our public systems. Not many people write scripts in [t]csh. See Csh Programming Considered Harmful by Tom Christiansen for a discussion of problems with programming csh scripts.

ksh
The "Korn" shell, written by David Korn of AT&T Bell Labs (now AT&T Research). Written as a major upgrade to "sh" and backwards compatible with it, but has many internal commands for the most frequently used functions. It also incorporates many of the features from tcsh which enhance interactive use (command line history recall etc.).
It was slow to gain acceptance because earlier versions were encumbered by AT&T licensing. This shell is now freely available on all systems, but sometimes not installed by default on "free" Unix. There are two major versions. ksh88 was the version incorporated into AT&T SVR4 Unix, and may still be installed by some of the commercial Unix vendors. ksh93 added more features, primarily for programming, and better POSIX compliance.
POSIX 1003.2 Shell Standard. Standards committees worked over the Bourne shell and added many features of the Korn shell (ksh88) and C shell to define a standard set of features which all compliant shells must have.
On most systems, /bin/sh is now a POSIX compliant shell. Korn shell and Bash are POSIX compliant, but have many features which go beyond the standard. On Solaris, the POSIX/XPG4 commands which differ slightly in behaviour from traditional SunOS commands are located in /usr/xpg4/bin.

bash
The "Bourne again" shell. Written as part of the GNU/Linux Open Source effort, and the default shell for Linux and Mac OS X. It is a functional clone of sh, with additional features to enhance interactive use, add POSIX compliance, and partial ksh compatibility.
zsh
A freeware functional clone of sh, incorporating parts of ksh and bash, full POSIX compliance, and many new interactive command-line editing features.
4. Feature comparison
Features common to both shell families:

    Pass expanded command line arguments to programs; get exit status back
    Pass environment variables to programs
    Expand filename wildcards using []*? (each shell has some additional wildcard metacharacters, but these are common to all shells)
    Standard I/O redirection and piping with <, >, >>, |
    A few internal functions (cd)
    Backgrounding commands with &
    Quoting rules: "double quotes" protect most things, but allow $var interpretation; 'single quotes' protect all metacharacters from interpretation
    Home directory expansion using ~user (except for sh)
    Comments using #
    Command substitution using `command` (backticks)
    Expand variables using $varname syntax
    Conditional execution using && and ||
    Line continuation with "\"
Principal differences between sh (+derivatives) and csh (+derivatives):

    Syntax of all the flow control constructs and conditional tests
    Syntax for string manipulation inside of scripts
    Syntax for arithmetic manipulation inside of scripts
    Syntax for setting local variables (used only in the script) and environment variables (which are passed to child processes): setenv vs export
    Syntax for redirecting I/O streams other than stdin/stdout
    Login startup files (.cshrc and .login, vs .profile) and default options
    Reading other shell scripts into the current shell (source filename, vs . filename)
    Handling of signals (interrupts)
5. Other scripting languages
6. ksh/bash vs sh
Ksh and bash are both supersets of sh. For maximum portability, even to very old computers, you should stick to the commands found in sh. Where possible, ksh or bash-specific features will be noted in the following pages. In general, the newer shells run a little faster and scripts are often more readable because logic can be expressed more cleanly using the newer syntax. Many commands and conditional tests are now internal.
The philosophy of separate Unix tools each performing a single operation was followed closely by the designers of the original shell, so it had very few internal commands and used external tools for even trivial operations (like echo and [). Ksh and bash internally perform many of the basic string and numeric manipulations and conditional tests. Occasional problems arise because the internal versions of some commands like echo are not fully compatible with the external utility they replaced.

Every time a shell needs to run an external program, it must locate the program (via $PATH), fork() (which creates a second copy of the shell), adjust the standard input/output for the external program, and exec() (which replaces the second shell with the external program). This process is computationally expensive (relatively), so when a script does something trivial many times over in a loop, it saves a lot of time if the operation is handled internally.
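A minimal sketch of the cost difference: the loop below uses the shell's internal arithmetic, so no fork()/exec() happens per iteration; the commented-out `expr` form would spawn an external process every time around the loop.

```shell
# Count to 1000 using internal arithmetic - no external process per step
i=0
while [ $i -lt 1000 ]; do
    i=$((i + 1))
    # The old external equivalent, far slower in a loop:
    #   i=`expr $i + 1`
done
echo "loop finished at i=$i"
```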
If you follow textbooks on Bourne shell programming, all of the advice should apply no matter which of the Bourne-derived shells you use. Unfortunately, many vendors have added features over the years and achieving complete portability can be a challenge. Explicitly writing for ksh (or bash) and insisting on that shell being installed, can often be simpler. The sh and ksh man pages use the term special command for the internal commands - handled by the shell itself.
7. Basics
#
    As the first non-whitespace character on a line, flags the line as a comment, and the rest of the line is completely ignored. Use comments liberally in your scripts, as in all other forms of programming.
\
    As the last character on a line, causes the following line to be logically joined before interpretation. This allows single very long commands to be entered in the script in a more readable fashion. You can continue the line as many times as needed. This is actually just a particular instance of \ being used to escape, or remove the special meaning from, the following character.
;
    As a separator between words on a line, is interpreted as a newline. It allows you to put multiple commands on a single line. There are few occasions when you must do this, but often it is used to improve the layout of compound commands.
Exit status
Every command (program) has a value or exit status which it returns to the calling program. This is separate from any output generated. The exit status of a shell script can be explicitly set using exit N, or it defaults to the value of the last command run.
The exit status is an integer 0-255. Conventionally 0=success and any other value indicates a problem. Think of it as only one way for everything to work, but many possible ways to fail. If the command was terminated by a signal, the value is 128 plus the signal value.
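As a small sketch, the exit status of the previous command is available in $?, and must be saved immediately if it is wanted later:

```shell
true                   # a command that always succeeds
ok=$?                  # save the status right away - the next command resets $?
false                  # a command that always fails
bad=$?
echo "true returned $ok, false returned $bad"
```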
8. Filename Wildcards
The following characters are interpreted by the shell as filename wildcards, and any word containing them is replaced by a sorted list of all the matching files.
Wildcards may be used in the directory parts of a pathname as well as the filename part. If no files match the wildcard, it is left unchanged. Wildcards are not full regular expressions. Sed, grep, awk etc. work with more flexible (and more complex) string matching operators.

*
    Match any string of zero or more characters.
?
    Match any single character.
[...]
    Match any single character from the bracketed set. A range of characters can be specified with [ - ].
[!...]
    Match any single character NOT in the bracketed set.

An initial "." in a filename does not match a wildcard unless explicitly given in the pattern. In this sense filenames starting with "." are hidden. A "." elsewhere in the filename is not special.

Pattern operators can be combined. Example: chapter[1-5].* could match chapter1.tex, chapter4.tex, or chapter5.tex.old. It would not match chapter10.tex or chapter1.
9. Shell Variables
Scripts are not very useful if all the commands and options and filenames are explicitly coded. By using variables, you can make a script generic and apply it to different situations. Variable names consist of letters, numbers and underscores ([a-zA-Z0-9_]), cannot start with a number, and are case sensitive. Several special variables (always uppercase names) are used by the system -- resetting these may cause unexpected behaviour. Some special variables may be read-only. Using lowercase names for your own variables is safest.
srcfile=dataset1
    Creates (if it didn't exist) a variable named "srcfile" and sets it to the value "dataset1". If the variable already existed, it is overwritten. Variables are treated as text strings, unless the context implies a numeric interpretation. You can make a variable always be treated as a number. Note there must be no spaces around the "=".
srcfile=
    Gives the variable a null value (not the same as removing it).
set
    With no arguments, lists all the variables currently defined in the shell.
export srcfile
    Adds srcfile to the list of variables which will be made available to external programs through the environment. If you don't do this, the variable is local to this shell instance.
export
    Lists all the variables currently being exported - this is the environment which will be passed to external programs.
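A quick sketch of the difference (variable names are illustrative): an exported variable reaches child processes through the environment, an unexported one does not:

```shell
srcfile=dataset1
export srcfile
local_only=dataset2          # deliberately NOT exported

# A child shell sees only the exported variable through its environment
seen=$(sh -c 'echo "src=$srcfile local=$local_only"')
echo "$seen"
```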
Using variables
$srcfile
Prefacing the variable name with $ causes the value of the variable to be substituted in place of the name.
${srcfile}
If the variable is not surrounded by whitespace (or other characters that can't be in a name), the name must be surrounded by "{}" braces so that the shell knows what characters you intend to be part of the name. Example:
datafile=census2000

# Tries to find $datafile_part1, which doesn't exist
echo $datafile_part1.sas

# This is what we intended
echo ${datafile}_part1.sas
Conditional modifiers
There are various ways to conditionally use a variable in a command.
${datafile-default}
Substitute the value of $datafile, if it has been defined, otherwise use the string "default". This is an easy way to allow for optional variables, and have sensible defaults if they haven't been set. If datafile was undefined, it remains so.
${datafile=default}
Similar to the above, except if datafile has not been defined, set it to the string "default".
${datafile+default}
If variable datafile has been defined, use the string "default", otherwise use null. In this case the actual value $datafile is not used.
${datafile?"error message"}
Substitute the value of $datafile, if it has been defined, otherwise display datafile: error message. This is used for diagnostics when a variable should have been set and there is no sensible default value to use.
Placing a colon (:) before the operator character in these constructs has the effect of counting a null value the same as an undefined variable. Variables may be given a null value by setting them to an empty string, e.g. datafile= . Example: echo ${datafile:-mydata.dat} Echo the value of variable datafile if it has been set and is non-null, otherwise echo "mydata.dat".
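The following sketch runs through these modifiers (the variable name and filenames are illustrative only):

```shell
unset datafile
echo "${datafile-default.dat}"   # prints default.dat; datafile stays unset
echo "${datafile=default.dat}"   # prints default.dat AND sets datafile
echo "$datafile"                 # now default.dat

datafile=                        # null value, but set
echo "${datafile-other.dat}"     # prints nothing: set (though null) wins
echo "${datafile:-other.dat}"    # prints other.dat: ":" treats null as unset
echo "${datafile+present}"       # prints present: datafile is set
```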
10. Preset Variables
Login environment
$USER, $LOGNAME
    Set to the username of the current user.
$PATH
    The list of directories that will be searched for external commands. You can change this in a script to make sure you get the programs you intend, and don't accidentally get other versions which might have been installed.
$TERM
The terminal type in which the shell session is currently executing. Usually "xterm" or "vt100". Many programs need to know this to figure out what special character sequences to send to achieve special effects.
$PAGER
If set, this contains the name of the program which the user prefers to use for text file viewing. Usually set to "more" or "less" or something similar. Many programs which need to present multipage information to the user will respect this setting (e.g. man). This isn't actually used by the shell itself, but shell scripts should honour it if they need to page output to the user.
$EDITOR
If set, this contains the name of the program which the user prefers to use for text file editing. A program which needs to have the user manually edit a file might choose to start up this program instead of some built-in default (e.g. "crontab -e"). This also determines the default command-line editing behaviour in interactive shells.
$OLDPWD
    The previous directory (before the most recent cd command). However, changing directories in a script is often dangerous.
$? (readonly)
    Set to the exit status of the last command run, so you can test success or failure. Every command resets this so it must be saved immediately if you want to use it later.
$- (readonly)
    Contains the currently set shell option letters.
$IFS
    Internal Field Separators: the set of characters (normally space and tab) which are used to parse a command line into separate arguments. This may be set by the user for special purposes, but things get very confusing if it isn't changed back.
Process ID variables
$$ (readonly)
    Set to the process ID of the current shell - useful in making unique temporary file names, e.g. /tmp/$0.$$
$PPID (readonly)
    Set to the process ID of the parent process of this shell - useful for discovering how the script was called.
$! (readonly)
    Set to the process ID of the last command started in background - useful for checking on background processes.

Other special variables

$SECONDS
    Integer number of seconds since this shell was started. Can be used for timing commands.
$RANDOM
    Every time it is evaluated, $RANDOM returns a random integer in the range 0-32767. RANDOM may be set to "seed" the random number generator.
$LINENO (readonly)
    Always evaluates to the current line number of the script being executed - useful for debugging.
11. Arguments
The shell expands wildcards and makes variable and command substitutions as normal, then parses the resulting words by whitespace (actually special variable $IFS), and places the resulting text strings into the positional variables as follows:
$0, $1, $2, ... $9
The first 9 arguments are made available directly as $1-$9. To access more than 9, use shift, or $*, $@. The variable $0 contains the name of the script itself.
${10}, ${11}, ...
Positional arguments beyond $9 are available in ksh and bash. Remember to use braces to refer to them.
shift
discard $1 and renumber all the other variables. "shift N" will shift N arguments at once.
$#
contains the number of arguments that were set (not including $0).
$*
contains all of the arguments in a single string, with one space separating them.
$@
similar to $*, but if used in quotes, it effectively quotes each argument and keeps them separate. If any argument contains whitespace, the distinction is important.

e.g. if the argument list is: a1 a2 "a3 which contains spaces" a4
then: $1=a1, $2=a2, $3=a3 which contains spaces, $4=a4
and: $* = a1 a2 a3 which contains spaces a4
and: "$@" = "a1" "a2" "a3 which contains spaces" "a4"

Only the form "$@" preserves quoted arguments. If the arguments are being passed from the script directly to some other program, it may make a big difference to the meaning. Example (ex7):
#!/bin/sh
#
# Check positional argument handling
echo "Number of arguments: $#"
echo "\$0 = $0"

echo "Loop over \$*"
for a in $*; do
    echo \"$a\"
done

echo "Loop over \"\$@\""
for a in "$@"; do
    echo \"$a\"
done
Example (pickrandom): selects a random file from a directory. Uses the ksh RANDOM feature.
#!/bin/ksh
# Select a random image from the background logo collection
# This could be used to configure a screen saver, for example.
#
# This works even if the filenames contain spaces.

# switch to the logos directory to avoid long paths
logos=/afs/northstar/common/usr/lib/X11/logos/backgrounds
cd $logos

# '*' is a filename wildcard to match all files in the current directory
set *

# Use the syntax for arithmetic expressions. "%" is the modulo operator
# Shift arguments by a random number between 0 and the number of files
shift $(($RANDOM % $#))

# Output the resulting first argument
echo "$logos/$1"
12. Shell options
Startup options: ksh -options scriptname

-x    echo each line to stderr before executing it
-n    read commands and check for syntax errors, but do not execute
-a    all variables are automatically exported
-f    disable wildcard filename expansion (globbing)
set -x
    Set an option from within a shell script.
$-
    Contains the currently set option letters.

There are many other options, not often needed. Options in ksh and bash can also be set using long names (e.g. -o noglob instead of -f). Many options are unique to ksh or bash.
13. Command Substitution
sh syntax
`command`
A command (plus optional arguments) enclosed in backticks is executed and the standard output of that command is substituted. If the command produces multiline output, the newlines are retained. If the resultant string is displayed, unquoted, using echo, newlines and multiple spaces will be removed.
ksh/bash syntax
$(command)
This syntax is functionally the same as backticks, but commands can be more easily nested.
$(<file)
This is equivalent to `cat file`, but implemented internally for efficiency. Example (ex3):
#!/bin/ksh
echo Today is `date`

file=/etc/hosts
echo The file $file has $(wc -l < $file) lines

hostname -s > myhostname
echo This system has host name $(<myhostname)
14. I/O redirection and pipelines
Output redirection
> filename
Standard output (file descriptor 1) is redirected to the named file. The file is overwritten unless the noclobber option is set. The file is created if it does not exist. The special device file /dev/null can be used to explicitly discard unwanted output. Reading from /dev/null results in an End of File status.

>> filename
    Standard output is appended to the named file. The file is created if it does not exist.

>| filename
    Standard output is redirected to the named file, overwriting it even if the noclobber option is set.
Input redirection
< filename
Standard input (file descriptor 0) is redirected to the named file. The file must already exist.
Command pipelines
command | command [ | command ...]
Pipe multiple commands together. The standard output of the first command becomes the standard input of the second command. All commands run simultaneously, and data transfer happens via memory buffers. This is one of the most powerful constructs in Unix. Compound commands may also be used with pipes. Pipes play very nicely with multiprocessor systems.
No more than one command in a pipeline should be interactive (attempt to read from the terminal). This construct is much more efficient than using temporary files, and most standard Unix utilities are designed such that they work well in pipelines. The exit status of a pipeline is the exit status of the last command. In compound commands, a pipeline can be used anywhere a simple command could be used.
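As an illustrative sketch, this classic pipeline counts word frequencies; each stage is a small standard utility, and all of them run concurrently:

```shell
# split input into one word per line, group duplicates, count, rank by count
printf 'apple banana apple cherry apple\n' |
    tr ' ' '\n' |
    sort |
    uniq -c |
    sort -rn
```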
15. Input and output
Script output
echo
Print arguments, separated by spaces, and terminated by a newline, to stdout. Use quotes to preserve spacing. Echo also understands C-like escape conventions.
Beware that the shell may process backslashes before echo sees them (may need to double backslash). Internal in most shells, but was originally external.

\b  backspace         \c  suppress trailing newline
\f  formfeed          \n  newline
\r  carriage return   \t  tab
\v  vertical tab      \\  backslash
\0n where n is the 8-bit character whose ASCII code is the 1-, 2- or 3-digit octal number representing that character.

-n  suppress newline

print (ksh internal)
    Print arguments, separated by spaces, and terminated by a newline, to stdout. Print observes the same escape conventions as echo.
    -n  suppress newline
    -r  raw mode - ignore \-escape conventions
    -R  raw mode - ignore \-escape conventions and -options except -n
Script input
read var1 var2 rest
Read a line from stdin, parsing by $IFS, and placing the words into the named variables. Any left over words all go into the last variable. A '\' as the last character on a line removes significance of the newline, and input continues with the following line.
    -r  raw mode - ignore \-escape conventions
Example (ex4a):
#!/bin/sh
echo "Testing interactive user input: enter some keystrokes and press return"
read x more
echo "First word was \"$x\""
echo "Rest of the line (if any) was \"$more\""
16. Conditional Tests
File tests
-e file
    True if the file exists.
-t [filedescriptor]
    True if the open filedescriptor is associated with a terminal device. E.g. this is used to determine if standard output has been redirected to a file. This is rarely needed now, but is still often found.

$variable = text
    True if $variable matches text exactly.
$variable < text
    True if $variable comes before (lexically) text. Similarly, > tests whether it comes after.
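A small sketch exercising several of these tests on a scratch file (the path is illustrative):

```shell
tmp=/tmp/ftest.$$            # unique scratch name from the shell PID
date > "$tmp"

exists=no;   [ -e "$tmp" ] && exists=yes
readable=no; [ -r "$tmp" ] && readable=yes
isdir=no;    [ -d "$tmp" ] && isdir=yes
echo "exists=$exists readable=$readable directory=$isdir"

rm "$tmp"
```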
17. Conditional Tests (contd.)
$variable -ne number
    True if $variable, interpreted as a number, is not equal to number. Similarly, -eq = equal, -lt = less than, -le = less than or equal, -gt = greater than, -ge = greater than or equal.

$variable = pattern
    True if $variable matches pattern. If pattern contains no wildcards, then this is just an exact text match. The same wildcards as used for filename matching are used. The pattern must not be quoted. Since [[...]] is internal to the shell, the pattern in this case is treated differently and not filename-expanded as an external command would require.

file1 -nt file2
    True if file1 is newer than file2. Similarly, -ot = older than.
file1 -ef file2
    True if file1 is effectively the same as file2, after following symlinks and hard links.
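A sketch of unquoted pattern matching inside [[...]] (ksh/bash only; the filenames are illustrative):

```shell
filename=chapter10.tex

match_any=no
[[ $filename = chapter*.tex ]] && match_any=yes        # wildcard matches

match_range=no
[[ $filename = chapter[1-5].tex ]] && match_range=yes  # [1-5] is one character, so no match

echo "chapter*.tex: $match_any   chapter[1-5].tex: $match_range"
```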
&& (AND) and || (OR) syntax may be used within [[ ... ]]. Parentheses may be inserted to resolve ambiguities or override the default operator precedence rules. Examples:
if [[ -x /usr/local/bin/lserve && \
      -w /var/logs/lserve.log ]]; then
    /usr/local/bin/lserve >> /var/logs/lserve.log &
fi

pwent=`grep '^richard:' /etc/passwd`
if [ -z "$pwent" ]; then
    echo richard not found
fi
18. Flow control
list && list
    Execute the first list. If true (success), execute the second one.
list || list
Execute the first list. If false (failure), execute the second one. Example:
mkdir tempdir && cp workfile tempdir

sshd || echo "sshd failed to start"

You can use both forms together (with care) - they are processed left to right, and && must come first. Example:
mkdir tempdir && cp workfile tempdir || \
    echo "Failed to create tempdir"
if list; then list; [elif list; then list;] ... [else list;] fi
    Execute the first list, and if true (success), execute the "then" list, otherwise execute the "else" list. The "elif" and "else" lists are optional. Example:
if [ -r $myfile ]
then
    cat $myfile
else
    echo $myfile not readable
fi
while list; do list; done
until list; do list; done
    Execute the first list and if true (success), execute the second list. Repeat as long as the first list is true. The until form just negates the test. Example (ex4):
#!/bin/ksh
count=0
max=10
while [[ $count -lt $max ]]
do
    echo $count
    count=$((count + 1))
done
echo "Value of count after loop is: $count"
for identifier [in words]; do list; done
    Set identifier in turn to each word in words and execute the list. Omitting the "in words" clause implies using $@, i.e. the identifier is set in turn to each positional argument. Example:
for file in *.dat
do
    echo Processing $file
done
As with most programming languages, there are often several ways to express the same action. Running a command and then explicitly examining $? can be used instead of some of the above.

Compound commands can be thought of as running in an implicit subshell. They can have I/O redirection independent of the rest of the script. Setting of variables in a real subshell does not leave them set in the parent script. Setting variables in implicit subshells varies in behaviour among shells. Older sh could not set variables in an implicit subshell and then use them later, but current ksh can do this (mostly).

Example (ex11): reading a file line by line. The book by Randal Michael contains 12 example ways to read a file line by line, which vary tremendously in efficiency. This example shows the simplest and fastest way.
#!/bin/sh
#
# Demonstrate reading a file line-by-line, using I/O
# redirection in a compound command
# Also test variable setting inside an implicit subshell.
# Test this under sh and ksh and compare the output.

line="TEST"
save=

if [ -z "$1" ]; then
    echo "Usage: $0 filename"
else
    if [ -r $1 ]; then
        while read line; do
            echo "$line"
            save=$line
        done < $1
    fi
fi
echo "End value of \$line is $line"
echo "End value of \$save is $save"
19. Flow control (contd.)
case word in pattern) list ;; ... esac
    Compare word with each pattern in turn, and execute the first list for which the word matches. The patterns follow the same rules as for filename wildcards.
(ksh and bash only) A pattern-list is a list of one or more patterns separated from each other with a |. Composite patterns can be formed with one or more of the following:

?(pattern-list)
    Optionally matches any one of the given patterns.
*(pattern-list)
    Matches zero or more occurrences of the given patterns.
+(pattern-list)
    Matches one or more occurrences of the given patterns.
@(pattern-list)
    Matches exactly one of the given patterns.
!(pattern-list)
    Matches anything except one of the given patterns.
Example:
case $filename in
    *.dat)
        echo Processing a .dat file
        ;;
    *.sas)
        echo Processing a .sas file
        ;;
    *)
        # catch anything else that doesn't match patterns
        echo "Don't know how to deal with $filename"
        ;;
esac
break [n]
    Break out of the current (or n'th) enclosing loop. Control jumps to the next statement after the loop.
continue [n]
Resume iteration of the current (or n'th) enclosing loop. Control jumps to the top of the loop, which generally causes re-evaluation of a while or processing the next element of a for.
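A short sketch combining both: continue skips an item, break abandons the loop (the filenames are made up):

```shell
processed=""
for f in a.dat b.tmp c.dat STOP d.dat; do
    case $f in
        *.tmp) continue ;;   # skip temporary files
        STOP)  break ;;      # stop processing entirely
    esac
    processed="$processed $f"
done
echo "processed:$processed"   # a.dat and c.dat only
```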
. filename
Read the contents of the named file into the current shell and execute as if in line. Uses $PATH to locate the file, and can be passed positional parameters. This is often used to read in shell functions that are common to multiple scripts. There are security implications if the pathname is not fully specified.
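For instance (a sketch; the library path and function name are hypothetical), a file of shared function definitions can be read into the current shell with ".":

```shell
lib=/tmp/mylib.$$            # hypothetical shared-function file
cat > "$lib" <<'EOF'
greet() {
    echo "hello, $1"
}
EOF

. "$lib"                     # the definitions now exist in this shell
greeting=$(greet world)
echo "$greeting"

rm "$lib"
```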
( ... )
Command grouping
Commands grouped in "( )" are executed in a subshell, with a separate environment (they cannot affect the variables in the rest of the script).
20. Conditional test examples
These examples test whether a variable begins with "/" (an absolute pathname).

Use a case statement: works in all shells, and uses no extra processes.

Use `cut`: works in all shells, but inefficiently uses a pipe and an external process for a trivial task.

if [ "`echo $var | cut -c1`" = "/" ] ; then
Works with ksh, bash and other POSIX-compliant shells. Not obvious if you have not seen this one before. Fails on old Bourne shells. Dave Taylor in "Wicked Cool Shell Scripts" likes this one. Use POSIX pattern match inside of [[...]]:
if [[ $var = /* ]]; then
Works with ksh, bash and other POSIX-compliant shells. Note that you must use [[...]] and no quotes around the pattern.
The [[...]] syntax is handled internally by the shell and can therefore interpret "wildcard" patterns differently than an external command. An unquoted wildcard is interpreted as a pattern to be matched, while a quoted wildcard is taken literally. The [...] syntax, even if handled internally, is treated as though it were external for backward compatability. This requires that wildcard patterns be expanded to matching filenames.
ksh93 and later versions, and bash, have a syntax for directly extracting substrings by character position. ${varname:start:length} Example: ex17 display, text
31 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
(21)
The args are read as input to the shell and the resulting command executed. Allows "double" expansion of some constructs. For example, constructing a variable name out of pieces, and then obtaining the value of that variable.
netdev=NETDEV_ NETDEV_1=hme0 # As part of an initialization step defining multiple devices
devnum=1 # As part of a loop over those devices ifname=$netdev$devnum # construct a variable name NETDEV_1 eval device=\$$ifname # evaluate it - device is set to hme0 exec command args
The command is executed in place of the current shell. There is no return from an exec. I/O redirection may be used. This is also used to change the I/O for the current shell.
:
The line is variable-expanded, but otherwise treated as a comment. Sometimes used as a synonym for "true" in a loop.
while :; do # this loop will go forever until broken by # a conditional test inside, or a signal done unset var ...
Remove the named variables. This is not the same as setting their values to null.
typeset [+/- options] [ name[=value] ] ...
(ksh only, bash uses declare for similar functions) Set attributes and values for shell variables and functions. When used inside a function, a local variable is created. Some of the options are:
-L[n]
Left justify and remove leading blanks. The variable always has length n if specified.
-R[n]
Right justify and fill with leading blanks. The variable always has length n if specified.
-l
The named variable is always treated as an integer. This makes arithmetic faster. The reserved word integer is an alias for typeset -i.
-Z[n]
32 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
33 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
(22)
Numeric variables
$(( integer expression ))
The $(( ... )) construction interprets the contents as an arithmetic expression (integer only). Variables are referenced by name without the "$". Most of the arithmetic syntax of the 'C' language is supported, including bit manipulations (*,/,+,-,|,&,<<,>>. Use parentheses for changing precedence). Examples
datapath=/data/public/project/trials/set1/datafile.dat filename=${datapath##*/} filename is
set to "datafile.dat" since the longest prefix pattern matching "*/" is the leading directory path (compare basename)
path=${datapath%/*}
is set to "/data/public/project/trials/set1" since the shortest suffix pattern matching "/*" is the filename in the last directory (compare dirname)
path i=$((i+1))
34 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
(23)
Shell Functions
All but the earliest versions of sh allow you define shell functions, which are visible only to the shell script and can be used like any other command. Shell functions take precedence over external commands if the same name is used. Functions execute in the same process as the caller, and must be defined before use (appear earlier in the file). They allow a script to be broken into maintainable chunks, and encourage code reuse between scripts.
Defining functions
identifier() { list; }
POSIX syntax for shell functions. Such functions do not restrict scope of variables or signal traps. The identifier follows the rules for variable names, but uses a separate namespace.
function identifier { list; }
Ksh and bash optional syntax for defining a function. These functions may define local variables and local signal traps and so can more easily avoid side effects and be reused by multiple scripts. A function may read or modify any shell variable that exists in the calling script. Such variables are global. (ksh and bash only) Functions may also declare local variables in the function using typeset or declare. Local variables are visible to the current function and any functions called by it.
return [n], exit [n]
Return from a function with the given value, or exit the whole script with the given value. Without a return, the function returns when it reaches the end, and the value is the exit status of the last command it ran. Example:
die() { # Print an error message and exit with given status # call as: die status "message" ["message" ...] exitstat=$1; shift for i in "$@"; do print -R "$i" done exit $exitstat }
Calling functions.
Functions are called like any other command. The output may be redirected independantly of the script, and arguments passed to the function. Shell option flags like -x are unset in a function - you must explicitly set them in each function to trace the execution. Shell functions may even be backgrounded and run asynchronously, or run as coprocesses (ksh).
35 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
Example:
[ -w $filename ] || \ die 1 "$file not writeable" "check permissions"
Example:
vprint() { # Print or not depending on global "$verbosity" # Change the verbosity with a single variable. # Arg. 1 is the level for this message. level=$1; shift if [[ $level -le $verbosity ]]; then print -R $* fi } verbosity=2 vprint 1 This message will appear vprint 3 This only appears if verbosity is 3 or higher
Reuseable functions
By using only command line arguments, not global variables, and taking care to minimise the side effects of functions, they can be made reusable by multiple scripts. Typically they would be placed in a separate file and read with the "." operator. Functions may generate output to stdout, stderr, or any other file or filehandle. Messages to stdout may be captured by command substitution (`myfunction`, which provides another way for a function to return information to the calling script. Beware of side-effects (and reducing reusability) in functions which perform I/O.
36 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
(24)
Advanced I/O
Unix I/O is performed by assigning file descriptors to files or devices, and then using those descriptors for reading and writing. Descriptors 0, 1, and 2 are always used for stdin, stdout and stderr respectively. Stdin defaults to the keyboard, while stdout and stderr both default to the current terminal window.
with no command, the exec just reassigns the I/O of the current shell.
exec n>outfile
The form n<, n> opens file descriptor n instead of the default stdin/stdout. This can then be used with read -u or print -u.
file descriptor n is set to whatever file descriptor 1 is currently pointing to. Example Sending messages to stderr (2) instead of stdout (1)
echo "Error: program failed" >&2
Echo always writes to stdout, but stdout can be temporarily reassigned to duplicate stderr (or other file descriptors). Conventionally Unix programs send error messages to stderr to keep them separated from stdout.
37 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
read a line from file descriptor n, parsing by $IFS, and placing the words into the named variables. Any left over words all go into the last variable. -p read from the pipe to a coprocess (opened by |&)
standard output is explicitly closed For example, to indicate to another program downstream in a pipeline that no more data will be coming. All file descriptors are closed when a script exits. I/O redirection operators are evaluated left-to-right. This makes a difference in a statement like: ">filename 2>&1". (Many books with example scripts get this wrong)
"Here" documents
<< [-]string
redirect input to the temporary file formed by everything up the matching string at the start of a line. Allows for placing file content inline in a script. Example: ex5 display, text
1: 2: 3: 4: 5: 6: 7: 8: 9: 10: 11: 12: 13: 14: 15: 16: 17: 18: #!/bin/sh echo "Example of unquoted here document, with variable and command substitution" cat <<EOF This text will be fed to the "cat" program as standard input. It will also have variable and command substitutions performed. I am logged in as $USER and today is `date` EOF echo echo "Example of quoted here document, with no variable or command substitution" # The terminating string must be at the start of a line. cat <<"EndOfInput" This text will be fed to the "cat" program as standard input. Since the text string marking the end was quoted, it does not get variable and command subsitutions. I am logged in as $USER and today is `date` EndOfInput
38 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
cat << EOP %!PS %%BeginFeature: *Duplex DuplexTumble <</Duplex true /Tumble false>> setpagedevice %%EndFeature EOP cat "$@" ## ) | lpr
39 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
(25)
We duplicate stdout to another file descriptor (3), then run the first command with stderr redirected to stdout and stdout redirected to the saved descriptor (3). The result is piped into other commands as needed. The output of the pipeline is redirected back to stderr, so that stdout and stderr of the script as a whole are what we expect.
1: 2: 3: 4: 5: 6: 7: 8: 9: 10: 11: 12: 13: 14: 15: 16: 17: 18: #!/bin/sh # Example 14 # Take stderr from a command and pass it into a pipe # for further processing. # Uses ex13.sh to generate some output to stderr # stdout of ex13 is processed normally # Save a copy of original stdout exec 3>&1 # stdout from ex13.sh is directed to the original stdout (3) # stderr is passed into the pipe for further processing. # stdout from the pipe is redirected back to stderr ./ex13.sh 2>&1 1>&3 3>&- | sed 's/stderr/STDERR/' 1>&2 # 3 is closed before running the command, just in case it cares # about inheriting open file descriptors.
This script uses nested subshells captured in backtics. Again we first duplicate stdout to another file descriptor (3). The inner subshell runs the first command, then writes the exit status to fd 4. The outer subshell redirects 4 to stdout so that it is captured by the backtics. Standard output from the first command (inner subshell) is passed into the pipeline as normal, but the final output of the pipeline is redirected to 3 so
40 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
that it appears on the original stdout and is not captured by the backtics. If any of the commands really care about inheriting open file descriptors that they don't need then a more correct command line closes the descriptors before running the commands.
1: 2: 3: 4: 5: 6: 7: 8: 9: 10: 11: 12: 13: 14: 15: 16: 17: 18: 19: 20: 21: 22: 23: 24: 25: 26: 27: 28: 29: 30: 31: 32: 33: #!/bin/sh # Example 15 # Uses ex13.sh to generate some output and give us an # exit status to capture. # Get the exit status of ex13 into $ex13stat. # stdout of ex13 is processed normally # Save a copy of stdout exec 3>&1 # Run a subshell, with 4 duplicated to 1 so we get it in stdout. # Capture the output in `` # ex13stat=`( ... ) 4>&1` # Inside the subshell, run another subshell to execute ex13, # and echo the status code to 4 # (./ex13.sh; echo $? >&4) # stdout from the inner subshell is processed normally, but the # subsequent output must be directed to 3 so it goes to the # original stdout and not be captured by the `` ex13stat=`((./ex13.sh; echo $? >&4) | grep 'foo' 1>&3) 4>&1` echo Last command status=$? echo ex13stat=$ex13stat # If any of the commands really care about inheriting open file # descriptors that they don't need then a more correct command line # closes the descriptors before running the commands exec 3>&1 ex13stat=`((./ex13.sh 3>&- 4>&- ; echo $? >&4) | \ grep 'foo' 1>&3 3>&- 4>&- ) 4>&1` echo Last command status=$? echo ex13stat=$ex13stat
41 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
# Close the extra descriptors before running the commands exec 3>&1 ex13stat=`((./ex13.sh 2>&1 1>&3 3>&- 4>&- ; echo $? >&4) | \ sed s/err/ERR/ 1>&2 3>&- 4>&- ) 4>&1` echo Last command status=$? echo ex13stat=$ex13stat
A practical application of this would be running a utility such as dd where the exit status is important to capture, but the error output is overly chatty and may need to be filtered before delivering to other parts of a script.
42 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
(26)
The special variable $! contains the process ID of the last background job that was started. You can save that and examine the process later (ps -p $bgpid) or send it a signal (kill -HUP $bgpid).
ksh coprocesses
Coprocesses are a way of starting a separate process which runs asychronously, but has stdin/stdout connected to the parent script via pipes.
command |&
Write to the pipe connected to the coprocess, instead of standard output Multiple coprocesses can be handled by moving the special file descriptors connected to the pipes onto standard input and output, and or to explicitly specified file descriptors.
exec <&p
The output from the coprocess is moved to standard output Example: ex9 display, text A script wants to save a copy of all output in a file, but also wants a copy to the screen. This is equivalent to always running the script as
script | tee outfile
1: 2: 3: 4: 5: 6: 7: 8: 9: 10: 11: 12: 13: #!/bin/ksh # If we have not redirected standard output, save a copy of # the output of this script into a file, but still send a # copy to the screen. if [[ -t 1 ]] ; then # Only do this if fd 1 (stdout) is still connected # to a terminal # We want the standard output of the "tee" process # to go explicitly to the screen (/dev/tty) # and the second copy goes into a logfile named $0.out
43 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
14: 15: 16: 17: 18: 19: 20: 21: 22: 23: 24:
tee $0.out >/dev/tty |& # Our stdout all goes into this coprocess exec 1>&p fi # Now generate some output print "User activity snapshot on $(hostname) at $(date)" print who
Example: ex10 display, text Start a coprocess to look up usernames in some database. It is faster to run a single process than to run a separate lookup for each user.
1: 2: 3: 4: 5: 6: 7: 8: 9: 10: 11: 12: 13: 14: 15: 16: 17: 18: 19: 20: 21: 22: 23: 24: 25: 26: 27: 28: 29: 30: 31: 32: 33: 34: 35: 36: 37: 38: 39: 40: 41: 42: #!/bin/ksh # This example uses a locally written tool for Dartmouth Name Directory lookups # Start the dndlookup program as a coprocess # Tell it to output only the canonical full name, and to not print multiple matches dndlookup -fname -u |& # move the input/output streams so we # can use other coprocesses too exec 4>&p exec 5<&p echo "Name file contents:" cat namefile echo # read the names from a file "namefile" while read uname; do print -u4 $uname read -u5 dndname case $dndname in *many\ matches*) # handle case where the name wasn't unique print "Multiple matches to \"$uname\" in DND" ;; *no\ match*) # handle case where the name wasn't found print "No matches to \"$uname\" in DND" ;; *) # we seem to have a hit - process the # canonical named retrieved from dndlookup print "Unique DND match: full name for \"$uname\" is \"$dndname\"" ;; esac sleep 2 done < namefile # We've read all the names, but the coprocess # is still running. Close the pipe to tell it # we have finished. exec 4>&-
44 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
(27)
handler is a command to be read (evaluated first) and executed on receipt of the specified sigs. Signals can be specified by name or number (see kill(1)) e.g. HUP, INT, QUIT, TERM. A Ctrl-C at the terminal generates a INT. A handler of - resets the signals to their default values A handler of '' (null) ignores the signals Special signal values are as follows:
EXIT
the handler is called when the function exits, or when the whole script exits. The exit signal has value 0. ERR (ksh) the handler is called when any command has a non-zero exit status DEBUG (ksh) the handler is called after each command. Example: ex8 display, text
1: 2: 3: 4: 5: 6: 7: 8: 9: 10: 11: 12: 13: 14: 15: 16: 17: 18: 19: 20: 21: 22: 23: 24: 25: 26: 27: 28: 29: #!/bin/bash # Try this under bash, ksh and sh trap huphandler HUP trap '' QUIT trap exithandler TERM INT huphandler() { echo 'Received SIGHUP' echo "continuing" } exithandler() { echo 'Received SIGTERM or SIGINT' exit 1 } ## Execution starts here - infinite loop until interrupted # Use ":" or "true" for infinite loop # SECONDS is built-in to bash and ksh. It is number of seconds since script started : is like a comment, but it is evaluated for side effects and evaluates to true seconds=0 while : ; do # while true; do sleep 5 seconds=$((seconds + 5)) echo -n "$SECONDS $seconds - " done
45 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
Exit handlers can be defined to clean up temporary files or reset the state of devices. This can be useful if the script has multiple possible exit points.
46 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
(28)
Always explicitly set $PATH at the start of a script, so that you know exactly which external programs will be used. If possible, don't use temporary files. If they cannot be avoided, use $TMPDIR, and create files safely (e.g. mktemp).
Often scripts will write to a fixed, or trivially generated temporary filename in /tmp. If the file already exists and you don't have permission to overwrite it, the script will fail. If you do have permission to overwrite it, you will delete the previous contents. Since /tmp is public write, another user may create files in it, or possibly fill it completely. Example: 1. A link is created by an unprivileged user in /tmp: /tmp/scratch -> /vmunix 2. A root user runs a script that blindly writes a scratch file to /tmp/scratch, and overwrites the operating system. Environment variable $TMPDIR is often used to indicate a preferred location for temporary files (e.g., a per-user directory). Some systems may use $TMP or $TEMP. Safe scratch files can be made by creating a new directory, owned and writeable only by you, then creating files in there. Example:
(umask 077 && mkdir /tmp/tempdir.$$) || exit 1
or (deluxe version)
tmp=${TMPDIR:-/tmp} tmp=$tmp/tempdir.$RANDOM.$RANDOM.$RANDOM.$$ (umask 077 && mkdir $tmp) || { echo "Could not create temporary directory" 1>&2 exit 1 }
Alternatively, many systems have mktemp to safely create a temporary file and return the filename, which can be used by the script and then deleted.
Check exit status of everything you do. Don't trust user input contents of files data piped from other programs file names. Output of filename generation with wildcards, or directly from ls or find Example: Consider the effects of a file named "myfile;cd /;rm *" if processed, unquoted, by your script.
One possible way to protect against weirdo characters in file names:
# A function to massage a list of filenames # to protect weirdo characters # e.g. find ... | protect_filenames | xargs command
47 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
# # We are backslash-protecting the characters \'" ?*; protect_filenames() { sed -es/\\\\/\\\\\\\\/g \ -es/\\\'/\\\\\'/g \ -es/\\\"/\\\\\"/g \ -es/\\\;/\\\\\;/g \ -es/\\\?/\\\\\?/g \ -es/\\\*/\\\\\*/g \ -es/\\\ /\\\\\ /g }
If using GNU find and xargs, there is a much cleaner option to null-terminate generated pathnames.
48 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
(29)
Style
Shell scripts are very frequently written quickly for a single purpose, used once and discarded. They are also as frequently kept and used many times, and migrate into other uses, but often do not receive the same level of testing and debugging that other software would be given in the same situation. It is possible to apply general principles of good software engineering to shell scripts. Preface scripts with a statement of purpose, author, date and revision notes Use a revision control system for complex scripts with a long lifetime Assume your script will have a long lifetime unless you are certain it won't Document any non-standard external utilities which your script needs Document your scripts with inline comments - you'll need them in a few months when you edit it. Treat standard input and output in the normal way, so that your script can be used in combination with other programs (the Unix toolkit philosophy) Be consistent in the format of your output, so that other programs can rely on it Use options to control behaviour such as verbosity of output. Overly chatty programs are very hard to combine with other utilities Use interactive features (prompting the user for information) very sparingly. Doing so renders the script unuseable in pipeline combinations with other programs, or in unattended operations. Test (a lot)
49 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
(30)
50 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
(31)
list contents of a directory, or list details of files and directories. mkdir; rmdir * Make and Remove directories. rm; cp; mv * Remove (delete), Copy and Move (rename) files and directories touch * Update the last modifed timestamp on a file, to make it appear to have just been written.
If the file does not exist, a new zero-byte file is created, which is often useful to signify that an event has occurred. tee
Make a duplicate copy of a data stream - used in pipelines to send one copy to a log file and a second copy on to another program. (Think plumbing).
* Echo the arguments to standard output -- used for messages from scripts. Some versions of "sh", and all csh/ksh/bash shells internalized "echo".
Conflicts sometimes arise over the syntax for echoing a line with no trailing CR/LF. Some use "\c" and some use option "-n". To avoid these problems, ksh also provides the "print" command for output.
cat
Copy and concatenate files; display contents of a file head, tail * Display the beginning of a file, or the end of it.
cut
Extract selected fields from each line of a file. Often awk is easier to use, even though it is a more complex program.
wc
51 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
* Various utilities to compress/uncompress individual files, combine multiple files into a single archive, or do both.
* Remove duplicate lines, and generate a count of repeated lines. Count lines, words and characters in a file.
* Display the current date and time (flexible format). Useful for conditional execution based on time, and for timestamping output.
ps
List the to a running processes. kill * Send a signal (interrupt) to a running process.
id
Print the user name and UID and group of the current user (e.g. to distinguish priviledged users before attempting to run programs which may fail with permission errors)
who
Display who is logged on the system, and from where they logged in. uname * Display information about the system, OS version, hardware architecture etc. mail * Send mail, from a file or standard input, to named recipients. Since scripts are often used to automate long-running background jobs, sending notification of completion by mail is a common trick.
logger
Place a message in the central system logging facility. Scripts can submit messages with all the facilities available to compiled programs.
hostname
Display the hostname of the current host - usful to keep track of where your programs are running
Conditional tests
52 de 56 13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
test; [
* The conditional test, used extensively in scripts, is also an external program which evaluates the expression given as an argument and returns true (0) or false (1) exit status. The name "[" is a link to the "test" program, so a line like:
if [ -w logfile ]
actually runs a program "[", with arguments "-w logfile ]", and returns a true/false value to the "if" command.
In ksh and most newer versions of sh, "[" is replaced with a compatible internal command, but the argument parsing is performed as if it were an external command. Ksh also provides the internal "[[" operator, with simplified syntax.
Stream Editing
awk
* A pattern matching and data manipulation utility, which has its own scripting language. It also duplicates much functionality from 'sed','grep','cut','wc', etc.
Complex scripts can be written entirely using awk, but it is frequently used just to extract fields from lines of a file (similar to 'cut').
sed
* Stream Editor. A flexible editor which operates by applying editing rules to every line in a data stream in turn.
Since it makes a single pass through the file, keeping only a few lines in memory at once, it can be used with infinitely large data sets. It is mostly used for global search and replace operations. It is a superset of 'tr', 'grep', and 'cut', but is more complicated to use.
tr
* Compare two files and list the differences between them. basename pathname Returns the base filename portion of the named pathname, stripping off all the directories dirname pathname Returns the directory portion of the named pathname, stripping off the filename
diff
53 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
* The "expr" command takes an numeric or text pattern expression as an argument, evaluates it, and returns a result to stdout. The original Bourne shell had no built-in arithmetic operators. E.g.
expr 2 + 1 expr 2 '*' '(' 21 + 3 ')'
Used with text strings, "expr" can match regular expressions and extract sub expressions. Similar functionality can be achived with sed. e.g.
expr SP99302L.Z00 : '[A-Z0-9]\{4\}\([0-9]\{3\}\)L\.*' dc
Desk Calculator - an RPN calculator, using arbitrary precision arithmetic and user-specified bases. Useful for more complex arithmetic expressions than can be performed internally or using expr
bc
A preprocessor for dc which provides infix notation and a C-like syntax for expressions and functions.
Merging files
paste
Perform a join (in the relational database sense) of lines in two sorted input files.
54 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
(32)
Books
The New KornShell Command And Programming Language, by Morris I. Bolsky, David G. Korn (Contributor). More info Learning the Korn Shell, 2nd Edn. by Bill Rosenblatt and Arnold Robbins. More info Korn Shell Programming by Example, by Dennis O'Brien, David Pitts (Contributor). More info The Korn Shell Linux and Unix Programming Manual (2nd Edn) by Anatole Olczak. More info Portable Shell Programming: An Extensive Collection of Bourne Shell Examples by Bruce Blinn. More
info Examples from this book can be downloaded for study.
Linux Shell Scripting with Bash by Ken O. Burtch. More info Unix Shell Programming by Stephen Kochan and Patrick Wood (third Edition). More info Teach yourself Shell Programming in 24 Hours, by S. Veeraraghavan. SAMS 2nd Edn. (2002) More info Mastering Unix Shell Scripting by Randal K. Michael, Wiley (2003) More info Light on basics, but develops
scripting through examples. Ksh only. Examples can be downloaded from the Wiley site (www.wiley.com/legacy/compbooks /michael/).
Wicked Cool Shell Scripts by Dave Taylor, No Starch Press (2004) More info Develops scripting entirely
through examples, drawn from Linux and OSX in addition to traditional Unix. Recommended, but not for beginners. Examples can be downloaded from the Intuitive site (www.intuitive.com/wicked/wicked-cool-shell-script-library.shtml).
Unix Power Tools, by S. Powers, J. Peek, T. O'Reilly, M. Loudikes et al. More info
Online Resources
Shelldorado (http://www.shelldorado.com)
Lots of links to scripting resources
Kornshell (http://www.kornshell.com)
The official Korn shell home page, with download links.
55 de 56
13-09-2011 11:17
http://www.dartmouth.edu/~rc/classes/ksh/print_pages.shtml
Cygwin (http://www.cygwin.com/)
A free Linux-like environment for Windows. Provides bash, command line utilities and DLLs. Developed by RedHat. An X server is also available.
56 de 56
13-09-2011 11:17