6. Algorithms for Sorting and Searching
R K Shyamasundar
Sorting
An array data structure represents a sequence of elements, each of which can be accessed by the designated indices.

The idea is to scan the array repeatedly: whenever two adjacent elements are found to be in the wrong order (say the ith element is larger than the (i+1)th element) they are exchanged (swapped). Repeat the process till no exchanges are possible; at this point, the numbers are in the ascending order of their magnitude!

In the sequel, we use the for-loop control construct for ease of reading. The construct
for i := 1 to N do stat-body endfor

is interpreted as follows:

    i := 1;
    while i ≤ N do
        stat-body; i := i + 1;
    endwhile
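As an illustration, the following Python sketch (our own, not part of the original program text) carries out the exchange process just described; it works equally well on numbers or strings.

    def bubble_sort(a):
        # Repeatedly scan the list; swap adjacent elements that are out of order.
        # Stop when a full scan produces no exchange: the list is then sorted.
        n = len(a)
        exchanged = True
        while exchanged:
            exchanged = False
            for j in range(n - 1):        # j runs over adjacent pairs (a[j], a[j+1])
                if a[j] > a[j + 1]:       # wrong order: the larger value moves towards the end
                    a[j], a[j + 1] = a[j + 1], a[j]
                    exchanged = True
        return a

For example, bubble_sort([5, 1, 4, 2]) returns [1, 2, 4, 5]; with each pass the largest remaining element sinks to its final position at the end.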
The maximum number sinks to the bottom while the smaller one bubbles up. Hence the name 'bubble sort.'

Note

The actual code used for swapping the two values A[j] and A[j+1] can be replaced by

    k := j+1;   (* k is a new local variable *)
    call swap(j, k)

Further, the reader should note that in the replaced code (i) a new local variable k has been introduced, and (ii) indices have been used as formal parameters for the swap procedure rather than the array variables, to avoid the intricacies in understanding the substitution of formal parameters with actual parameters.
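A minimal Python sketch of a swap procedure that takes indices rather than array elements as parameters (the names swap and A are ours; in the article the array is assumed to be accessible to the procedure, whereas here it is passed explicitly):

    def swap(A, i, j):
        # Exchange the contents of A[i] and A[j] in place.
        temp = A[i]
        A[i] = A[j]
        A[j] = temp

Inside the sorting loop, the explicit exchange of A[j] and A[j+1] would then be written as k = j + 1 followed by swap(A, j, k), mirroring the replaced code above.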
Time Complexity
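In the worst case, the bubble sort described above makes of the order of n passes over an array of n elements, each pass performing up to n - 1 comparisons; the number of comparisons therefore grows as n^2, i.e., the worst-case time complexity is of the order n^2.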
Search
Sequential Search
The search program will accept as its input the number key and will yield the index j where key appears. In other words, the output will be the index of the array if the given element key appears in the array and will be 0 otherwise. The simplest algorithm is to compare the elements of the array with the given key starting from the last element till either a match is found or all the elements are exhausted. The reason for starting from the last element lies in the problem description: the output for failure to find the element is 0 (which we obtain naturally if we start from the last element). Using the notations given in the earlier articles, the program is given in Table 3. In the program, we have used the boolean variable continue for checking the need to do further search in the given array.
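The following Python sketch follows the same idea (it is not the program of Table 3 itself): the scan starts at the last element, a boolean flag plays the role of the variable continue (which is a reserved word in Python), and the value 0 signals failure. Indices reported are 1-based, as in the article.

    def sequential_search(A, key):
        # Scan A from the last element towards the first.
        # Return the 1-based index at which key occurs, or 0 if it does not occur.
        i = len(A)               # 1-based index of the last element
        searching = True         # plays the role of the boolean variable 'continue'
        while searching and i > 0:
            if A[i - 1] == key:  # A[i - 1] is the i-th element (Python lists start at 0)
                searching = False
            else:
                i = i - 1        # move one position towards the front
        return i                 # i = 0 exactly when key is not present

The same function works unchanged on a list of strings, in line with the remark below about searching a database or bibliography.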
In the program shown, the reader should take care to see that the contents of the array A are never examined when the index i lies outside the lowest and highest indices of the given array. It may be noted that the same program can in fact be used over an array of strings instead of an array of integers; in that case, the program would correspond to finding an entry in a database or bibliography.
Sequential search is not an efficient method of search unless the number of entries to be searched is small.

It must be evident to the reader that the algorithm takes at worst n steps, and this situation occurs when the search fails (i.e., there is no entry corresponding to the given key). The sequential search is not an efficient method of search unless the number of entries to be searched is small. If we assume that the given array is already sorted in ascending order, then one can arrive at an efficient method of search, referred to as binary search, which is discussed next.
Binary Search
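Binary search compares the key with the middle element of the sorted array; depending on the outcome, the search continues in the lower or the upper half, so the range to be examined is halved with each comparison. A minimal Python sketch, under the same conventions as the sequential search above (1-based answer, 0 on failure; the function name is ours):

    def binary_search(A, key):
        # A must be sorted in ascending order.
        # Return a 1-based index at which key occurs, or 0 if it does not occur.
        low, high = 1, len(A)            # current search range, in 1-based indices
        while low <= high:
            mid = (low + high) // 2      # middle of the current range
            if A[mid - 1] == key:
                return mid               # found: report the position
            elif A[mid - 1] < key:
                low = mid + 1            # key, if present, lies in the upper half
            else:
                high = mid - 1           # key, if present, lies in the lower half
        return 0                         # range exhausted: key is not present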
Time Complexity
Divide-and-conquer often dramatically enhances the performance of algorithms.

Each comparison of the binary search halves the range that remains to be examined, so the number of comparisons r needed before the search terminates grows only logarithmically in n; that is, the worst-case time complexity is of the order log(n) (since r is roughly log2(n)). For instance, if we search an array (or a table) having 2^10 entries, the comparison is done about 10 times; this is indeed remarkable as compared to the number of comparisons for the previous algorithm. Of course, we should remember that the array has to be ordered (the sorting needs to be done only once, for that matter).
Divide-and-Conquer
Note: For the sake of simplicity we have not given the actual assignments for the indices in the step of splitting the array. Since we have assumed the number of elements in the array to be a power of two, we have U - L + 1 = 2^m for some m. Taking L = 1 (so that U = 2^m), the values of the indices of the two arrays into which A is split are

    L1 := L;              U1 := 2^(m-1);
    L2 := 2^(m-1) + 1;    U2 := 2^m.
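For example, with L = 1 and U = 8 (so that m = 3), the array A[1..8] is split into the two halves A[1..4] and A[5..8].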
    T(n) = 1                  if n = 2,
    T(n) = 2 T(n/2) + 2       if n > 2.
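Unwinding this recurrence for n a power of two gives T(n) = 3n/2 - 2; for instance, T(4) = 2 T(2) + 2 = 4 and T(8) = 2 T(4) + 2 = 10.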
(Figure: a decision tree for ordering three distinct elements; each leaf corresponds to one possible ordering, such as B > C > A.)
A binary tree of height h has at most 2^h leaves.

Any method that sorts by comparing elements is essentially a decision tree where comparisons are done at each node. If the answer to the question asked at the node is 'yes' then it branches to the left; otherwise, it branches to the right. The branching terminates if there are no more questions to ask; in this case, termination amounts to arriving at the order among the elements. It can be seen that the number of possible orderings among three distinct elements is 6 (which is nothing but 3!). Assuming the test at each node is binary ('yes' or 'no' answers), we can derive the following results about such decision trees.
A decision tree that sorts n elements must have at least n! leaves, one for each possible ordering, while a binary tree of height h has at most 2^h leaves; hence the height of the tree, and therefore the worst-case number of comparisons, is at least log2(n!), which is of the order n log(n). This fixes the lower bound. Now, if we can obtain an algorithm with the lower bound as its complexity, then it must be clear that the algorithm will be optimal within some constant of proportionality.
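For example, a decision tree for ordering three distinct elements must have at least 3! = 6 leaves, so its height must be at least 3 (a tree of height 2 has at most 2^2 = 4 leaves); three comparisons are therefore necessary in the worst case, and three comparisons indeed suffice.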
Code Tuning
Some programmers worry too much about efficiency and try to optimize even little things, and thus create a clever program that is hard to understand and maintain. On the other hand, some programmers pay too little attention to efficiency and performance and create a beautifully structured program that is too inefficient and hence useless. Good programmers keep a proper perspective on efficiency; it is just one of the many important concerns that software design encounters. Code tuning locates the expensive parts of an existing program and makes little changes to improve its performance. Even though the approach is not glamorous and is not always the right approach, quite often it makes a big difference in program performance. More about these aspects can be found in Jon Bentley's book Programming Pearls (see Suggested Reading).
Quicksort

The key step of quicksort lies in partitioning the array, A, into two parts, say A1 and A2, with respect to an element of the array, say k, such that the elements in A1 are less than k and the elements in A2 are larger than k. Now, the sorting of the original array is obtained by concatenating the sorted A1 to the left of element k and the sorted A2 to the right of element k. The procedure is applied recursively. The outline of the algorithm for a given array of distinct elements is given in Table 8.

Table 8. Outline for quicksort.

    procedure QUICKSORT(A, L, U);   (* A is the array to be sorted;
                                       L, U are the lowest and highest indices *)
    if U ≤ L then A is already sorted   (* the segment has at most one element *)
    else
        Select an item k in the array and let p be its final position;
        Let A1 and A2 be the subarrays got from the elements of A
            that are less than and greater than k respectively;
        QUICKSORT(A, L, p-1);   (* defines A1 *)
        QUICKSORT(A, p+1, U);   (* defines A2 *)
    endif
    endprocedure
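To make the step 'select an item k in the array and let p be its final position' concrete, here is a Python sketch of one possible realisation; choosing the last element of the segment as the item k and the in-place partitioning scheme are our own choices for illustration, since the outline above leaves them open.

    def quicksort(A, L, U):
        # Sort A[L..U] in place; indices are 1-based and the elements are assumed distinct.
        if U <= L:
            return                                  # at most one element: already sorted
        k = A[U - 1]                                # chosen item (here: the last element)
        p = L                                       # p will become k's final 1-based position
        for i in range(L, U):                       # i runs over the 1-based indices L .. U-1
            if A[i - 1] < k:                        # elements smaller than k go to the left part
                A[i - 1], A[p - 1] = A[p - 1], A[i - 1]
                p = p + 1
        A[U - 1], A[p - 1] = A[p - 1], A[U - 1]     # place k at its final position p
        quicksort(A, L, p - 1)                      # sort the elements less than k (A1)
        quicksort(A, p + 1, U)                      # sort the elements greater than k (A2)

Calling quicksort(A, 1, len(A)) sorts the whole list A.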
Suggested Reading

D E Knuth, Fundamental Algorithms, Addison-Wesley, Reading, Mass., 1968.
N Wirth, Systematic Programming: An Introduction, Prentice-Hall, Englewood Cliffs, New Jersey, 1972.
D E Knuth, Sorting and Searching, Addison-Wesley, Reading, Mass., 1973.
N Wirth, Algorithms + Data Structures = Programs, Prentice-Hall, Englewood Cliffs, New Jersey, 1976.
R G Dromey, How to Solve it by Computer, Prentice Hall International, 1982.
J Bentley, Programming Pearls, Addison-Wesley, Reading, Mass., 1986.