Iterative Methods
PAB3053
RESERVOIR MODELING AND SIMULATION
MAY 2017
Class schedule
Time: 2 hours per week in weeks 7-11; 6 hours across weeks 12-14 (activity descriptions not recovered).
Iterative Methods
When the number of equations is very large, the coefficient matrix is sparse but not banded, and computer storage is critical, an iterative method is preferred to a direct method of solution.
If the iterative process is convergent, the solution is obtained to a specified accuracy of the exact answer in a finite but not predeterminable number of operations. The method is certain to converge for a system having diagonal dominance.
Iterative methods have rather simple algorithms (easy to apply and not restricted to simple geometries and boundary conditions). They are preferred when the number of operations in the calculations is so large that direct methods may prove inadequate because of the accumulation of round-off errors.
Typical iterative methods
1. Jacobi method
2. Gauss-Seidel method
3. Successive over-relaxation (SOR)
4. Multigrid methods
Concept of iteration
Ax = b
In an iterative solver, A is split as follows:
A = C - R
where:
C = the approximate coefficient matrix
R = the residual matrix, representing the error in C
The iterative method is then defined by
Cx = Rx + b, or
x^(n+1) = x^(n) + C^(-1) r^(n), with residual r^(n) = b - A x^(n)
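As a concrete sketch of this splitting iteration (the function names are ours; the test system is the diagonally dominant example solved later in these notes), Jacobi corresponds to choosing C = diag(A):

```python
import numpy as np

def split_iterate(A, b, apply_C_inv, x0, tol=1e-10, max_iter=500):
    """Generic splitting iteration: x^(n+1) = x^(n) + C^(-1) r^(n),
    with residual r^(n) = b - A x^(n)."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iter):
        r = b - A @ x                 # residual of the current iterate
        if np.linalg.norm(r) < tol:   # stop once the residual is small
            break
        x = x + apply_C_inv(r)        # correction from the easy-to-invert C
    return x

# Jacobi is the special case C = diag(A), so C^(-1) r = r / a_ii.
A = np.array([[10., -1.,  2.,  0.],
              [-1., 11., -1.,  3.],
              [ 2., -1., 10., -1.],
              [ 0.,  3., -1.,  8.]])
b = np.array([6., 25., -11., 15.])
x = split_iterate(A, b, lambda r: r / np.diag(A), np.zeros(4))
```

Any splitting with an easily inverted C fits this template; the choice of C controls how fast the residual shrinks.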
Iterative Methods
a11 x1 + a12 x2 + a13 x3 + a14 x4 = b1
a21 x1 + a22 x2 + a23 x3 + a24 x4 = b2
a31 x1 + a32 x2 + a33 x3 + a34 x4 = b3
a41 x1 + a42 x2 + a43 x3 + a44 x4 = b4
The system Ax = b can be converted into
x = Cx + d, with Cii = 0
where x and d are column vectors and C is a square matrix.
Stopping criteria:
eps_a,i = |xi^(j) - xi^(j-1)| / |xi^(j)| x 100% < eps_s for all xi
or the 2-norm of the residual vector: ||Ax - b||_2 < eps_s
Jacobi method
The Jacobi method is considered one of the basic iterative methods.
An iterative technique to solve Ax = b starts with an initial approximation x^(0) and generates a sequence {x^(k)}, k = 0, 1, 2, ...
First we convert the system Ax = b into an equivalent form
x = Tx + c
and generate the sequence of approximations by
x^(k) = T x^(k-1) + c,  k = 1, 2, 3, ...
Stopping criterion: ||x^(k) - x^(k-1)|| / ||x^(k)|| < eps
Jacobi method: Example
Consider the following set of equations:
10 x1 - x2 + 2 x3 = 6
-x1 + 11 x2 - x3 + 3 x4 = 25
2 x1 - x2 + 10 x3 - x4 = -11
3 x2 - x3 + 8 x4 = 15
Convert the set Ax = b into the form x = Tx + c:
x1 = (1/10) x2 - (1/5) x3 + 3/5
x2 = (1/11) x1 + (1/11) x3 - (3/11) x4 + 25/11
x3 = -(1/5) x1 + (1/10) x2 + (1/10) x4 - 11/10
x4 = -(3/8) x2 + (1/8) x3 + 15/8
Jacobi method: Example
First iteration:
x1^(1) = (1/10) x2^(0) - (1/5) x3^(0) + 3/5
x2^(1) = (1/11) x1^(0) + (1/11) x3^(0) - (3/11) x4^(0) + 25/11
x3^(1) = -(1/5) x1^(0) + (1/10) x2^(0) + (1/10) x4^(0) - 11/10
x4^(1) = -(3/8) x2^(0) + (1/8) x3^(0) + 15/8
Starting from x1^(0) = 0, x2^(0) = 0, x3^(0) = 0 and x4^(0) = 0:
x1^(1) = 3/5 = 0.6000
x2^(1) = 25/11 = 2.2727
x3^(1) = -11/10 = -1.1000
x4^(1) = 15/8 = 1.8750
Jacobi method: Example
Second iteration:
x1^(2) = (1/10) x2^(1) - (1/5) x3^(1) + 3/5
x2^(2) = (1/11) x1^(1) + (1/11) x3^(1) - (3/11) x4^(1) + 25/11
x3^(2) = -(1/5) x1^(1) + (1/10) x2^(1) + (1/10) x4^(1) - 11/10
x4^(2) = -(3/8) x2^(1) + (1/8) x3^(1) + 15/8
In general, at iteration k:
x1^(k) = (1/10) x2^(k-1) - (1/5) x3^(k-1) + 3/5
x2^(k) = (1/11) x1^(k-1) + (1/11) x3^(k-1) - (3/11) x4^(k-1) + 25/11
x3^(k) = -(1/5) x1^(k-1) + (1/10) x2^(k-1) + (1/10) x4^(k-1) - 11/10
x4^(k) = -(3/8) x2^(k-1) + (1/8) x3^(k-1) + 15/8
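The Jacobi sweep above can be sketched in NumPy (a minimal illustration of the method, not code from the course materials):

```python
import numpy as np

# Coefficient matrix and right-hand side of the example system.
A = np.array([[10., -1.,  2.,  0.],
              [-1., 11., -1.,  3.],
              [ 2., -1., 10., -1.],
              [ 0.,  3., -1.,  8.]])
b = np.array([6., 25., -11., 15.])

def jacobi(A, b, x0, tol=1e-10, max_iter=200):
    """Jacobi iteration x^(k) = D^(-1) (b - R x^(k-1)), where A = D + R."""
    D = np.diag(A)                 # diagonal entries a_ii
    R = A - np.diagflat(D)         # off-diagonal part of A
    x = np.asarray(x0, dtype=float)
    for k in range(1, max_iter + 1):
        x_new = (b - R @ x) / D    # every component uses only old values
        if np.linalg.norm(x_new - x) / np.linalg.norm(x_new) < tol:
            return x_new, k
        x = x_new
    return x, max_iter

# First iterate from x^(0) = 0 reduces to b / diag(A), as on the slide.
x1 = b / np.diag(A)
x, k = jacobi(A, b, np.zeros(4))
```

Because the matrix is strictly diagonally dominant, the sweep converges to the exact solution (1, 2, -1, 1).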
Jacobi method: Example
Results: (iteration table not recovered; the iterates converge to x = (1, 2, -1, 1))
Gauss-Seidel method
(1) This is a very simple, efficient point-iterative procedure for solving large, sparse systems of algebraic equations.
(2) The idea of Gauss-Seidel is to compute x^(k) using the most recently calculated values. In our example:
x1^(k) = (1/10) x2^(k-1) - (1/5) x3^(k-1) + 3/5
x2^(k) = (1/11) x1^(k) + (1/11) x3^(k-1) - (3/11) x4^(k-1) + 25/11
x3^(k) = -(1/5) x1^(k) + (1/10) x2^(k) + (1/10) x4^(k-1) - 11/10
x4^(k) = -(3/8) x2^(k) + (1/8) x3^(k) + 15/8
Gauss-Seidel method
Starting iterations with x^(0) = (0, 0, 0, 0), we obtain: (iteration table not recovered)
Gauss-Seidel method, cont.
(2) Consider the following three equations:
a11 P1 + a12 P2 + a13 P3 = d1
a21 P1 + a22 P2 + a23 P3 = d2
a31 P1 + a32 P2 + a33 P3 = d3
where aii != 0 for i = 1 to 3
Gauss-Seidel Method, cont.
(3) These guessed values are used together with the most recently computed values to complete the first round of iterations as
P1^(1) = (1/a11) (d1 - a12 P2^(0) - a13 P3^(0))
P2^(1) = (1/a22) (d2 - a21 P1^(1) - a23 P3^(0))
P3^(1) = (1/a33) (d3 - a31 P1^(1) - a32 P2^(1))
In general, for n equations:
xi^(k) = (1/aii) (bi - sum_{j=1}^{i-1} aij xj^(k) - sum_{j=i+1}^{n} aij xj^(k-1)),  i = 1, 2, ..., n
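A minimal sketch of this general formula (our own illustration, applied to the four-equation example from the Jacobi section):

```python
import numpy as np

def gauss_seidel(A, b, x0, tol=1e-10, max_iter=200):
    """Gauss-Seidel sweep:
    x_i^(k) = (b_i - sum_{j<i} a_ij x_j^(k) - sum_{j>i} a_ij x_j^(k-1)) / a_ii."""
    n = len(b)
    x = np.asarray(x0, dtype=float).copy()
    for k in range(1, max_iter + 1):
        x_old = x.copy()
        for i in range(n):
            s_new = A[i, :i] @ x[:i]          # already-updated unknowns (j < i)
            s_old = A[i, i+1:] @ x_old[i+1:]  # not-yet-updated unknowns (j > i)
            x[i] = (b[i] - s_new - s_old) / A[i, i]
        if np.linalg.norm(x - x_old) / np.linalg.norm(x) < tol:
            return x, k
    return x, max_iter

A = np.array([[10., -1.,  2.,  0.],
              [-1., 11., -1.,  3.],
              [ 2., -1., 10., -1.],
              [ 0.,  3., -1.,  8.]])
b = np.array([6., 25., -11., 15.])
x, k = gauss_seidel(A, b, np.zeros(4))
```

On this system Gauss-Seidel reaches the tolerance in noticeably fewer sweeps than Jacobi, because each component update immediately reuses the newest values.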
Gauss-Seidel Method, cont.
(4) These first approximations are used together with the most recently computed values to complete the second round of iterations as
P1^(2) = (1/a11) (d1 - a12 P2^(1) - a13 P3^(1))
P2^(2) = (1/a22) (d2 - a21 P1^(2) - a23 P3^(1))
P3^(2) = (1/a33) (d3 - a31 P1^(2) - a32 P2^(2))
Gauss-Seidel Method, cont.
We note that in each equation the largest element (in magnitude) is on the diagonal.
Example:
6 P1 + P2 + 3 P3 = 17
P1 - 10 P2 + 4 P3 = -7
P1 + P2 + 3 P3 = 12
These equations are solved for the main-diagonal unknowns as:
P1 = (1/6) (17 - P2 - 3 P3)
P2 = (1/10) (7 + P1 + 4 P3)
P3 = (1/3) (12 - P1 - P2)
The later iterates are:
P2^(2) = (1/10) (7 + P1^(2) + 4 P3^(1)) = 1.955
P3^(2) = (1/3) (12 - P1^(2) - P2^(2)) = 2.950
P1^(3) = (1/6) (17 - P2^(2) - 3 P3^(2)) = 1.032
P2^(3) = (1/10) (7 + P1^(3) + 4 P3^(2)) = 1.999
P3^(3) = (1/3) (12 - P1^(3) - P2^(3)) = 2.989
The values obtained after three iterations are sufficiently close to the exact answer, P1 = 1, P2 = 2, P3 = 3.
Successive Over-Relaxation (SOR)
(1) The Gauss-Seidel method generally does not converge sufficiently fast. Successive over-relaxation is a method that can accelerate the convergence.
(2) The basic idea in this approach is
T1^(n+1) = T1^(n) + (ω/a11) (d1 - a11 T1^(n) - a12 T2^(n) - a13 T3^(n))
T2^(n+1) = T2^(n) + (ω/a22) (d2 - a21 T1^(n+1) - a22 T2^(n) - a23 T3^(n))
T3^(n+1) = T3^(n) + (ω/a33) (d3 - a31 T1^(n+1) - a32 T2^(n+1) - a33 T3^(n))
where ω is the relaxation factor; over-relaxation uses 1 < ω < 2.
Successive Over-Relaxation (SOR), cont.
(5) The above procedure for SOR can be generalized for the case of M equations as
xi^(n+1) = (1 - ω) xi^(n) + (ω/aii) (bi - sum_{j<i} aij xj^(n+1) - sum_{j>i} aij xj^(n)),  i = 1, 2, ..., M
SOR: Example
4 x1 + 3 x2 = 24
3 x1 + 4 x2 - x3 = 30
-x2 + 4 x3 = -24
Exact solution: x = (3, 4, -5)
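A sketch of SOR applied to this example; the relaxation factor ω = 1.25 is our illustrative choice, since the slides do not specify a value:

```python
import numpy as np

def sor(A, b, omega, x0, tol=1e-10, max_iter=500):
    """SOR sweep: blend the Gauss-Seidel update with the old value via omega."""
    n = len(b)
    x = np.asarray(x0, dtype=float).copy()
    for k in range(1, max_iter + 1):
        x_old = x.copy()
        for i in range(n):
            # Gauss-Seidel value for unknown i ...
            gs = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x_old[i+1:]) / A[i, i]
            # ... relaxed toward/past it by the factor omega.
            x[i] = (1.0 - omega) * x_old[i] + omega * gs
        if np.linalg.norm(x - x_old) < tol:
            return x, k
    return x, max_iter

# Example system from the slide; exact solution is (3, 4, -5).
A = np.array([[4., 3., 0.], [3., 4., -1.], [0., -1., 4.]])
b = np.array([24., 30., -24.])
x, k = sor(A, b, 1.25, np.zeros(3))
```

Setting ω = 1 recovers plain Gauss-Seidel; for a symmetric positive definite matrix like this one, SOR converges for any 0 < ω < 2.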
In summary
Jacobi Method
x1^new = (b1 - a12 x2^old - a13 x3^old - a14 x4^old) / a11
x2^new = (b2 - a21 x1^old - a23 x3^old - a24 x4^old) / a22
x3^new = (b3 - a31 x1^old - a32 x2^old - a34 x4^old) / a33
x4^new = (b4 - a41 x1^old - a42 x2^old - a43 x3^old) / a44
In summary
Gauss-Seidel Method
Differs from the Jacobi method by sequential updating: use new xi immediately as they become available.
x1^new = (b1 - a12 x2^old - a13 x3^old - a14 x4^old) / a11
x2^new = (b2 - a21 x1^new - a23 x3^old - a24 x4^old) / a22
x3^new = (b3 - a31 x1^new - a32 x2^new - a34 x4^old) / a33
x4^new = (b4 - a41 x1^new - a42 x2^new - a43 x3^new) / a44
In summary
Gauss-Seidel Method
Use new xi at the jth iteration as soon as they become available:
x1^j = (b1 - a12 x2^(j-1) - a13 x3^(j-1) - a14 x4^(j-1)) / a11
x2^j = (b2 - a21 x1^j - a23 x3^(j-1) - a24 x4^(j-1)) / a22
x3^j = (b3 - a31 x1^j - a32 x2^j - a34 x4^(j-1)) / a33
x4^j = (b4 - a41 x1^j - a42 x2^j - a43 x3^j) / a44
Stopping criterion:
eps_a,i = |xi^j - xi^(j-1)| / |xi^j| x 100% < eps_s for all xi
Diagonally Dominant Matrix
    [ 8  1  2  3 ]
A = [ 1  6  2  5 ]
    [ 1  6 12  3 ]
    [ 3  2  3  9 ]
is not diagonally dominant: in row 2, |6| < |1| + |2| + |5| = 8.
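A quick check of strict row diagonal dominance can be sketched as follows (our own helper, applied to the matrix above):

```python
import numpy as np

def is_diagonally_dominant(A):
    """Strict row diagonal dominance: |a_ii| > sum_{j != i} |a_ij| in every row."""
    A = np.asarray(A, dtype=float)
    diag = np.abs(np.diag(A))
    off = np.abs(A).sum(axis=1) - diag   # off-diagonal row sums
    return bool(np.all(diag > off))

A = np.array([[8, 1, 2, 3], [1, 6, 2, 5], [1, 6, 12, 3], [3, 2, 3, 9]])
ok = is_diagonally_dominant(A)   # False: row 2 has |6| < 1 + 2 + 5 = 8
```

Because dominance depends only on magnitudes, the test is unaffected by the signs of the entries.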
Jacobi and Gauss-Seidel
Example:
5 x1 - 2 x2 + 2 x3 = 10
-2 x1 + 4 x2 - x3 = 7
3 x1 - x2 + 6 x3 = 12
Jacobi:
x1^new = (2/5) x2^old - (2/5) x3^old + 10/5
x2^new = (2/4) x1^old + (1/4) x3^old + 7/4
x3^new = -(3/6) x1^old + (1/6) x2^old + 12/6
Gauss-Seidel (use new values as soon as they are available):
x1^new = (2/5) x2^old - (2/5) x3^old + 10/5
x2^new = (2/4) x1^new + (1/4) x3^old + 7/4
x3^new = -(3/6) x1^new + (1/6) x2^new + 12/6
-5 x1 + 12 x3 = 80          [ -5  0  12 ]
 4 x1 - x2 - x3 = -2        [  4 -1  -1 ]
 6 x1 + 8 x2 = 45           [  6  8   0 ]
Not diagonally dominant!
The order of the equations can be important; rearrange the equations to ensure convergence:
 4 x1 - x2 - x3 = -2        [  4 -1  -1 ]
 6 x1 + 8 x2 = 45           [  6  8   0 ]
-5 x1 + 12 x3 = 80          [ -5  0  12 ]
Gauss-Seidel Iteration
Rearrange:
x1 = (x2 + x3 - 2) / 4
x2 = (45 - 6 x1) / 8
x3 = (80 + 5 x1) / 12
Assume x1 = x2 = x3 = 0.
First iteration:
x1 = (0 + 0 - 2) / 4 = -0.5
x2 = [45 - 6(-0.5)] / 8 = 6.0
x3 = [80 + 5(-0.5)] / 12 = 6.4583
Gauss-Seidel Method
Second iteration:
x1 = (-2 + 6 + 6.4583) / 4 = 2.6146
x2 = (45 - 6(2.6146)) / 8 = 3.6641
x3 = (80 + 5(2.6146)) / 12 = 7.7561
Third iteration:
x1 = (-2 + 3.6641 + 7.7561) / 4 = 2.3550
x2 = (45 - 6(2.3550)) / 8 = 3.8587
x3 = (80 + 5(2.3550)) / 12 = 7.6479
Fourth iteration:
x1 = (-2 + 3.8587 + 7.6479) / 4 = 2.3767
x2 = (45 - 6(2.3767)) / 8 = 3.8425
x3 = (80 + 5(2.3767)) / 12 = 7.6569
5th: x1 = 2.3749, x2 = 3.8439, x3 = 7.6562
6th: x1 = 2.3750, x2 = 3.8437, x3 = 7.6563
7th: x1 = 2.3750, x2 = 3.8438, x3 = 7.6562
Gauss-Seidel Iteration
A = [4 -1 -1; 6 8 0; -5 0 12];
b = [-2 45 80];
x = Seidel(A, b, x0, tol, 100);
i       x1      x2      x3
1.0000 -0.5000  6.0000  6.4583
2.0000  2.6146  3.6641  7.7561
3.0000  2.3550  3.8587  7.6479
4.0000  2.3767  3.8425  7.6569
5.0000  2.3749  3.8439  7.6562
6.0000  2.3750  3.8437  7.6563
7.0000  2.3750  3.8438  7.6562
8.0000  2.3750  3.8437  7.6563
Gauss-Seidel method converged
Relaxation of the Gauss-Seidel update:
xi^new = ω xi^new,GS + (1 - ω) xi^old
Successive Over Relaxation (SOR)
Relaxation method:
G-S method:  x2^new = (b2 - a21 x1^new - a23 x3^old - a24 x4^old) / a22
SOR method:  x2^new = (1 - ω) x2^old + ω x2^new,GS
           = (1 - ω) x2^old + ω (b2 - a21 x1^new - a23 x3^old - a24 x4^old) / a22
In general:
x1^new = (1 - ω) x1^old + ω (b1 - a12 x2^old - a13 x3^old - a14 x4^old) / a11
x2^new = (1 - ω) x2^old + ω (b2 - a21 x1^new - a23 x3^old - a24 x4^old) / a22
x3^new = (1 - ω) x3^old + ω (b3 - a31 x1^new - a32 x2^new - a34 x4^old) / a33
x4^new = (1 - ω) x4^old + ω (b4 - a41 x1^new - a42 x2^new - a43 x3^new) / a44
SOR Iterations
Rearrange:
x1 = (x2 + x3 - 2) / 4
x2 = (45 - 6 x1) / 8
x3 = (80 + 5 x1) / 12
xi = ω xi^GS + (1 - ω) xi^old ;  G-S: Gauss-Seidel
Optimized
(slide content not recovered)
Current Methods in Use
The large amount of computational work required by the solver limits the size of the problem that can be treated.