Continuum Mechanics - Tensors

The document provides an introduction to tensors and their properties. It explains that tensors are linear mappings that take vectors as inputs and produce other vectors. Tensors can be represented by their components in a basis as matrices. While matrices can represent tensors, not all matrices represent tensors, since the components must transform consistently under a change of basis. Tensors can be constructed using dyadic products of vectors. Basic tensor operations like addition are performed by adding corresponding components.


Continuum Mechanics - Tensors

http://www.brown.edu/Departments/Engineering/Courses/En221/No...


A Brief Introduction to Tensors and their properties

1. BASIC PROPERTIES OF TENSORS

1.1 Examples of Tensors


The gradient of a vector field is a good example of a second-order tensor. Visualize a vector field: at every point in space,
the field has a vector value u(x1, x2, x3). Let G = ∇u represent the gradient of u. By definition, G enables you to
calculate the change in u when you move from a point x in space to a nearby point at x + dx:
du = G dx

G is a second order tensor. From this example, we see that when you multiply a vector by a tensor, the result is another
vector.
This is a general property of all second order tensors. A tensor is a linear mapping of a vector onto another vector.
Two examples, together with the vectors they operate on, are:
The stress tensor σ, which satisfies

t = σ · n

where n is a unit vector normal to a surface and t is the traction vector acting on the surface.
The deformation gradient tensor

dw = F dx

where dx is an infinitesimal line element in an undeformed solid, and dw is the vector representing the deformed
line element.
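The first example above, du = G · dx, can be checked numerically. The sketch below (not part of the original notes; the sample field u and point x are made up for illustration) compares a finite difference of a simple vector field against its hand-computed gradient acting on dx:

```python
# Hypothetical example field u(x) = (x1*x2, x2*x3, x1*x3)
def u(x):
    return [x[0] * x[1], x[1] * x[2], x[0] * x[2]]

def grad_u(x):
    # G_ij = du_i/dx_j, computed analytically for the field above
    return [[x[1], x[0], 0.0],
            [0.0, x[2], x[1]],
            [x[2], 0.0, x[0]]]

x = [1.0, 2.0, 3.0]
dx = [1e-6, -2e-6, 5e-7]

G = grad_u(x)
u0 = u(x)
u1 = u([x[i] + dx[i] for i in range(3)])
du_fd = [u1[i] - u0[i] for i in range(3)]                           # finite difference of u
du_G = [sum(G[i][j] * dx[j] for j in range(3)) for i in range(3)]   # G dx

# the two agree to first order in dx
assert all(abs(du_fd[i] - du_G[i]) < 1e-10 for i in range(3))
```

The residual is of second order in |dx|, which is why the tolerance can be taken so tight for an infinitesimal step.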

1.2 Matrix representation of a tensor


To evaluate and manipulate tensors, we express them as components in a basis, just as for vectors. We can use the
displacement gradient to illustrate how this is done. Let u(x1, x2, x3) be a vector field, and let G = ∇u represent the
gradient of u. Recall the definition of G:
du = G dx

Now, let {e1 , e2 , e3 } be a Cartesian basis, and express both du and dx as components. Then, calculate the components of
du in terms of dx using the usual rules of calculus

1 of 12

1/4/17, 12:46 PM


du1 = (∂u1/∂x1) dx1 + (∂u1/∂x2) dx2 + (∂u1/∂x3) dx3
du2 = (∂u2/∂x1) dx1 + (∂u2/∂x2) dx2 + (∂u2/∂x3) dx3
du3 = (∂u3/∂x1) dx1 + (∂u3/∂x2) dx2 + (∂u3/∂x3) dx3

We could represent this as a matrix product

[du1]   [∂u1/∂x1  ∂u1/∂x2  ∂u1/∂x3] [dx1]
[du2] = [∂u2/∂x1  ∂u2/∂x2  ∂u2/∂x3] [dx2]
[du3]   [∂u3/∂x1  ∂u3/∂x2  ∂u3/∂x3] [dx3]

Alternatively, using index notation

dui = (∂ui/∂xj) dxj

From this example we see that G can be represented as a 3 × 3 matrix. The elements of the matrix are known as the
components of G in the basis {e1, e2, e3}. All second order tensors can be represented in this form. For example, a
general second order tensor S could be written as

      [S11 S12 S13]
S  ≡  [S21 S22 S23]
      [S31 S32 S33]

You have probably already seen the matrix representation of stress and strain components in introductory courses.
Since S can be represented as a matrix, all operations that can be performed on a 3 × 3 matrix can also be performed on
S. Examples include sums and products, the transpose, inverse, and determinant. One can also compute eigenvalues and
eigenvectors for tensors, and thus define the log of a tensor, the square root of a tensor, etc. These tensor operations are
summarized below.
Note that the numbers S11 , S12 , S33 depend on the basis {e1 , e2 , e3 }, just as the components of a vector depend on the
basis used to represent the vector. However, just as the magnitude and direction of a vector are independent of the basis,
so the properties of a tensor are independent of the basis. That is to say, if S is a tensor and u is a vector, then the vector

v=Su

has the same magnitude and direction, irrespective of the basis used to represent u, v, and S.

1.3 The difference between a matrix and a tensor


If a tensor is a matrix, why is a matrix not the same thing as a tensor? Well, although you can multiply the three
components of a vector u by any 3 × 3 matrix,

[b1]   [a11 a12 a13] [u1]
[b2] = [a21 a22 a23] [u2]
[b3]   [a31 a32 a33] [u3]

the resulting three numbers (b1, b2, b3) may or may not represent the components of a vector. If they are the components
of a vector, then the matrix represents the components of a tensor A; if not, then the matrix is just an ordinary old matrix.
To check whether (b1, b2, b3) are the components of a vector, you need to check how (b1, b2, b3) change due to a change
of basis. That is to say, choose a new basis, calculate the new components of u in this basis, and calculate the new matrix
in this basis (the new elements of the matrix will depend on how the matrix was defined; the elements may or may not
change, and if they don't, then the matrix cannot represent the components of a tensor). Then, evaluate the matrix product to find a
new left hand side, say (b̄1, b̄2, b̄3). If (b̄1, b̄2, b̄3) are related to (b1, b2, b3) by the same transformation that was used to
calculate the new components of u, then (b1, b2, b3) are the components of a vector, and, therefore, the matrix represents

the components of a tensor.


1.4 Formal definition
Tensors are rather more general objects than the preceding discussion suggests. There are various ways to define a tensor
formally. One way is the following:
A tensor is a linear, vector-valued function defined on the set of all vectors.
More specifically, let S(v) denote a tensor operating on a vector. Linearity then requires that, for all vectors v, w and
scalars α,
S(v + w) = S(v) + S(w)
S(αv) = αS(v)
Alternatively, one can define tensors as sets of numbers that transform in a particular way under a change of coordinate
system. In this case we suppose that n-dimensional space can be parameterized by a set of n real numbers xi. We could
change coordinate system by introducing a second set of real numbers x̄i(xk), which are invertible functions of the xi.
Tensors can then be defined as sets of real numbers that transform in a particular way under this change in coordinate
system. For example

A tensor of zeroth rank is a scalar that is independent of the coordinate system.

A covariant tensor of rank 1 is a vector that transforms as v̄i = (∂xj/∂x̄i) vj

A contravariant tensor of rank 1 is a vector that transforms as v̄i = (∂x̄i/∂xj) vj

A covariant tensor of rank 2 transforms as S̄ij = (∂xk/∂x̄i) (∂xl/∂x̄j) Skl

A contravariant tensor of rank 2 transforms as S̄ij = (∂x̄i/∂xk) (∂x̄j/∂xl) Skl

A mixed tensor of rank 2 transforms as S̄ij = (∂x̄i/∂xk) (∂xl/∂x̄j) Skl
Higher rank tensors can be defined in similar ways. In solid and fluid mechanics we nearly always use Cartesian tensors
(i.e. we work with the components of tensors in a Cartesian coordinate system) and this level of generality is not needed
(and is rather mysterious). We might occasionally use a curvilinear coordinate system, in which we do express tensors in
terms of covariant or contravariant components; this gives some sense of what these quantities mean. But since solid
and fluid mechanics live in Euclidean space we don't see some of the subtleties that arise, e.g. in the theory of general
relativity.

1.5 Creating a tensor using a dyadic product of two vectors.


Let a and b be two vectors. The dyadic product of a and b is a second order tensor S, denoted by

S = a ⊗ b

with the property

S · u = (a ⊗ b) · u = a (b · u)

for all vectors u. (Clearly, this maps u onto a vector parallel to a, with magnitude |a| (b · u).) In index notation, Sij = ai bj, so that

Sij uj = (ai bk) uk = ai (bk uk)

The components of a ⊗ b in a basis {e1, e2, e3} are


[a1 b1  a1 b2  a1 b3]
[a2 b1  a2 b2  a2 b3]
[a3 b1  a3 b2  a3 b3]

Note that not all tensors can be constructed using a dyadic product of only two vectors (this is because (a ⊗ b) · u always
has to be parallel to a, and therefore the representation cannot map a vector onto an arbitrary vector). However, if a, b,
and c are three linearly independent vectors (i.e. they do not all lie in the same plane), then every tensor can be constructed as a sum of scalar
multiples of the nine possible dyadic products of these vectors.
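As a sanity check (a sketch with made-up vectors, not part of the original notes), the defining property S · u = a (b · u) can be verified directly from the components Sij = ai bj:

```python
# Dyadic (outer) product of two vectors: S_ij = a_i * b_j
a = [1.0, 2.0, 3.0]
b = [4.0, -1.0, 2.0]
u = [0.5, 1.5, -2.0]

S = [[a[i] * b[j] for j in range(3)] for i in range(3)]

# S u computed from the components ...
Su = [sum(S[i][j] * u[j] for j in range(3)) for i in range(3)]

# ... equals a scaled by the scalar (b . u), so S u is always parallel to a
b_dot_u = sum(b[k] * u[k] for k in range(3))
assert all(abs(Su[i] - a[i] * b_dot_u) < 1e-12 for i in range(3))
```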

2. OPERATIONS ON SECOND ORDER TENSORS


Tensor components.
Let {e1 , e2 , e3 } be a Cartesian basis, and let S be a second order tensor. The components of S in {e1 , e2 , e3 } may be
represented as a matrix

      [S11 S12 S13]
      [S21 S22 S23]
      [S31 S32 S33]

where

S11 = e1 · (S · e1),    S12 = e1 · (S · e2),    S13 = e1 · (S · e3)
S21 = e2 · (S · e1),    S22 = e2 · (S · e2),    S23 = e2 · (S · e3)
S31 = e3 · (S · e1),    S32 = e3 · (S · e2),    S33 = e3 · (S · e3)

The representation of a tensor in terms of its components can also be expressed in dyadic form as

S = Σi=1..3 Σj=1..3 Sij ei ⊗ ej

This representation is particularly convenient when using polar coordinates, or when using a general non-orthogonal
coordinate system.

Addition
Let S and T be two tensors. Then U = S + T is also a tensor.
Denote the Cartesian components of U, S and T by matrices as defined above. The components of U are then related to
the components of S and T by

[U11 U12 U13]   [S11 + T11  S12 + T12  S13 + T13]
[U21 U22 U23] = [S21 + T21  S22 + T22  S23 + T23]
[U31 U32 U33]   [S31 + T31  S32 + T32  S33 + T33]

In index notation we would write

Uij = Sij + Tij

Product of a tensor and a vector


Let u be a vector and S a second order tensor. Then

v = S · u

is a vector.

Let (u1 , u2 , u3 ) and (v1 , v2 , v3 ) denote the components of vectors u and v in a Cartesian basis {e1 , e2 , e3 }, and denote the

Cartesian components of S as described above. Then

[v1]   [S11 S12 S13] [u1]   [S11 u1 + S12 u2 + S13 u3]
[v2] = [S21 S22 S23] [u2] = [S21 u1 + S22 u2 + S23 u3]
[v3]   [S31 S32 S33] [u3]   [S31 u1 + S32 u2 + S33 u3]

Alternatively, using index notation

vi = Sij uj

The product

v = u · S

is also a vector. In component form

[v1 v2 v3] = [u1 u2 u3] [S11 S12 S13]
                        [S21 S22 S23]
                        [S31 S32 S33]

= [u1 S11 + u2 S21 + u3 S31,  u1 S12 + u2 S22 + u3 S32,  u1 S13 + u2 S23 + u3 S33]

or, using index notation,

vi = uj Sji

Observe that u · S ≠ S · u (unless S is symmetric).
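A quick component-level check (an illustrative sketch; the sample S and u are made up) shows the two products vi = Sij uj and vi = uj Sji, and that they differ when S is not symmetric:

```python
S = [[1.0, 2.0, 0.0],
     [0.0, 3.0, 1.0],
     [4.0, 0.0, 2.0]]   # deliberately non-symmetric
u = [1.0, -1.0, 2.0]

# v_i = S_ij u_j   (S u)
Su = [sum(S[i][j] * u[j] for j in range(3)) for i in range(3)]
# v_i = u_j S_ji   (u S)
uS = [sum(u[j] * S[j][i] for j in range(3)) for i in range(3)]

# S is not symmetric here, so u S != S u
assert Su != uS
```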


Product of two tensors

Let T and S be two second order tensors. Then U = T S is also a tensor.


Denote the components of U, S and T by 3 × 3 matrices. Then,

[U11 U12 U13]   [T11 T12 T13] [S11 S12 S13]
[U21 U22 U23] = [T21 T22 T23] [S21 S22 S23]
[U31 U32 U33]   [T31 T32 T33] [S31 S32 S33]

  [T11 S11 + T12 S21 + T13 S31   T11 S12 + T12 S22 + T13 S32   T11 S13 + T12 S23 + T13 S33]
= [T21 S11 + T22 S21 + T23 S31   T21 S12 + T22 S22 + T23 S32   T21 S13 + T22 S23 + T23 S33]
  [T31 S11 + T32 S21 + T33 S31   T31 S12 + T32 S22 + T33 S32   T31 S13 + T32 S23 + T33 S33]

Alternatively, using index notation

Uij = Tik Skj

Note that tensor products, like matrix products, are not commutative; i.e. T · S ≠ S · T
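The component formula Uij = Tik Skj is just matrix multiplication, and non-commutativity is easy to exhibit (an illustrative sketch; T and S below are made-up examples):

```python
T = [[1.0, 2.0, 0.0], [0.0, 1.0, 3.0], [2.0, 0.0, 1.0]]
S = [[0.0, 1.0, 1.0], [2.0, 0.0, 0.0], [1.0, 1.0, 2.0]]

def matmul(A, B):
    # U_ij = A_ik B_kj
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

TS = matmul(T, S)
ST = matmul(S, T)
assert TS != ST   # tensor products are not commutative in general
```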
Transpose
Let S be a tensor. The transpose of S is denoted by S^T and is defined so that

u · S^T = S · u

for all vectors u.

Denote the components of S by a 3 × 3 matrix. The components of S^T are then

        [S11 S21 S31]
S^T ≡   [S12 S22 S32]
        [S13 S23 S33]

i.e. the rows and columns of the matrix are switched.


Note that, if A and B are two tensors, then

(A B)^T = B^T A^T


Trace
Let S be a tensor, and denote the components of S by a 3 × 3 matrix. The trace of S is denoted by tr(S) or trace(S), and
can be computed by summing the diagonal terms of the matrix of components

trace(S) = S11 + S22 + S33

More formally, let {e1, e2, e3} be any Cartesian basis. Then

trace(S) = e1 · (S · e1) + e2 · (S · e2) + e3 · (S · e3)

The trace of a tensor is an example of an invariant of the tensor: you get the same value for trace(S) whatever basis you
use to define the matrix of components of S.
In index notation, the trace is written Skk.
Contraction.
Inner product: Let S and T be two second order tensors. The inner product of S and T is a scalar, denoted by S : T.
Represent S and T by their components in a basis. Then

S : T = S11 T11 + S12 T12 + S13 T13
      + S21 T21 + S22 T22 + S23 T23
      + S31 T31 + S32 T32 + S33 T33

In index notation, S : T ≡ Sij Tij.
Observe that S : T = T : S, and also that S : I = trace(S), where I is the identity tensor.
Outer product: Let S and T be two second order tensors. The outer product of S and T is a scalar, denoted by S ·· T.
Represent S and T by their components in a basis. Then

S ·· T = S11 T11 + S21 T12 + S31 T13
       + S12 T21 + S22 T22 + S32 T23
       + S13 T31 + S23 T32 + S33 T33

In index notation, S ·· T ≡ Sij Tji.
Observe that S ·· T = S^T : T
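Both contractions, and the identity S : I = trace(S), can be checked directly from components (a sketch; S and T below are made-up examples):

```python
S = [[1.0, 2.0, 3.0], [0.0, 1.0, 4.0], [5.0, 6.0, 0.0]]
T = [[2.0, 0.0, 1.0], [1.0, 3.0, 0.0], [5.0, 2.0, 4.0]]

inner = sum(S[i][j] * T[i][j] for i in range(3) for j in range(3))   # S : T  (Sij Tij)
outer = sum(S[i][j] * T[j][i] for i in range(3) for j in range(3))   # S .. T (Sij Tji)

# S : I = trace(S)
I = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]
trace_S = S[0][0] + S[1][1] + S[2][2]
assert sum(S[i][j] * I[i][j] for i in range(3) for j in range(3)) == trace_S

# S .. T equals transpose(S) : T
ST_inner = sum(S[j][i] * T[i][j] for i in range(3) for j in range(3))
assert outer == ST_inner
```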
Determinant
The determinant of a tensor is defined as the determinant of the matrix of its components in a basis. For a second order
tensor

              [S11 S12 S13]
det S = det   [S21 S22 S23]
              [S31 S32 S33]

= S11 (S22 S33 - S23 S32) + S12 (S23 S31 - S21 S33) + S13 (S21 S32 - S31 S22)

In index notation this would read

det(S) = (1/6) εijk εlmn Sli Smj Snk


Note that if S and T are two tensors, then

det(S) = det (ST )

det(S T) = det(S) det(T)
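The index-notation formula for the determinant can be evaluated directly by summing over all six indices. The sketch below (a made-up sample matrix, not from the notes) compares it against the cofactor expansion quoted above:

```python
def eps(i, j, k):
    # Levi-Civita symbol for indices in {0, 1, 2}
    return (i - j) * (j - k) * (k - i) // 2

def det3(S):
    # det(S) = (1/6) eps_ijk eps_lmn S_li S_mj S_nk
    return sum(eps(i, j, k) * eps(l, m, n) * S[l][i] * S[m][j] * S[n][k]
               for i in range(3) for j in range(3) for k in range(3)
               for l in range(3) for m in range(3) for n in range(3)) / 6.0

S = [[2.0, 0.0, 1.0],
     [1.0, 3.0, 0.0],
     [0.0, 2.0, 4.0]]

# cofactor expansion along the first row, for comparison
det_cof = (S[0][0] * (S[1][1] * S[2][2] - S[1][2] * S[2][1])
           + S[0][1] * (S[1][2] * S[2][0] - S[1][0] * S[2][2])
           + S[0][2] * (S[1][0] * S[2][1] - S[2][0] * S[1][1]))
assert abs(det3(S) - det_cof) < 1e-12
```

The closed form (i - j)(j - k)(k - i)/2 reproduces the permutation symbol for indices in {0, 1, 2}, which keeps the example free of lookup tables.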

Inverse
Let S be a second order tensor. The inverse of S exists if and only if det(S) ≠ 0, and is defined by

S^-1 · S = I

where S^-1 denotes the inverse of S and I is the identity tensor.

The inverse of a tensor may be computed by calculating the inverse of the matrix of its components. Formally, the inverse
of a second order tensor can be written in a simple form using index notation as

(S^-1)ji = (1/(2 det(S))) εipq εjkl Spk Sql

In practice it is usually faster to compute the inverse using methods such as Gaussian elimination.
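The index formula for the inverse can also be evaluated by brute-force summation; the sketch below (made-up sample matrix) verifies S^-1 S = I:

```python
def eps(i, j, k):
    # Levi-Civita symbol for indices in {0, 1, 2}
    return (i - j) * (j - k) * (k - i) // 2

def det3(S):
    # det(S) = eps_ijk S_i1 S_j2 S_k3
    return sum(eps(i, j, k) * S[i][0] * S[j][1] * S[k][2]
               for i in range(3) for j in range(3) for k in range(3))

def inv3(S):
    # (S^-1)_ji = 1/(2 det S) eps_ipq eps_jkl S_pk S_ql
    d = det3(S)
    Sinv = [[0.0] * 3 for _ in range(3)]
    for i in range(3):
        for j in range(3):
            Sinv[j][i] = sum(eps(i, p, q) * eps(j, k, l) * S[p][k] * S[q][l]
                             for p in range(3) for q in range(3)
                             for k in range(3) for l in range(3)) / (2.0 * d)
    return Sinv

S = [[2.0, 0.0, 1.0], [1.0, 3.0, 0.0], [0.0, 2.0, 4.0]]
Sinv = inv3(S)

# check S^-1 S = I
prod = [[sum(Sinv[i][k] * S[k][j] for k in range(3)) for j in range(3)] for i in range(3)]
for i in range(3):
    for j in range(3):
        assert abs(prod[i][j] - (1.0 if i == j else 0.0)) < 1e-12
```

As the notes say, Gaussian elimination is faster in practice; the point here is only that the index formula really does produce the inverse.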

Change of Basis.
Let S be a tensor, and let {e1, e2, e3} be a Cartesian basis. Suppose that the components of S in the basis {e1, e2, e3} are
known to be

          [S11^(e)  S12^(e)  S13^(e)]
[S^(e)] = [S21^(e)  S22^(e)  S23^(e)]
          [S31^(e)  S32^(e)  S33^(e)]

Now, suppose that we wish to compute the components of S in a second Cartesian basis, {m1, m2, m3}. Denote these
components by

          [S11^(m)  S12^(m)  S13^(m)]
[S^(m)] = [S21^(m)  S22^(m)  S23^(m)]
          [S31^(m)  S32^(m)  S33^(m)]

To do so, first compute the components of the transformation matrix [Q]

        [m1·e1  m1·e2  m1·e3]
[Q] =   [m2·e1  m2·e2  m2·e3]
        [m3·e1  m3·e2  m3·e3]

(this is the same matrix you would use to transform vector components from {e1, e2, e3} to {m1, m2, m3}). Then

[S^(m)] = [Q] [S^(e)] [Q]^T

or, written out in full,

[S11^(m) S12^(m) S13^(m)]   [m1·e1 m1·e2 m1·e3] [S11^(e) S12^(e) S13^(e)] [m1·e1 m2·e1 m3·e1]
[S21^(m) S22^(m) S23^(m)] = [m2·e1 m2·e2 m2·e3] [S21^(e) S22^(e) S23^(e)] [m1·e2 m2·e2 m3·e2]
[S31^(m) S32^(m) S33^(m)]   [m3·e1 m3·e2 m3·e3] [S31^(e) S32^(e) S33^(e)] [m1·e3 m2·e3 m3·e3]

To prove this result, let u and v be vectors satisfying

v = S · u

Denote the components of u and v in the two bases by u^(e), u^(m) and v^(e), v^(m), respectively. Recall that the vector
components are related by

u^(m) = [Q] u^(e)        v^(m) = [Q] v^(e)
u^(e) = [Q]^T u^(m)      v^(e) = [Q]^T v^(m)

Now, we could express the tensor-vector product in either basis

v^(m) = [S^(m)] u^(m)        v^(e) = [S^(e)] u^(e)

Substituting for u^(e), v^(e) from above into the second of these two relations, we see that

[Q]^T v^(m) = [S^(e)] [Q]^T u^(m)

Recall that

[Q] [Q]^T = [I]

so multiplying both sides by [Q] shows that

v^(m) = [Q] [S^(e)] [Q]^T u^(m)

so, comparing with the first of the two relations above,

[S^(m)] = [Q] [S^(e)] [Q]^T

as stated.
In index notation, we would write

Sij^(m) = Qik Skl^(e) Qjl        Qij = mi · ej

Another, perhaps cleaner, way to derive this result is to expand the two tensors as the appropriate dyadic products of the
basis vectors

Skl^(m) mk ⊗ ml = Skl^(e) ek ⊗ el
mi · [Skl^(m) mk ⊗ ml] · mj = mi · [Skl^(e) ek ⊗ el] · mj
Sij^(m) = (mi · ek) Skl^(e) (el · mj)
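The transformation rule [S^(m)] = [Q][S^(e)][Q]^T is easy to exercise numerically. In the sketch below (an illustration with a made-up tensor; a rotation about e3 is used for Q), the components change under the basis change but the trace does not:

```python
import math

# Rotation of the basis about e3 by angle t: the rows of Q are the new basis
# vectors m_i expressed in {e1, e2, e3}, so Q_ij = m_i . e_j
t = 0.3
Q = [[math.cos(t), math.sin(t), 0.0],
     [-math.sin(t), math.cos(t), 0.0],
     [0.0, 0.0, 1.0]]

S_e = [[1.0, 2.0, 0.0], [2.0, 3.0, 1.0], [0.0, 1.0, 4.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

QT = [[Q[j][i] for j in range(3)] for i in range(3)]
S_m = matmul(matmul(Q, S_e), QT)          # [S^(m)] = [Q][S^(e)][Q]^T

# individual components change, but the trace is an invariant
tr_e = sum(S_e[i][i] for i in range(3))
tr_m = sum(S_m[i][i] for i in range(3))
assert abs(tr_e - tr_m) < 1e-12
```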

Invariants
Invariants of a tensor are scalar functions of the tensor components which remain constant under a basis change. That is
to say, the invariant has the same value when computed in two arbitrary bases {e1 , e2 , e3 } and {m1 , m2 , m3 } . A
symmetric second order tensor always has three independent invariants.
Examples of invariants are
1. The three eigenvalues
2. The determinant
3. The trace
4. The inner and outer products
These are not all independent: for example, any of 2-4 can be calculated in terms of 1.

In practice, the most commonly used invariants are:

I1 = trace(S) = Skk
I2 = (1/2) (trace(S)^2 - S ·· S) = (1/2) (Sii Sjj - Sij Sji)
I3 = det(S) = (1/6) εijk εpqr Sip Sjq Skr = εijk Si1 Sj2 Sk3
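The three index-notation formulas translate directly into code (a sketch; the symmetric sample tensor is made up):

```python
def eps(i, j, k):
    # Levi-Civita symbol for indices in {0, 1, 2}
    return (i - j) * (j - k) * (k - i) // 2

S = [[2.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]

I1 = sum(S[k][k] for k in range(3))                                    # S_kk
outer = sum(S[i][j] * S[j][i] for i in range(3) for j in range(3))     # S .. S
I2 = 0.5 * (I1 * I1 - outer)
I3 = sum(eps(i, j, k) * S[i][0] * S[j][1] * S[k][2]                    # eps_ijk S_i1 S_j2 S_k3
         for i in range(3) for j in range(3) for k in range(3))
```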


Eigenvalues and Eigenvectors (Principal values and direction)


Let S be a second order tensor. The scalars λ and unit vectors m which satisfy

S · m = λ m

are known as the eigenvalues and eigenvectors of S, or the principal values and principal directions of S. Note that λ may
be complex. For a second order tensor in three dimensions, there are generally three values of λ and three unique unit
vectors m which satisfy this equation. Occasionally, there may be only two or one distinct value of λ. If this is the case, there are
infinitely many possible vectors m that satisfy the equation. The eigenvalues of a tensor, and the components of the
eigenvectors, may be computed by finding the eigenvalues and eigenvectors of the matrix of components.
The eigenvalues of a symmetric tensor are always real, and its eigenvectors are mutually perpendicular (these two results
are important and are proved below). The eigenvalues of a skew tensor are always pure imaginary or zero.
The eigenvalues of a second order tensor are computed using the condition det(S - λI) = 0. This yields a cubic equation,
which can be expressed as

λ^3 - I1 λ^2 + I2 λ - I3 = 0

I1 = trace(S),    I2 = (I1^2 - S ·· S)/2,    I3 = det(S)

There are various ways to solve the resulting cubic equation explicitly; a solution for symmetric S is given below, but the
results for a general tensor are too messy to be given here. The eigenvectors are then computed from the condition
(S - λI) · m = 0.
The Cayley-Hamilton Theorem
Let S be a second order tensor and let I1 = trace(S), I2 = (I1^2 - S ·· S)/2, I3 = det(S) be the three invariants. Then

S^3 - I1 S^2 + I2 S - I3 I = 0

(i.e. a tensor satisfies its own characteristic equation). There is an obscure trick to show this. Consider the tensor
S - λI (where λ is an arbitrary scalar), and let T be the adjoint of S - λI (the adjoint is just the inverse multiplied by the
determinant), which satisfies

T (S - λI) = det(S - λI) I = (-λ^3 + I1 λ^2 - I2 λ + I3) I

Assume that T = T1 λ^2 + T2 λ + T3. Substituting into the preceding equation and equating powers of λ shows that

T1 = I        T1 S - T2 = I1 I        T3 - T2 S = I2 I        T3 S = I3 I

Use these to substitute for I1, I2, I3 into

S^3 - I1 S^2 + I2 S - I3 I = S^3 - (S - T2) S^2 + (T3 - T2 S) S - T3 S = 0
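The theorem is easy to confirm numerically (a sketch; the sample tensor is made up): form S^2, S^3 and the invariants, and check that the combination vanishes.

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

S = [[2.0, 1.0, 0.0],
     [0.0, 1.0, 3.0],
     [1.0, 0.0, 2.0]]

S2 = matmul(S, S)
S3 = matmul(S2, S)

I1 = sum(S[i][i] for i in range(3))
I2 = 0.5 * (I1 * I1 - sum(S[i][j] * S[j][i] for i in range(3) for j in range(3)))
I3 = (S[0][0] * (S[1][1] * S[2][2] - S[1][2] * S[2][1])
      + S[0][1] * (S[1][2] * S[2][0] - S[1][0] * S[2][2])
      + S[0][2] * (S[1][0] * S[2][1] - S[2][0] * S[1][1]))

# S^3 - I1 S^2 + I2 S - I3 I should vanish (Cayley-Hamilton)
res = [[S3[i][j] - I1 * S2[i][j] + I2 * S[i][j] - I3 * (1.0 if i == j else 0.0)
        for j in range(3)] for i in range(3)]
assert all(abs(res[i][j]) < 1e-9 for i in range(3) for j in range(3))
```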

3 SPECIAL TENSORS
Identity tensor. The identity tensor I is the tensor such that, for any tensor S or vector v,

I · v = v · I = v
S · I = I · S = S

In any basis, the identity tensor has components

[1 0 0]
[0 1 0]
[0 0 1]

Symmetric Tensor. A symmetric tensor S has the property

S = S^T

The components of a symmetric tensor have the form

[S11 S12 S13]
[S12 S22 S23]
[S13 S23 S33]

so that there are only six independent components of the tensor, instead of nine. Symmetric tensors have some nice
properties:

The eigenvectors of a symmetric tensor with distinct eigenvalues are orthogonal. To see this, let u, v be two
eigenvectors, with corresponding eigenvalues λu, λv. Then
v · [S · u] = u · [S^T · v] = u · [S · v]  ⇒  λu v · u = λv u · v  ⇒  (λu - λv) u · v = 0  ⇒  u · v = 0.

The eigenvalues of a symmetric tensor are real. To see this, suppose that λ, u are a complex
eigenvalue/eigenvector pair, and let λ̄, ū denote their complex conjugates. Then, by definition,

S · u = λ u        S · ū = λ̄ ū

and hence ū · [S · u] = λ ū · u and u · [S · ū] = λ̄ u · ū. But note that for a symmetric
tensor ū · [S · u] = u · [S^T · ū] = u · [S · ū]. Thus λ ū · u = λ̄ u · ū  ⇒  λ = λ̄.

The eigenvalues of a symmetric tensor can be computed as

λk = I1/3 + (2/3) √(-3p) cos{ (1/3) cos^-1[ (3q/(2p)) √(-3/p) ] - 2π(k - 1)/3 },    k = 1, 2, 3

where

p = I2 - I1^2/3        q = -(2 I1^3 - 9 I1 I2 + 27 I3)/27

I1 = trace(S)        I2 = (1/2)(I1^2 - S : S)        I3 = det(S)
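The trigonometric formula can be implemented directly with the standard math library. In the sketch below (a made-up symmetric tensor whose eigenvalues happen to be 1, 2 and 4), each computed λ is checked against the characteristic equation:

```python
import math

S = [[2.0, 1.0, 0.0],
     [1.0, 3.0, 1.0],
     [0.0, 1.0, 2.0]]   # symmetric sample tensor

I1 = S[0][0] + S[1][1] + S[2][2]
SS = sum(S[i][j] * S[i][j] for i in range(3) for j in range(3))   # S : S
I2 = 0.5 * (I1 * I1 - SS)
I3 = (S[0][0] * (S[1][1] * S[2][2] - S[1][2] * S[2][1])
      - S[0][1] * (S[1][0] * S[2][2] - S[1][2] * S[2][0])
      + S[0][2] * (S[1][0] * S[2][1] - S[1][1] * S[2][0]))

p = I2 - I1 * I1 / 3.0
q = -(2.0 * I1**3 - 9.0 * I1 * I2 + 27.0 * I3) / 27.0

# lambda_k = I1/3 + (2/3) sqrt(-3p) cos{ (1/3) acos[(3q/(2p)) sqrt(-3/p)] - 2 pi (k-1)/3 }
lams = [I1 / 3.0 + (2.0 / 3.0) * math.sqrt(-3.0 * p)
        * math.cos(math.acos(1.5 * (q / p) * math.sqrt(-3.0 / p)) / 3.0
                   - 2.0 * math.pi * (k - 1) / 3.0)
        for k in (1, 2, 3)]

# each lambda should satisfy the characteristic equation
for lam in lams:
    assert abs(lam**3 - I1 * lam**2 + I2 * lam - I3) < 1e-9
```

In practice one would call a library eigensolver such as numpy.linalg.eigh for symmetric matrices; the point here is that the closed form above reproduces the same roots.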

The eigenvectors can then be found by back-substitution into [S - λI] · m = 0. To do this, note that the matrix equation
can be written as

[S11 - λ   S12       S13     ] [m1]   [0]
[S12       S22 - λ   S23     ] [m2] = [0]
[S13       S23       S33 - λ ] [m3]   [0]

Since the determinant of the matrix is zero, we can discard any row in the equation system and take any column over to
the right hand side. For example, if the tensor has at least one eigenvector with m3 ≠ 0, then the values of m1, m2 for this
eigenvector can be found by discarding the third row and writing

[S11 - λ   S12     ] [m1]         [S13]
[S12       S22 - λ ] [m2] = -m3   [S23]

Spectral decomposition of a symmetric tensor. Let S be a symmetric second order tensor, and let {λi, ei} be the
three eigenvalues and eigenvectors of S. Then S can be expressed as

S = Σi=1..3 λi ei ⊗ ei

To see this, note that S can always be expanded as a sum of nine dyadic products of an orthonormal basis,
S = Sij ei ⊗ ej. But since the ei are eigenvectors, it follows that

Smk = em · (Sij ei ⊗ ej) · ek = λk if m = k, and 0 if m ≠ k.
Skew Tensor. A skew tensor S has the property

S^T = -S

The components of a skew tensor have the form


[ 0      S12    S13]
[-S12    0      S23]
[-S13   -S23    0  ]
Every second-order skew tensor has a dual vector w that satisfies

S · u = w × u

for all vectors u. You can see this by noting that S12 = -w3, S13 = w2, S23 = -w1, and expanding out the tensor and
cross products explicitly. In index notation, we can also write

wi = -(1/2) εijk Sjk        Sij = -εijk wk
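The dual-vector relations can be verified componentwise (a sketch; w and u below are made-up values): build S from w, check S · u against the cross product, then recover w from S.

```python
def eps(i, j, k):
    # Levi-Civita symbol for indices in {0, 1, 2}
    return (i - j) * (j - k) * (k - i) // 2

w = [1.0, -2.0, 0.5]

# S_ij = -eps_ijk w_k  (skew tensor with dual vector w)
S = [[-sum(eps(i, j, k) * w[k] for k in range(3)) for j in range(3)] for i in range(3)]

u = [0.3, 1.0, -2.0]
Su = [sum(S[i][j] * u[j] for j in range(3)) for i in range(3)]
w_cross_u = [w[1] * u[2] - w[2] * u[1],
             w[2] * u[0] - w[0] * u[2],
             w[0] * u[1] - w[1] * u[0]]
assert all(abs(Su[i] - w_cross_u[i]) < 1e-12 for i in range(3))

# recover the dual vector: w_i = -(1/2) eps_ijk S_jk
w_rec = [-0.5 * sum(eps(i, j, k) * S[j][k] for j in range(3) for k in range(3))
         for i in range(3)]
assert all(abs(w_rec[i] - w[i]) < 1e-12 for i in range(3))
```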

Orthogonal Tensors An orthogonal tensor R has the property

R · R^T = R^T · R = I        R^-1 = R^T

An orthogonal tensor must have det(R) = ±1; a tensor with det(R) = +1 is known as a proper orthogonal tensor.
Orthogonal tensors also have some interesting and useful properties:
Orthogonal tensors map a vector onto another vector with the same length. To see this, let u be an arbitrary
vector. Then, note that |R · u|^2 = [R · u] · [R · u] = u · R^T R · u = u · u = |u|^2
The eigenvalues of an orthogonal tensor are ±1 or e^{±iθ} for some value of θ. To see this, let u be an eigenvector, with
corresponding eigenvalue λ. By definition, R · u = λ u. Hence

[R · ū] · [R · u] = ū · u  ⇒  λ λ̄ ū · u = ū · u  ⇒  |λ|^2 = 1

so every eigenvalue has unit modulus. Since the characteristic equation is
cubic, there must be at most three eigenvalues, and at least one eigenvalue must be real.
Proper orthogonal tensors can be visualized physically as rotations. A rotation can also be represented in several other
forms besides a proper orthogonal tensor. For example

The Rodrigues representation quantifies a rotation as an angle of rotation θ (in radians) about some axis n
(specified by a unit vector). Given R, there are various ways to compute n and θ. For example, one way would
be to find the eigenvalues and the real eigenvector. The real eigenvector (suitably normalized) must correspond to
n; the complex eigenvalues give e^{±iθ}. A faster method is to note that

trace(R) = 1 + 2 cos θ        2 sin θ n = dual(R - R^T)

Alternatively, given n and θ, R can be computed from

R = cos θ I + (1 - cos θ) n ⊗ n + sin θ W

where W is the skew tensor that has n as its dual vector, i.e. Wij = -εijk nk. In index notation, this formula is

Rij = cos θ δij + ni nj (1 - cos θ) - sin θ εijk nk
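The index form of the Rodrigues formula translates directly into code. The sketch below (made-up axis and angle; a rotation about e3 is used so the result is easy to recognize) builds R and checks the two properties stated above:

```python
import math

def eps(i, j, k):
    # Levi-Civita symbol for indices in {0, 1, 2}
    return (i - j) * (j - k) * (k - i) // 2

theta = 0.7
n = [0.0, 0.0, 1.0]   # unit rotation axis

c, s = math.cos(theta), math.sin(theta)
# R_ij = cos(theta) delta_ij + n_i n_j (1 - cos(theta)) - sin(theta) eps_ijk n_k
R = [[c * (1.0 if i == j else 0.0) + n[i] * n[j] * (1.0 - c)
      - s * sum(eps(i, j, k) * n[k] for k in range(3))
      for j in range(3)] for i in range(3)]

# R is orthogonal: R R^T = I ...
RRT = [[sum(R[i][k] * R[j][k] for k in range(3)) for j in range(3)] for i in range(3)]
for i in range(3):
    for j in range(3):
        assert abs(RRT[i][j] - (1.0 if i == j else 0.0)) < 1e-12

# ... and trace(R) = 1 + 2 cos(theta)
assert abs(sum(R[i][i] for i in range(3)) - (1.0 + 2.0 * c)) < 1e-12
```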

Another useful result is the Polar Decomposition Theorem, which states that any invertible second order tensor can be
expressed as a product of a symmetric tensor with an orthogonal tensor:

A = R U = V R        R R^T = I        U = U^T,  V = V^T

Moreover, the tensors R, U, V are unique. To see this, note that:

A^T A is symmetric and has positive eigenvalues (to see that it is symmetric, simply take the transpose,
and to see that the eigenvalues are positive, note that dx · (A^T A) · dx = |A · dx|^2 > 0 for all vectors dx ≠ 0).

Let λk^2 and mk be the three eigenvalues and eigenvectors of A^T A. Since the eigenvectors are orthogonal, we can
write

A^T A = Σk=1..3 λk^2 mk ⊗ mk

We can then set

U = Σk=1..3 λk mk ⊗ mk

and define R = A U^-1. U is clearly symmetric, and also U^2 = A^T A. To
see that R is orthogonal, note that R^T R = U^-1 A^T A U^-1 = U^-1 U^2 U^-1 = I (using the symmetry of U, so that U^-T = U^-1).


Given that U and R exist, we can write R U = [R U R^T] R, so if we define V = R U R^T then A = V R. It is
easy to show that V is symmetric.
To see that the decomposition is unique, suppose that A = R̃ Ũ for some other tensors R̃, Ũ. Then
A^T A = Ũ^2. But A^T A has a unique symmetric positive definite square root, so Ũ = U. The uniqueness of R follows immediately.
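The relations used in the proof can be exercised numerically. The sketch below (an illustration with made-up factors, not a general polar-decomposition solver) builds A = R U from a known rotation and a known symmetric positive definite U, then checks A^T A = U^2 and that A U^-1 is orthogonal:

```python
import math

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)] for i in range(3)]

def transpose(A):
    return [[A[j][i] for j in range(3)] for i in range(3)]

t = 0.4
R = [[math.cos(t), -math.sin(t), 0.0],
     [math.sin(t), math.cos(t), 0.0],
     [0.0, 0.0, 1.0]]                                        # known rotation
U = [[2.0, 0.0, 0.0], [0.0, 3.0, 0.0], [0.0, 0.0, 1.5]]      # known symmetric PD stretch

A = matmul(R, U)

# A^T A = U^2, as used in the proof above
ATA = matmul(transpose(A), A)
U2 = matmul(U, U)
for i in range(3):
    for j in range(3):
        assert abs(ATA[i][j] - U2[i][j]) < 1e-12

# R = A U^-1 is orthogonal (U is diagonal here, so U^-1 is trivial)
Uinv = [[0.5, 0.0, 0.0], [0.0, 1.0 / 3.0, 0.0], [0.0, 0.0, 1.0 / 1.5]]
R_rec = matmul(A, Uinv)
RtR = matmul(transpose(R_rec), R_rec)
for i in range(3):
    for j in range(3):
        assert abs(RtR[i][j] - (1.0 if i == j else 0.0)) < 1e-12
```

A general implementation would compute U as the symmetric square root of A^T A via its spectral decomposition, exactly as in the proof.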
