EARTH SYSTEM MODELING

ACCESS: AUSTRALIA'S CONTRIBUTION TO THE ISERVO INSTITUTE'S DEVELOPMENT
The new Australian Computational Earth Systems Simulator research facility provides a
virtual laboratory for studying the solid earth and its complex system behavior. The
facility’s capabilities complement those developed by overseas groups, thereby creating the
infrastructure for an international computational solid earth research virtual observatory.
Scientific advancement requires feedback between a predictive theory or simulation model and observation. Comprehensive simulation models that can provide such a predictive capability for solid earth phenomena are only now becoming feasible due to advances in numerical simulation and parallel supercomputer technologies. Coupled with the major national programs being developed in Australia, China, Japan, and the US, and combined with related observational programs, these models can provide the required computational infrastructure.
The APEC Cooperation for Earthquake Simulation (ACES; www.aces.org.au) was established in
October 1997 under the auspices of the Asia-Pacific
Economic Cooperation by the APEC Industrial Science and Technology Working Group in recognition of this new opportunity and the complementary strengths of the different national research programs. ACES aims to develop simulation models for the complete physics of earthquakes, foster collaboration between the complementary national programs, and foster development of a research infrastructure (supercomputers, software, and numerical methods for earth simulation). ACES brings together observational, theoretical, and computational seismologists; computational scientists; physicists; geologists; computer scientists; and laboratory researchers to work toward this common goal.1,2

1521-9615/05/$20.00 © 2005 IEEE
Copublished by the IEEE CS and the AIP
JULY/AUGUST 2005

PETER MORA, HANS MÜHLHAUS, LUTZ GROSS, HUILIN XING, AND DION WEATHERLEY
University of Queensland, Australia

STEFFEN ABE AND SHANE LATHAM
Australian Computational Earth Systems Simulator

LOUIS MORESI
Monash University
As a follow-up to this international cooperation,
ACES participants have agreed to establish the international Solid Earth Research Virtual Observatory (iSERVO), a frontier institute for solid earth
systems research (see the “iSERVO: International
Institute” sidebar for more details). iSERVO will
extensively use Web and Grid technologies to let
both researchers affiliated with the institute and external users effectively collaborate and run the simulation models and software the institute develops.
Australia’s contribution involves a new national research facility—the Australian Computational
Earth Systems Simulator (ACcESS). The new facility will consist of a high-level computational
framework, parallel software, and supercomputer
hardware for solid earth simulation. The ACcESS
simulator facility includes capabilities to model
rocks and granular systems at the particle scale, the
iSERVO: International Institute
Since the commencement of ACES, Japan has established the Earth Simulator—the world's fastest supercomputer at the time of its installation—for simulating Earth
processes and has established a new Center of Excellence
for predicting the multiscale earth system’s evolution and
variation. Australia’s Australian Computational Earth Systems
Simulator (ACcESS) was funded with the goal of developing
numerical models, parallel software, and supercomputer
hardware to enable large-scale simulation of solid earth
processes from the microscopic to the global scale. In the
US, NASA’s Jet Propulsion Laboratory, in collaboration with
the GEM Group and national science centers, is developing
a computational infrastructure to model the physics of the
California fault system. In China, national programs involving the China Earthquake Authority’s Department of Earthquake Science, the Chinese Academy of Sciences’ Laboratory of Nonlinear Mechanics, and Peking University are
underway to study the physics of catastrophic failure in
rocks and link these to seismic observations.
As a follow-up to the ACES cooperation and to take best
advantage of complementary national programs and infrastructure, ACES participants have agreed to establish a research frontier institute for solid earth system simulation
called the international Solid Earth Research Virtual Observa-
tory, or iSERVO (www.iservo.edu.au). The institute will extensively use the Web, computational Grid technologies,
and multitiered information architectures to let researchers
manipulate simulation models and data by symbolic means
in a way not previously possible. It will also involve developing a national node in each participating country linked via
Web-based services and portals to enable access to the different numerical simulation models, simulation software, geological models, and data constraints contributed by Australia, Japan, and the US. The models and data sets accessible within iSERVO will include several "standard" crustal fault system
models (such as strike-slip, intraplate, and subduction).
The iSERVO Grid is being constructed from Web services
to be consistent with Grid Forum standards. The system will
use distributed computing including high-performance computers and distributed heterogeneous databases. These will
be accessible through portals exploiting the new portlet standards. The institute nodes in each participating country will
have research, software, and technical staff to drive continued development of the institute’s virtual laboratory infrastructure, conduct high-impact simulation-based research of
national importance using the virtual laboratory, and interface with the broader user community including the scientific community affiliated with the institute, external research
users, government agencies, and industry users.
dynamics of crustal fault systems, and geological
processes such as fault formation, basin evolution,
mountain building, and global geodynamics. The
simulator system also has a high-level scripting language that insulates researchers from the parallel
software layer and lets researchers easily change
models without becoming experts in parallel computing. Moreover, it includes a cellular automaton
module to let researchers study the Earth’s complex
system behavior.
Initial ACES Work

Starting in 1994, one of the authors (Peter Mora) began to develop the Queensland University Advanced Centre for Earthquake Studies (Quakes), a university research center, and an infrastructure in Australia for simulating the physics of earthquakes. The initial effort focused on developing a new particle-based simulation model termed the Lattice Solid Model (LSM) that lets researchers study earthquake nucleation and fault-zone processes.3–7 The model simulated fracturing behavior, frictional processes of rough fault surfaces, and fault gouge zones5 as well as frictional heating and heat transfer, thermomechanical coupling effects, and the effect of pore fluid migration resulting from pressure gradients induced by the frictional heating of pore fluids.3,6

Recognizing the magnitude of simulating all the Earth's processes led the center's developers to seek collaboration with complementary national and international efforts. This culminated in establishing the ACES international cooperation to link international efforts and the founding of the ACcESS national research facility. The simulation programs in Australia that merged to form ACcESS included

• particle-scale simulation,
• crustal fault system and seismic wave simulation,
• geological processes and geodynamics simulation,
• surface processes simulation,
• tectonic reconstruction, and
• visualization and parallel software.

These Australian solid earth simulation efforts involve the University of Queensland, Monash University, the Victorian Partnership for Advanced Computing, Australian National University, Melbourne University, RMIT University, and the University of Western Australia.
COMPUTING IN SCIENCE & ENGINEERING
ACcESS Development
ACcESS (see www.access.edu.au) is one of the two
new national facilities in the physical sciences that
were established under the Commonwealth government’s Major National Research Facilities
(MNRF) 2003–2007 program. Headquartered at
the Earth Systems Science Computational Centre
(www.esscc.uq.edu.au)—formerly at Quakes—at
the University of Queensland, the AUS$15 million
facility is being funded by the Australian Commonwealth government, the Queensland government, the Victorian government, a consortium of
the participant universities, two governmental
agencies (the Commonwealth Scientific and Industrial Research Organisation and Queensland
Department of Main Roads), and a leading parallel supercomputer vendor (SGI).
The ACcESS simulator consists of software systems and numerical models for multiscale, multiphysics simulations of solid earth systems and
thematic supercomputer hardware. The simulator
provides the means to simulate the dynamics of the
entire Earth, including processes ranging from mantle and crustal evolution, to crustal dynamics and
mineralization, to granular and fault-zone dynamics.
The simulator parallel software system includes
particle simulation modules, a continuum solver library (Finley) based on finite elements, and various
other modules, including a second continuum
solver package based on the particle-in-cell and
finite element methods (Snark) and a plate-tectonics reconstruction package (PlatyPlus). Users
control the simulator system through GUIs and a
high-level scripting interface (Escript) that lets
them easily specify and map the partial differential
equations (PDEs) and particle simulations onto the
parallel software system. A cellular automaton and
complex system module (ESyS-CA) lets researchers readily conduct phase-space studies and
summarize results on an automatically generated
Web page.
The ACcESS software enables simulation at
scales from the microscale (Figure 1), to the crustal
scale (Figure 2),8 up to the mantle and global scales
(Figure 3).
The ACcESS facility’s goal is to provide predictive
modeling capabilities to Australian researchers and
international collaborators that drive scientific advancement and breakthroughs in solid earth systems
science and technology. Examples include quantum
leaps in understanding the Earth’s evolution at global,
crustal, regional, and microscopic scales; predicting
geological processes such as tectonics and mineralization; understanding of rock failure, leading to mining technology innovations; new understanding of hot dry rock geothermal reservoir systems; and new knowledge of the physics of the crustal fault systems required to forecast earthquakes.

Figure 1. Image showing a 3D particle-based simulation of the fracture and collapse of a rock structure. The rock was represented as an assemblage of random-sized grains cemented together by elastic-brittle bonds, with horizontal bands of color added at the start of the simulation to let researchers easily visualize the fracturing. The simulation was generated by the ACcESS LSM software module.

Figure 2. Image from a 3D simulation of the South Australian fault system.8 The different colors represent velocity.
In 2003, SGI and the Earth Systems Science
Computational Centre installed the ACcESS supercomputer hardware at the University of
Queensland. It consists of a 208-processor SGI Altix 3700 parallel supercomputer with a peak performance of 1.1 Tflops (1.1 × 10^12 floating-point operations per second), 5 Tbytes of disc, and 208 Gbytes of memory. The Queensland government's Smart State Research Facilities Fund funded the supercomputer, and it is available for solid earth simulation research by Australian scientists and international collaborators.

Figure 3. Image from a 3D mantle-convection simulation showing velocity streamlines.9,10 Colors represent the magnitude of local velocity (see legend); the Rayleigh number is Ra = 10^6. The normal velocities are fixed on the boundaries, and the temperatures on the top and bottom of the model are prescribed.
Escript: A High-Level
Computational Framework
A correct model makes computer simulation successful, but getting the model right is not easy—
particularly in geosciences. We can quickly add an
extra term into a differential equation or introduce
a new equation or effect into a numerical model on
paper. However, getting this type of change into a
computer program can be a hard and time-consuming exercise, despite the extensive list of
available numerical libraries and environments.
Most earth scientists and solid earth system modelers are more focused on scientific questions relating to the solid earth and aren’t highly skilled at
software development. This has led many researchers to either search for analytical or numerical solutions for very simple cases—not really what
we want to do—or rely on data analysis to motivate
cartoon-like pictures to describe the earth system’s
physics and evolution. Alternatively, researchers
use whatever existing simulation capability is easy
for them to access, even if the numerical model
might have undesired limitations for the research
question being posed.
How do we overcome this? The quick answer is
to teach earth scientists and modelers how to program, but it's worth looking a bit deeper into the
way we implement numerical simulations. In fact,
a tight link exists between the numerical methods
and data representation with the actual model. A
typical example is classical finite element codes,
which keep connectivity information and physical
properties in the same data structure for each individual element. For an experienced programmer,
the advantages might be obvious—in particular,
when working with parallelized codes—but modifying the model requires modifying the source
code, which necessitates a good knowledge of the
program code.
An extra software layer is necessary to crack the
link between the model and the numerics. This is
the layer where all the model formulation—all the
mathematics—happens. As part of developing the
ACcESS facility, we’re creating a programming environment called Escript to implement this idea.9–11
Initially, the Escript language will provide researchers with high-level access to models and parallel software based on PDEs, which are the core
of most relevant models in earth sciences. The language will incorporate a prototype particle scripting language used to drive the lattice solid particle
simulation model and let researchers simulate hybrid particle–continuum systems.
We can describe the functionality of Escript’s
mathematics layer with a simple example: the time-dependent temperature diffusion that is essentially
controlled by density, permeability, and the heat
source. The model is a differential equation containing spatial and time derivatives. By applying a
time-integration scheme, we end up with a linear
spatial differential equation to be solved at each
time step. The coefficients are defined by expressions of the model input parameters’ density and
permeability, the heat source, and the last time
step’s temperature. As far as the model description
is concerned, we can solve the problem without describing the representation of temperature on the
computer or how we discretize the differential
equations. These aspects of the simulation are hidden from researchers so they can focus on the
model and the problem’s basic physics rather than
computer science and software issues. The differential equation solvers plugged into Escript determine these issues without involving the user.
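The workflow just described (a time-integration scheme turning the diffusion equation into a linear spatial problem solved once per step) can be sketched in plain NumPy. This is an illustrative 1D finite difference sketch of the pattern that Escript automates behind the scenes; it is not the Escript API itself.

```python
import numpy as np

def diffusion_step(T, k, q, dx, dt):
    """One backward-Euler step of 1D heat diffusion.

    Time discretization turns dT/dt = k * d2T/dx2 + q into the
    linear spatial problem (I - dt*k*L) T_new = T_old + dt*q,
    solved once per time step.
    """
    n = len(T)
    # Second-difference (Laplacian) matrix; end rows stay zero,
    # which makes them identity rows in A (fixed-temperature ends).
    L = np.zeros((n, n))
    for i in range(1, n - 1):
        L[i, i - 1] = 1.0
        L[i, i] = -2.0
        L[i, i + 1] = 1.0
    L /= dx * dx
    A = np.eye(n) - dt * k * L
    b = T + dt * q
    b[0], b[-1] = T[0], T[-1]  # hold boundary values fixed
    return np.linalg.solve(A, b)

# Hot spot in the middle of a cold bar, no internal heat source
T = np.zeros(21)
T[10] = 1.0
for _ in range(50):
    T = diffusion_step(T, k=1.0, q=np.zeros(21), dx=0.05, dt=0.001)
```

After the loop, the initial spike has diffused symmetrically into its neighbors; in an Escript script, only the line defining the coefficients would change when the model changes, while the solve stays hidden.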
Escript is an extension of the Python object-oriented language, which is easy to use even for people with no programming background. In addition, Python is lenient regarding class definitions, so we can change the object types used to define coefficients of differential equations without code modifications. Researchers can change model parameters such as the heat source in the diffusion script from a Python floating-point number to a more general Escript data object. The values, which are the result of another differential equation, can be read from an external data file. Behind the scenes, Escript resolves the hand-over of the coefficient to the differential equation solver, regardless of the representation chosen when the coefficient was created.

Figure 4. LSM simulation results exhibiting features consistent with the Critical-Point Hypothesis for earthquakes. (a) Cumulative Benioff strain in a lattice solid elastodynamic simulation of a granular region subjected to a constant normal stress and sheared at a constant rate,6 fit by ε(t) = A + B(t_j − t)^c [1 + D cos(2π log(t_j − t)/log λ + ψ)], where t_j is the failure time; the fitted exponent is c = 0.39. (b) Cumulative frequency-magnitude distribution: the number of events larger than a given size for the first (blue) and last half (red) of the simulated earthquake sequence in (a), excluding the large events terminating the sequence; the fitted b-values are 0.9 and 0.65, respectively. The distributions are power laws (linear on the log-log plot), at least for moderate to larger event sizes. The rollover for the smaller event sizes is believed to be a finite model size effect.
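This kind of coefficient hand-over rests on Python's duck typing. A plain-Python sketch, independent of the actual Escript classes: a hypothetical solver accepts either a float (uniform heating) or an array of nodal values for the heat source, with no change to the calling code.

```python
import numpy as np

def solve_steady_state(source, n=11):
    """Solve -T'' = q on [0, 1] with T(0) = T(1) = 0.

    `source` may be a plain float or an array of nodal values;
    broadcasting resolves the representation, much as Escript
    resolves the coefficient hand-over to its solvers.
    """
    dx = 1.0 / (n - 1)
    q = np.zeros(n) + source          # works for float or array alike
    A = np.zeros((n, n))
    for i in range(1, n - 1):
        A[i, i - 1], A[i, i], A[i, i + 1] = -1.0, 2.0, -1.0
    A /= dx * dx
    A[0, 0] = A[-1, -1] = 1.0         # fixed-temperature ends
    q[0] = q[-1] = 0.0
    return np.linalg.solve(A, q)

T_uniform = solve_steady_state(1.0)                      # scalar coefficient
T_field = solve_steady_state(np.linspace(0.0, 2.0, 11))  # field coefficient
```

For the uniform source the discrete solution reproduces the exact parabola T(x) = x(1 − x)/2; swapping in a spatially varying source requires no change to the solver.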
By hiding the parallel software and data representation issues from users, Escript lets us develop
highly portable simulation codes; this portability
not only refers to the computer platform, but also
to the discretization techniques and implementations being used to solve the differential equations.
We’ve successfully applied scripts to mantle-convection simulation using both a finite element discretization and a finite difference discretization scheme and both the OpenMP and message-passing interface (MPI) parallel environments.
We’re currently extending this abstraction from the numerical layer to software components for time-series analysis and visualization.
Escript ensures a high level of abstraction from numerical software components, letting researchers apply Grid technology to simulations. We can translate the scripts into workflows distributed across the Grid. The Escript programming environment is thus an important cornerstone for developing data Grids for efficient and easy-to-use simulation.
As a pilot project for iSERVO, Australia and the
US collaborated to develop a prototype Web portal using Escript to drive simulation codes from the
two countries. This collaboration involved the
ACcESS MNRF and the University of Queensland’s Earth Systems Science Computational Centre, the Community Grid Labs at Indiana University, and NASA’s Jet Propulsion Laboratory. In this
work, researchers at Indiana’s Community Grid Labs and the University of Queensland’s Earth Systems Science Computational Centre developed a prototype Web portal in which an Escript window defined a problem, which was run on a supercomputer at a specified location through either a US finite element code developed by JPL or an Australian finite element code developed by ACcESS and the Earth Systems Science Computational Centre. Researchers then visualized the simulation results in another window. This kind of Web-based simulation environment, using scripting to define problems and parallel simulation software contributed by different groups, minimizes duplication of effort and
makes simulation more accessible and effective as a research tool for the community.

Figure 5. Stress field in a granular model representing a fault zone or system (top) and stress correlation function (bottom). The lower plot shows a growth in correlation lengths in the lead-up to the catastrophic failure at model time 66 and a sharp drop in correlation lengths when the large event occurred. Note that the hot and cold colors depict high and low values, respectively, and time equals 66 here is equivalent to time equals 700 in Figure 4a.
Microscale Simulation of Fault
Mechanics and Earthquake Physics
The LSM3,4,6,7 is a particle-based model for simulating discontinuous systems such as heterogeneous
rock, fault zones, fault gouge, and granular systems.
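A minimal sketch of the bonded-particle idea underlying such lattice solid models: particles joined by elastic bonds that break irreversibly beyond a strain threshold. The 1D geometry, bond law, and breaking criterion here are illustrative only, not the LSM's actual formulation.

```python
import numpy as np

def pull_chain(n=10, break_strain=0.1, pull_rate=0.005, steps=400):
    """1D chain of particles joined by breakable elastic bonds.

    The right end is driven at a constant rate; interior particles
    relax quasistatically. A bond breaks (irreversibly) when its
    strain exceeds `break_strain`; breakages are the "events."
    """
    x = np.arange(n, dtype=float)        # rest spacing = 1
    intact = np.ones(n - 1, dtype=bool)  # bond between i and i+1
    events = []                          # (step, bond index) of breakages
    for step in range(steps):
        x[-1] += pull_rate               # drive the right end
        # quasistatic relaxation: sweep interior particles toward
        # local equilibrium, ignoring broken bonds
        for _ in range(50):
            for i in range(1, n - 1):
                left = x[i - 1] + 1.0 if intact[i - 1] else x[i]
                right = x[i + 1] - 1.0 if intact[i] else x[i]
                x[i] = 0.5 * (left + right)
        strain = np.diff(x) - 1.0
        broke = intact & (strain > break_strain)
        for j in np.where(broke)[0]:
            events.append((step, j))
        intact &= ~broke
    return events

events = pull_chain()
```

Strain accumulates until a bond fails, after which the two fragments relax; in the LSM the same bookkeeping happens in 2D or 3D with full elastodynamics, friction, and heat.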
Figure 6. Phase-space plot. The fit probability of a power-law accelerating energy release prior to large events for short-range cellular automaton (CA) models is a function of the dissipation factor and stress transfer ratio. Two regimes are identified: A (forecastable, high fit probability) and D (not forecastable, low fit probability). A transitional regime (borderline forecastable, moderate fit probability) separates the two. The frequency-magnitude statistics indicate an overabundance of large events in regime A, power-law statistics in regime B (the transitional regime), and a deficit of large events in regime D.
Researchers have used it to simulate the nonlinear
dynamics of earthquakes,4,6,12 localization phenomena in fault zones,5 heat flow around faults,5
thermomechanical and thermoporous coupling effects,3,6 the micromechanics of fault gouge layers,13
and earthquake forecasting.6,14–16 A parallel software system implementing this model is being
developed by ACcESS staff within the ACcESS
software system and will be accessible to researchers
through the Escript scripting system and GUIs,
with a beta software release scheduled in 2006. Recent research using the LSM has shown evidence
for an underlying physical mechanism for earthquake forecasting compatible with the Critical-Point Hypothesis for Earthquakes16—namely, the
hypothesis that the Earth’s crust is not perpetually
in a critical state such that a large earthquake may
occur (see Figures 4 and 5). Instead, the Earth’s
crust undergoes cycles in which smaller earthquakes
build up stress correlations and cause the crust to
approach a critical state when a large event can occur. When this happens, the crust is perturbed away
from criticality and the cycle repeats.
Figure 4 shows that lattice solid particle models
of granular layers subjected to shearing frequently
exhibit accelerating energy release and an increased
rate of moderate to large earthquakes prior to large
simulated earthquakes. Also, analysis of the stress
evolution in the granular region indicates that the
correlation length of the deviatoric stress grows in
the lead-up to the large events that terminate the
sequence shown in Figure 5. These results are consistent with the Critical-Point Hypothesis for
Earthquakes and provide simulation-based evidence for a physical mechanism for forecasting catastrophic failure of discontinuous elastodynamic
systems and, hence, for forecasting earthquakes.
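Cumulative Benioff strain, the quantity plotted in Figure 4a, is simply the running sum of the square roots of event energies. A short sketch on a synthetic accelerating catalog (the catalog is fabricated for illustration and is not simulation output):

```python
import numpy as np

def cumulative_benioff_strain(energies):
    """Cumulative Benioff strain: running sum of sqrt(event energy)."""
    return np.cumsum(np.sqrt(np.asarray(energies, dtype=float)))

# Synthetic accelerating sequence: event energies blow up toward the
# failure time t = 1, mimicking accelerating energy release.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.99, 200)
energies = rng.random(200) * (1.0 - t) ** -1.5
strain = cumulative_benioff_strain(energies)

# Acceleration shows up as the last quarter of the sequence
# contributing far more Benioff strain than the first quarter.
first = strain[49]
last = strain[-1] - strain[149]
```

On real or simulated catalogs, the same curve is what gets fit by the power-law-plus-log-periodic form shown in the Figure 4 caption.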
The particle simulations are computationally expensive, so it’s useful to compare the LSM results
to those obtained via the cellular automaton (CA)
models that represent simplified analogs for earthquake physics. These CA models let us conduct
many simulations with different parameters representing variables such as frictional losses. This kind
of parameter space study is called a phase-space study
and is used to explore the nature of the nonlinear
dynamics of the simplified fault system expressed
in the CA model. We can then compare the CA results to the more sophisticated LSM results to gain
insights into the nature of the more realistic
model’s nonlinear dynamics and thus the real
Earth. A CA simulation module is being developed
by ACcESS staff within the ACcESS software system that includes the ability to submit many simulations to study phase space. A beta software release
will be available in 2006.
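A toy analog of such a phase-space study, in the spirit of the Olami-Feder-Christensen automaton rather than the actual ACcESS CA module, sweeps the stress-transfer ratio and records avalanche sizes:

```python
import numpy as np

def run_ca(alpha, n=16, steps=2000, seed=1):
    """Toy earthquake cellular automaton (Olami-Feder-Christensen style).

    alpha is the stress-transfer ratio: the fraction of a failing
    cell's stress passed to each of its 4 neighbors (alpha < 0.25
    is dissipative). Returns the list of avalanche sizes.
    """
    rng = np.random.default_rng(seed)
    stress = rng.random((n, n))
    sizes = []
    for _ in range(steps):
        # uniform loading until the most-stressed cell reaches 1.0
        stress += 1.0 - stress.max()
        size = 0
        while True:
            failing = np.argwhere(stress >= 1.0)
            if len(failing) == 0:
                break
            for i, j in failing:
                s = stress[i, j]
                stress[i, j] = 0.0
                size += 1
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < n and 0 <= nj < n:
                        stress[ni, nj] += alpha * s
        sizes.append(size)
    return sizes

# Phase-space sweep over the stress-transfer ratio
largest = {a: max(run_ca(a)) for a in (0.05, 0.15, 0.24)}
```

Sweeping `alpha` (and, in a fuller model, a dissipation factor) and summarizing each run is exactly the kind of batch phase-space study the ESyS-CA module is described as automating.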
CA models representing simplified analogs for
earthquake physics also exhibit the accelerated energy release and growth of correlation lengths prior
to large events.17,18 Phase-space studies have revealed two regimes of phase space (see Figure 6).
Simulations within the first regime exhibit an overabundance of large events relative to power-law
event statistics and accelerating energy release prior
to large events—that is, the larger simulated earthquakes are forecastable. Simulations within the second regime exhibit a deficit of large events relative
to power-law event statistics and no accelerating
energy prior to large events—that is, the larger
simulated earthquakes aren’t forecastable. Power-law or Gutenberg-Richter event statistics are only
seen near the transition between the forecastable
and unforecastable regimes of phase space.
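The b-values that characterize such event statistics can be estimated from a catalog with the standard Aki maximum-likelihood formula, b = log10(e) / (mean(M) − Mmin). A sketch on a synthetic Gutenberg-Richter catalog:

```python
import numpy as np

def b_value(magnitudes, m_min):
    """Aki maximum-likelihood estimate of the Gutenberg-Richter b-value.

    b = log10(e) / (mean(M) - m_min) for magnitudes M >= m_min.
    """
    m = np.asarray(magnitudes, dtype=float)
    m = m[m >= m_min]
    return np.log10(np.e) / (m.mean() - m_min)

# Synthetic catalog drawn from a Gutenberg-Richter law with b = 1.0:
# magnitudes above m_min are exponential with rate b * ln(10).
rng = np.random.default_rng(42)
mags = 3.0 + rng.exponential(scale=1.0 / np.log(10), size=5000)
b_est = b_value(mags, 3.0)   # close to 1.0 for this catalog
```

Because the exponential is memoryless, the estimate is stable under raising the completeness magnitude `m_min`, a useful sanity check on real catalogs.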
Recent results comparing CA and LSM models6,16–18 suggest that discontinuous elastodynamic
systems—and thus crustal fault systems—lie close
to the borderline separating a region of phase
space in which we might forecast earthquakes and
another region of phase space in which we can’t
forecast them (see Figure 6). This might explain
often contradictory observational evidence for
earthquake forecastability. If the Earth’s crust lies
near the transitional regime, slight variations in
crustal parameters might determine whether
earthquakes are forecastable in any given region.
Hence, it’s crucial to understand how the parameters of real crustal fault systems—viscosity,
fault friction, loading rate, fault density, fault
geometry, and so on—affect the system dynamics
to provide a sound scientific underpinning for
earthquake forecasting.
Macroscale Simulation
of Crustal Fault Systems
The particle and CA simulation results we’ve described so far play a vital role in improving the understanding of fault zones’ micromechanics and have
shown evidence for an underlying physical mechanism for earthquake forecasting. However, we still
might question whether the particle simulation results involving a granular layer are applicable to real
crustal fault systems. The particle simulations model
a discontinuous or granular system’s elastodynamics. When grains suddenly slide past one another,
elastic strain energy is released and converted to radiated seismic waves in the model. Such rupture
events represent simulated earthquakes in the
model, but a key difference between such ruptures
and the ruptures in an earthquake fault system is that
the model’s internal geometry changes as grains
move. However, in real fault systems, there is no significant geometric change to the fault system as a result of a single slip event. For this reason, an ability to simulate crustal fault systems more realistically is vital for studies of the physics of earthquakes and interacting fault systems.

Figure 7. The South Australian fault system model. (a) Image of the region of South Australia being discretized and (b) 3D fault model of the South Australian fault system.

Figure 8. A finite element discretization of the South Australian fault system. The solid lines depict the random triangular mesh used to discretize the region of South Australia being modeled.
To solve this problem, we’re developing simulation capabilities to model the dynamics of
crustal fault systems based on the finite element
method.19 In this approach, a crustal fault system
is discretized onto a finite element mesh with
faults represented as discontinuities between volumes of material. Figures 7 and 8 show an example of such a discretization for the South
Australian fault system,8 and Figure 9 shows a
snapshot of the equivalent stress-rate distribution
in a simulation. In these simulations, and others
modeling the San Andreas fault system in California,20 we’ve seen evidence for clustering of earthquake activity and stress transfer effects on
earthquake activity. This supports the need to
model the system dynamics as a whole to better
study the earthquake forecasting question.
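A schematic driver for such whole-system simulation, alternating quasistatic stress loading with a dynamic-rupture phase, might look as follows. The thresholds and the stress-drop rule here are placeholders for illustration, not the ACcESS constitutive model.

```python
def simulate(fault_stress, strengths, load_rate, t_end, dt_static=1.0):
    """Schematic driver: quasistatic loading until nucleation, then a
    (placeholder) dynamic-rupture phase, then back to quasistatic.
    """
    t, catalog = 0.0, []
    stress = list(fault_stress)
    while t < t_end:
        # quasistatic phase: uniform tectonic loading on all segments
        stress = [s + load_rate * dt_static for s in stress]
        t += dt_static
        nucleating = [i for i, s in enumerate(stress) if s >= strengths[i]]
        if nucleating:
            # dynamic phase (placeholder): rupture drops stress on the
            # nucleating segments and transfers part of it to the rest
            drop = 0.0
            for i in nucleating:
                drop += stress[i]
                stress[i] = 0.0
            for i in range(len(stress)):
                if i not in nucleating:
                    stress[i] += 0.1 * drop / len(stress)
            catalog.append((t, len(nucleating), drop))
    return catalog

events = simulate([0.0, 0.2, 0.4], strengths=[1.0, 1.0, 1.0],
                  load_rate=0.05, t_end=40.0)
```

Each catalog entry plays the role of a simulated earthquake (time, number of ruptured segments, total stress drop), from which observables such as event rates or frequency-size statistics can be compared with field data.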
We’re currently integrating this capability into the high-level Escript computational environment and developing visualization and analysis tools to probe the earthquake question. In our approach, we’ll model stress buildup using a quasistatic approach until an earthquake nucleates in the model, at which time we’ll switch over to a fully dynamic computational approach to simulate dynamic rupture and the associated stress redistribution. The analysis tools will provide earthquake hypocenters and magnitudes, as well as stress correlation functions, to let researchers study the physics of interacting fault systems and compare the simulated observables to field observations. Presently, a 2D prototype module has been developed within the Escript framework in which the scripting language specifies the PDEs. A fully 3D prototype module has also been developed that interfaces to the Escript framework but in which the PDEs themselves are not specified by the scripting language. These capabilities will be available in the 2006 beta software release.

Figure 9. A snapshot from a simulation of the South Australian fault system showing the equivalent stress-rate distribution. Hot colors represent positive values of equivalent stress rate (increasing stress), and cold colors represent negative values of equivalent stress rate (decreasing stress).

Our simulation framework will ultimately be set up to enable simulations of the elastodynamics of interacting fault systems, including heat transfer, thermal coupling effects, viscoelastic effects, and pore fluid flow induced by stress fluctuations. These capabilities—which we anticipate will be available in 2007—will provide a powerful tool to study complex crustal system behavior such as the physics of interacting fault systems and geothermal reservoirs. This tool provides a means to achieve breakthroughs in understanding, leading to an improved ability to forecast earthquakes, improved understanding of crustal evolution and mineralization processes, and exploitation of new green energy resources such as hot fractured rock geothermal energy.

Simulating Geological Processes and Mantle Convection

Figure 10. Snapshot of a simulation using the particle-in-cell method to model formation of thrust faults. A system consisting of a viscoplastic upper layer; a nonlinear, ductile middle layer; and a highly viscous lower layer was initially extended. Sediments were deposited in the fault-bounded basins, which developed spontaneously. On overall system compression, the basin-bounding faults were reactivated as thrusts.
Studying long-time-scale geological processes such
as mountain building, fault formation, and basin
evolution requires a different kind of computational
machinery that can simulate large deformations.
For this purpose, we’re developing computational
models based on the particle-in-cell method.21,22
Figures 10 and 11 show example 2D simulations to
illustrate this method’s capability.23,24
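The key idea of the particle-in-cell method, history-carrying material points advected through a grid that never distorts, can be sketched in one dimension. This is an illustrative sketch of the concept, not the ACcESS implementation.

```python
import numpy as np

def advect_particles(x, v_grid, grid_x, dt, steps):
    """Move material points through a fixed grid (particle-in-cell idea).

    Particles carry material history and advect with a velocity
    interpolated from the grid, so arbitrarily large deformation
    accumulates on the particles while the mesh never distorts.
    """
    x = np.asarray(x, dtype=float).copy()
    for _ in range(steps):
        v = np.interp(x, grid_x, v_grid)  # grid -> particle interpolation
        x += v * dt                        # advect the material points
    return x

grid_x = np.linspace(0.0, 1.0, 11)
v_grid = grid_x * (1.0 - grid_x)      # velocity vanishes at both walls
x0 = np.array([0.1, 0.5, 0.9])
x1 = advect_particles(x0, v_grid, grid_x, dt=0.1, steps=100)
```

In the full method, particle properties (material type, strain history) are mapped back to the grid each step to recompute the velocity solution; only that grid-to-particle-and-back exchange distinguishes it from pure Lagrangian tracking.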
Figure 10 depicts a snapshot of layers that were
first extended and then compressed, leading to the
formation of thrust faults; Figure 11 shows a snapshot of a salt dome in which a lighter salt lens
pushes upward and deforms overlying layers, resulting in fault displacements. Such simulations help
bridge the gap between long-lived mantle processes
and the time scale of fault interactions in the brittle
crust. One of the most fundamental questions in geodynamics is the manner in which strains derived
from relative plate motions are manifested in the
brittle upper part of the crust. We can examine how strain is partitioned between continuum deformation of the blocks between faults and slip on the faults themselves, in the context of the rheological layering of the lithosphere, as well as the feedback between surface deformation patterns and the response of deep mantle flow. Related to this topic is
the question of the relative deformation of crust and
mantle—when do we expect to see delamination
and the spontaneous development of 3D structure,
and what is the role of lateral strength variations in the crust/mantle lithosphere in producing 3D structure from a 2D driving force?24

COMPUTING IN SCIENCE & ENGINEERING
Deep earth processes such as mantle convection
are at the heart of understanding how the Earth
works, including the evolution of its topography.
However, these processes remain at best poorly
understood because of the substantial technical
challenges associated with modeling mantle
convection, plate tectonics, and the associated evolution of topography. Mantle convection is characterized by strongly variable (that is, stress-,
temperature-, and pressure-dependent) viscosities.
The lithosphere exhibits critical processes, such as fracture and shear-zone deformation (strain localization), that are physically distinct from the viscous flow deeper in the mantle and that occur on
fundamentally different (smaller) length scales. Researchers have barely touched the feedback of free
surface effects and the influence of topography. In
addition, the mantle is chemically heterogeneous,
is replete with silicate melts and volatiles, and has
numerous pressure- and temperature-induced
structural (phase) changes that affect its dynamics.
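The strongly variable viscosity can be made concrete with the standard power-law (dislocation) creep relation, eta_eff = (1/2) A^(-1/n) edot^((1-n)/n) exp((E + pV)/(nRT)), which couples stress (through strain rate), temperature, and pressure. The sketch below uses parameter values of the order commonly quoted for olivine, but they are illustrative only, not values taken from the ACcESS software.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def effective_viscosity(strain_rate, T, p, A=1e-16, n=3.5,
                        E=530e3, V=13e-6):
    """Effective viscosity (Pa s) for power-law dislocation creep:
    eta = 0.5 * A**(-1/n) * edot**((1-n)/n) * exp((E + p*V)/(n*R*T)).
    Default parameters are olivine-like but purely illustrative."""
    return (0.5 * A ** (-1.0 / n)
            * strain_rate ** ((1.0 - n) / n)
            * math.exp((E + p * V) / (n * R * T)))

# Viscosity contrast between cold lithosphere and hot upper mantle
# at the same strain rate and pressure.
eta_cold = effective_viscosity(1e-15, T=1000.0, p=1e9)
eta_hot = effective_viscosity(1e-15, T=1600.0, p=1e9)
```

With these numbers, cooling from 1,600 K to 1,000 K raises the effective viscosity by roughly three orders of magnitude, which is the kind of contrast mantle-convection codes must resolve.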
ACcESS is developing a suite of software that is
uniquely suited to tackle the computational simulation of the interplay of free surface effects (topography), plate tectonics, and deep earth processes
such as mantle convection and magmatism.23,25
Modeling of mantle convection also provides a
means to study mineralization processes.26 However, owing to the mathematical nature of the underlying models, the response of the simulations will in
many cases depend on the often poorly constrained
initial conditions. Hence, the mantle convection
models must be more closely integrated into the
real world through direct data assimilation and directly tested against various observations.
Global-scale simulation of mantle convection is
necessary for studying Earth’s evolution. Figure 12
shows a simulation using the Terra software.27–30
We’re integrating this simulation capability into the
ACcESS computational system to make global-scale
simulation more accessible to researchers. Ultimately, researchers can use tectonic reconstructions
(see Figure 13) as the surface motion constraints for
the global models. This kind of integration of geological observations to constrain past surface displacements is important because these constraints
influence internal flow patterns. The global-simulation capability provides a powerful means to answer
fundamental questions about the Earth’s evolution
and dynamics and to gain insights into plume and
mantle dynamics. This has both scientific significance and economic implications in terms of exploring for massive new mineralization deposits.
Figure 11. Snapshot of a simulation using the particle-in-cell method to
model the formation of a salt dome and resultant deformation and
faulting. In the simulation, a low-density lens representing a buried salt
layer underneath highly viscous rock layers with prescribed weak zones
(representing faults) shown in blue is pushed up into the overlying
layers, deforming the layers and displacing the faults.
Figure 12. Snapshot after 1.49 gigayears (Gyr) of a 4.49-Gyr mantle-convection simulation (3.00 Ga before present) using the Terra software.30 Pristine material is
white, and dark colors represent material that has been altered near the
surface—for example, by partial melting or degassing. The aim of this
kind of model is to reconcile geophysical and geochemical constraints
on the evolution of Earth’s mantle.
Solid earth systems simulation is now
becoming feasible from the microscopic to the global scale. The ACES
international cooperation has demonstrated the development of simulation capabilities for solid earth phenomena that are beyond the reach of any single group or country. Each country has different strengths, computational approaches,
and laboratory and field observational systems.
This range of numerical models is required to
References
1. A. Donnellan et al., Computational Earthquake Science, Part I,
Birkhäuser, 2004.
2. A. Donnellan et al., Computational Earthquake Science, Part II,
Birkhäuser, 2004.
3. S. Abe, P. Mora, and D. Place, “Extension of the Lattice Solid
Model to Incorporate Temperature Related Effects,” Pure and Applied Geophysics, vol. 157, nos. 11 and 12, 2000, pp. 1867–1887.
4. P. Mora and D. Place, “Simulation of the Frictional Stick-Slip Instability,” Pure and Applied Geophysics, vol. 143, nos. 1–3, 1994,
pp. 61–87.
5. P. Mora and D. Place, “The Weakness of Earthquake Faults,” Geophysical Research Letters, vol. 26, no. 1, 1999, pp. 123–126.
6. P. Mora et al., “Lattice Solid Simulation of the Physics of Earthquakes: The Model, Results and Directions,” GeoComplexity and
the Physics of Earthquakes, J.B. Rundle, D.L. Turcotte, and W.
Klein, eds., Am. Geophysical Union, 2000, pp. 105–125.
7. D. Place and P. Mora, “A Lattice Solid Model to Simulate the
Physics of Rocks and Earthquakes: Incorporation of Friction,” J.
Computational Physics, vol. 150, no. 2, 1999, pp. 1–41.
8. H.L. Xing and P. Mora, “Construction of an Intraplate Fault System Model of South Australia and Simulation Tool for the iSERVO
Institute Seed Project,” submitted to Pure and Applied Geophysics,
2005; www.aces-workshop-2004.ac.cn/html/ext.htm.
Figure 13. Snapshot (55 million years ago) from a plate-tectonic reconstruction using the PlatyPlus software developed by Gordon Lister of the Australian National University and his group at Monash University. Such geological-observation-based surface movements can help researchers constrain surface-boundary conditions in global mantle-convection simulations.
model the entire earth system, and calibration
is required to ensure that results obtained using
these models match the different laboratory and field observations available in each
country. For this reason, international groups
have agreed to work toward establishing iSERVO to build on the ACES cooperation. iSERVO aims to collaboratively develop a computational infrastructure—accessible through
Web portals—that combines models developed
across the international community and to conduct collaborative research to solve problems of
global significance such as earthquake forecasting, green energy development, and environmental management.
Acknowledgments
We gratefully acknowledge support from the Australian
Computational Earth Systems Simulator national
research facility, the Commonwealth of Australia MNRF
program, the Australian Research Council, the
Queensland State Government, the University of
Queensland, and SGI. The Queensland State Government Smart State Research Facility Fund and SGI funded the ACcESS supercomputer.
9. M. Davies, L. Gross, and H. Mühlhaus, “Scripting High Performance Earth Systems Simulations on the SGI Altix 3700,” Proc.
7th Int’l Conf. High-Performance and Grid Computing in the Asia
Pacific Region, IEEE CS Press, 2004, pp. 244–251.
10. M. Davies, H.B. Mühlhaus, and L. Gross, “The Rapid Development of High Performance Numerical Models in Mantle Convection,” Proc. 4th ACES Workshop, 2004; www.aces-workshop-2004.ac.cn/html/ext.htm.
11. L. Gross, M. Davies, and J. Gerschwitz, “A High-Level Programming Language for Modelling the Earth,” Proc. 4th ACES Workshop, 2004; www.aces-workshop-2004.ac.cn/html/ext.htm.
12. S. Abe, S. Latham, and P. Mora, “Simulation of the Dynamic Rupture of a Rough Planar Fault in 3D using the Lattice Solid Model,”
Proc. 4th ACES Workshop, 2004; www.aces-workshop-2004.ac.cn/html/ext.htm.
13. S. Abe and K. Mair, “Grain Fracture in 3D Numerical Simulations
of Granular Shear,” Geophysical Research Letters, vol. 32, no. 5,
article no. L05305, 2005.
14. P. Mora et al., “Simulation of the Load-Unload Response Ratio
and Critical Sensitivity in the Lattice Solid Model,” Pure and Applied Geophysics, vol. 159, no. 10, 2002, pp. 2525–2536.
15. Y.C. Wang et al., “Statistical Tests of Load-Unload Response Ratio (LURR) Signals Using the Lattice Solid Model (LSM): Implication to Tidal Triggering and Earthquake Prediction,” Pure and
Applied Geophysics, vol. 161, nos. 9 and 10, 2004, pp.
1829–1839.
16. P. Mora and D. Place, “Stress Correlation Function Evolution in
Lattice Solid Elasto-Dynamic Models of Shear and Fracture Zones
and Earthquake Prediction,” Pure and Applied Geophysics, vol.
159, no. 10, 2002, pp. 2413–2427.
17. D. Weatherley, P. Mora, and M. Xia, “Long-Range Automaton
Models of Earthquakes: Power Law Accelerations, Correlation
Evolution and Mode Switching,” Pure and Applied Geophysics, vol.
159, no. 10, 2002, pp. 2469–2490.
18. D. Weatherley and P. Mora, “Accelerating Precursory Activity
within a Class of Earthquake Analogue Automata,” Pure and Applied Geophysics, vol. 161, nos. 9 and 10, 2004, pp. 2005–2019.
19. H.L. Xing, P. Mora, and A. Makinouchi, “Finite Element Analysis of Fault Bend Influence on Stick-Slip Instability along an Intra-Plate Fault,” Pure and Applied Geophysics, vol. 161, nos. 9 and 10, 2004, pp. 2091–2102.
20. H.L. Xing and P. Mora, “Finite Element Modeling of Interacting
Fault Systems,” Proc. 4th ACES Workshop Abstracts, 2004, pp.
112–114; www.aces-workshop-2004.ac.cn/html/ext.htm.
21. L. Moresi, F. Dufour, and H.B. Mühlhaus, “A Lagrangian Integration Point Finite Element Method for Large Deformation
Modelling of Viscoelastic Geomaterials,” J. Computational Physics,
vol. 184, no. 2, 2003, pp. 476–497.
22. D. Sulsky, Z. Chen, and H.L. Schreyer, “A Particle Method for History-Dependent Materials,” Computer Methods in Applied Mechanics and Eng., vol. 118, nos. 1 and 2, 1994, pp. 179–196.
23. L. Moresi, F. Dufour, and H.B. Mühlhaus, “Mantle Convection
Modelling with Viscoelastic/Brittle Lithosphere: Numerical
Methodology and Plate Tectonic Modeling,” Pure and Applied
Geophysics, vol. 159, no. 10, 2002, pp. 2335–2356.
24. H.B. Mühlhaus et al., “Large Amplitude Folding in Finely Layered
Viscoelastic Rock Structures,” Pure and Applied Geophysics, vol.
159, no. 10, 2002, pp. 2311–2333.
25. H.B. Mühlhaus, L. Moresi, and M. Cada, “Emergent Anisotropy
and Flow Alignment in Viscous Rock,” Pure and Applied Geophysics, vol. 161, nos. 11 and 12, 2004, pp. 2451–2463.
26. C. Zhao et al., “Finite Element Modelling of Three-Dimensional
Steady State Convection and Lead/Zinc Mineralisation in Fluid
Saturated Rocks,” J. Computational Methods Science Eng., vol. 3,
no. 1, 2003, pp. 73–89.
27. J.R. Baumgardner, “Application of Supercomputers to 3-D Mantle Convection,” The Physics of the Planets, S.K. Runcorn, ed.,
John Wiley & Sons, 1988, pp. 199–231.
28. J.R. Baumgardner, “Three-Dimensional Treatment of Convective
Flow in the Earth’s Mantle,” J. Statistical Physics, vol. 39, nos. 5
and 6, 1985, pp. 501–511.
29. H.-P. Bunge and J.R. Baumgardner, “Mantle Convection Modeling on Parallel Virtual Machines,” Computers in Physics, vol. 9, no.
2, 1995, pp. 207–215.
30. K.-D. Gottschaldt, Vermischung in 3D sphärischen Konvektionsmodellen des Erdmantels [Mixing in 3D Spherical Models of the
Earth Mantle], doctoral dissertation, Chemisch-Geowissenschaftliche Fakultät, Friedrich-Schiller-Universität, 2003.
Peter Mora is a professor at the Earth Systems Science
Computational Centre at the University of Queensland,
Australia. His technical interests include earthquake
physics, computational earthquake science, and computational geophysics. Mora has a PhD in geophysics
from Stanford University. He is a member of the American Geophysical Union, the American Association for
the Advancement of Science, and the Society of Exploration Geophysicists. Contact him at director@
esscc.uq.edu.au.
Hans Mühlhaus is a professor at the Earth Systems
Science Computational Centre at the University of
Queensland, Australia, and an adjunct professor at the
University of Western Australia. His technical interests
include computational geodynamics, generalized continuum theories, and rheologies. Mühlhaus has a PhD
in civil engineering from the University of Karlsruhe. He
is a member of the Japanese Geotechnical Society for
Soils and Foundations and the Australian Academy of
Science Theoretical and Applied Mechanics Group.
Contact him at muhlhaus@esscc.uq.edu.au.
Lutz Gross is an associate professor at the Earth Systems Science Computational Centre at the University
of Queensland, Australia. His technical interests include
numerical methods for partial differential equations,
scientific software, and high-performance computing.
Gross has a PhD in mathematics from the University of
Karlsruhe. He is a member of the Australian Mathematical Society. Contact him at l.gross@uq.edu.au.
Louis Moresi is an associate professor in the School of
Mathematical Sciences at Monash University, Australia.
His technical interests include planetary dynamics,
plate tectonics, computational geophysics, and particle-based finite element methods. Moresi has a DPhil
in geophysics from Oxford University. He is a member
of the American Geophysical Union, the Royal Astronomical Society, and the Geological Society of Australia. Contact him at louis.moresi@sci.monash.edu.
Huilin Xing is a senior research fellow at the Earth Systems Science Computational Centre at the University
of Queensland, Australia. His technical interests include
computational mechanics and crustal dynamics, material science, and engineering. Xing has a PhD in
material science and engineering. He is a member of
the International Association of Computational Mechanics. Contact him at xing@esscc.uq.edu.au.
Dion Weatherley is a computational scientist at the
Earth Systems Science Computational Centre at the
University of Queensland, Australia. His technical interests include earthquake physics and computational
geophysics. Weatherley has a PhD in geophysics from
the University of Queensland. He is a member of the
American Geophysical Union. Contact him at dion@
esscc.uq.edu.au.
Steffen Abe is a computational scientist at the Australian Computational Earth Systems Simulator. His
technical interests include computational geophysics,
discrete-element modeling, fault micromechanics, and
earthquake rupture processes. Abe has a PhD in geophysics from the University of Queensland. Contact
him at steffen@esscc.uq.edu.au.
Shane Latham is a computational scientist at the Australian Computational Earth Systems Simulator. His
technical interests include parallel computation, scientific software development, and discrete-element
modeling. Latham has a PhD in mathematics from the
Australian National University. Contact him at
slatham@access.edu.au.