INFORMATION TECHNOLOGY APPLIED TO ENGINEERING
GEOLOGY
Niek Rengers1, Robert Hack1, Marco Huisman1, Siefko Slob1
and Wolter Zigterman1
ABSTRACT: This keynote paper describes the application of Information Technology to various aspects of Engineering Geology, from data requirements, data handling and processing to numerical and GIS modelling and visualization. It is illustrated with a number of practical examples developed in The Netherlands.
RESUME: This article describes the application of information technology to various aspects of engineering geology, from data requirements, data handling and data processing, to numerical modelling and GIS modelling, as well as the presentation of the results. The article is illustrated with a number of practical examples developed in The Netherlands.
INTRODUCTION
In this keynote the following definitions are used for the key terminology:
Information Technology refers to methods of handling information by automatic means, including
computing, telecommunications, and office systems. It deals with a wide range of mostly electronic devices
for collecting, storing, manipulating, communicating and displaying information.
Knowledge is pragmatically defined in the context of IT as “justified true belief; as a shared human
response to a complex environment”.
Information can be defined as something that adds to one’s knowledge and understanding, and includes
data (known facts) and their interpretation.
The book of Loudon (Loudon, 2000), from which these definitions were taken, is a good introduction to the topic of IT in the Geosciences. Figure 1 describes the relation between the elements in information technology.
IT offers opportunities for efficient handling of engineering geological information. The two most significant aspects of IT for technical geosciences are (Loudon, 2000):
• The obvious ability of computers to calculate, thus opening up possibilities of more rigorous analysis with quantitative and statistical methods, and more vivid graphical presentation with visualization techniques.
• The manipulation and management of information; this ranges from the ability to move elements of a picture in a graphics system to the ability to capture data, store vast quantities of information in a database management system, and retrieve specific items on demand.
IT influences the way in which engineering geologists investigate the real world, how they communicate,
what they know and how they think. This can lead to better science, cost savings, increased speed and
flexibility in data collection, storage, retrieval, analysis, communication, and presentation.
With the spectacular increase in calculation power brought by the development of computers in the 1960's and 1970's, engineering geologists and geotechnical engineers started to develop digital methodologies to support the effective and efficient execution of their work. Initially this development focused on applying the enhanced data handling capacity and calculation speed to introduce more detail into conventional methods of slope stability calculation and into the calculation of the behaviour of soil and rock around tunnel openings and in foundations and earthworks, in order to obtain more refined results.
1 International Institute for Geoinformation Science and Earth Observation (ITC), Division Engineering Geology, Kanaalweg 3, 2628 EB Delft, The Netherlands
Soon, however, in the seventies and eighties, the further growth in calculation speed and in the handling capacity of large data volumes (as well as the increased accessibility of calculation power in the form of PC's) enabled the development of new methods of geotechnical analysis (finite element, finite difference, distinct element, and particle flow methods), allowing for models of irregular forms in three dimensions, consisting of materials with more complicated stress-strain relationships (including strain softening and the like). The fourth dimension was introduced where progressive stages of excavation and/or loading, and/or time-dependent deformation processes introduced time as an important factor into the analysis.
Other important developments that became possible with the ever increasing calculation speed and data handling capacity at relatively low cost levels are in the field of database systems for the storage and retrieval of geotechnical surface and borehole data, and in the field of 2D, 3D and 4D geo-information systems and related visualization techniques.
Figure 1. Relation between data, information and knowledge (Kainz, 2002): data (acquisition) -> information (processing & mining) -> knowledge (reasoning) -> wisdom (contemplation), together labelled "Geo-Information processing & mining"
Working with IT does not only mean the transformation of data and information into a digital format and manipulating those data and information in enormous amounts in ever more complicated algorithms. The "digital revolution" has allowed for important steps forward using ways of thinking that existed in the analogue era, but if
we want to use the potential of IT to its full extent in Engineering Geology, we will also have to adapt our "way of thinking". In fact this means changing the methodological approaches that we were using to solve a problem. Traditionally we start with a vague idea on how to solve a problem. During the process of thinking we enlarge and couple this vague idea to the knowledge available in our brains and to knowledge from external sources, and link the data and processes we think to be important to the 'idea'. We generally do not know how we select data and processes from our brains, and on what basis we link data and processes. To be able to use IT to the full extent we will have to set out the methodology for the analysis of our problems in advance and in detail. We cannot use the same methodology as we use in our brains, because we do not know how that works. Hence, we have to re-define the methodology for the 'IT thinking process'.
In the context of this keynote lecture it is not possible to describe the stage of development of IT
applications in Engineering Geology worldwide. For this reason this paper is restricted to a general
discussion on the developments in a number of the most important fields and is illustrated with examples of
work done in those application fields in research institutes, universities and industry in The Netherlands. This
choice is not based on the assumption that in this country the developments are ahead of those in other
countries, but it is meant to illustrate that even in a small country many developments are taking place and
that IT has been used extensively in various aspects of engineering geology. We are convinced that it would
equally well be possible to present examples of developments of equal or greater interest in many other
countries in the world.
The following aspects will be treated in this keynote:
• Standardisation of engineering geological input data for use in IT
• The (2D-, 3D-, 4D-) spatial model describing the engineering geological environment
• Databases for large engineering geological data sets
• Geotechnical (numerical) modelling
• Spatial data integration and 2D modelling with GIS
• Visualisation techniques
• Public data exchange, data quality assurance and data accessibility
• Access to newly developed knowledge and technology
STANDARDISATION OF ENGINEERING GEOLOGICAL INPUT DATA FOR USE IN IT
The digital handling of engineering geological data leads to the great advantage that vast amounts of such
data can be combined and analysed, as well as imported into and exported from databases and/or large
(virtual) spatial models in fractions of seconds. In the pre-digital era this was not possible at all due to the
time consuming character of such operations.
The biggest potential of IT is the possibility to process and analyse large amounts of primary data, such as
in geophysical surveying, remote sensing and in cone penetration testing. Processes such as anti-aliasing,
improvement of signal to noise ratios and, generally, processing of large data sets would not have been
possible without the data handling capacity of IT.
Digital data handling, however, also forces us to adopt strict rules on the uniformity of data formats and to make an appropriate choice of the parameters to be described. The uniformity of the data formats is essential for the "interoperability" of data on different parameters and from different data sources. However, the use of large databases and the importation of datasets into programs for numerical analysis also require uniformity in the way the description of the engineering geological parameters is structured.
There exists a strong tendency for specialists, even within one organization, to work independently with newly emerging technologies such as IT, selecting their own computing tools and their own structure and format for storing data. If there is no effort towards standardisation, this may lead to users spending more time on transforming data than on solving geotechnical problems. Sharing data between organizations will add further complexity.
International associations such as IAEG, ISRM, and ISSMGE have a responsibility to contribute to worldwide standardisation of data formats and of description and classification procedures. Only the implementation of such standards will allow better links with software packages and data exchange through networks such as the Internet.
A cross-section of oil companies addressed this issue by creating the (non-profit) Petrotechnical Open
Software Corporation (POSC). "The standards and open systems environment to be facilitated by POSC
represent a maturing of the industry that frees companies from worrying about the integration of their
computer systems and lets them concentrate on areas of added value" (POSC, 1993).
This section will not focus on the aspect of data formats for the use of engineering geological data, but on
the role that IT can play, because of its strong data handling capacity, in the process of the collection and
processing of primary data:
• Digital analysis of SCPT data
• Digital description of borehole logs
• Digital field book for the collection of (engineering) geological data
• Digital outcrop mapping
Earth observation with Remote Sensing (RS) delivers large amounts of data on the earth surface.
Automatic production of high resolution Digital Elevation Models (DEM) and monitoring of the changes in
time of landform (pattern) and spectral reflection characteristics (related with vegetation and lithology) can
only be achieved through digital image processing with computers. This aspect of primary data collection,
although potentially very important for engineering geological applications, will not be treated here. For
references to specialized literature on these aspects of remote sensing and image processing please refer to
the ITC textbook on this topic (ITC, 2001).
Digital analysis of SCPT data
Cone penetration testing is very frequently used in The Netherlands to determine the vertical succession
of different layers of soft soil and to give data on a number of the parameters of these layers. Huijzer (1992)
worked first on the automatic interpretation of cone penetration tests, but a good automatic recognition of
layers based exclusively on cone resistance turned out to be impossible. In many cases non-existing layers
appeared, where an experienced engineer would have drawn a gradual transition between two really different
layers.
A later study by Coerts (1996) started the interpretation from a pre-defined number of geotechnical units.
This number was established from boreholes, or it was based on profiles published with the geological maps.
This method of working made it possible to follow fairly accurately the boundaries between the different
layers at locations between the boreholes.
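As an illustration of how such a unit-based interpretation could be automated, the sketch below clusters a cone-resistance log into a pre-defined number of geotechnical units and then smooths the result along depth so that thin spurious "layers" disappear. The clustering and filtering choices are assumptions made here for illustration only; they do not reproduce the method of Coerts (1996).

```python
# Illustrative sketch: assign each CPT reading to one of n_units pre-defined units by
# clustering log(cone resistance), then smooth the labels along depth.  n_units is assumed
# to come from nearby boreholes or published geological profiles.
import numpy as np
from sklearn.cluster import KMeans

def classify_cpt(qc_mpa, n_units):
    """Return a unit label (0 = softest ... n_units-1 = stiffest) for every CPT reading."""
    x = np.log10(np.clip(qc_mpa, 0.01, None)).reshape(-1, 1)       # work on log(qc)
    labels = KMeans(n_clusters=n_units, n_init=10, random_state=0).fit_predict(x)
    # relabel clusters so that the softest material gets label 0
    order = np.argsort([x[labels == k].mean() for k in range(n_units)])
    labels = np.argsort(order)[labels]
    # simple majority filter along depth suppresses non-existing thin layers
    half = 5
    smoothed = labels.copy()
    for i in range(len(labels)):
        window = labels[max(0, i - half):i + half + 1]
        smoothed[i] = np.bincount(window).argmax()
    return smoothed

# Example with synthetic data: three layers with measurement scatter
depth = np.arange(0.0, 20.0, 0.02)
qc = np.where(depth < 8, 0.5, np.where(depth < 14, 2.0, 15.0))
qc *= np.random.lognormal(0.0, 0.15, depth.size)
units = classify_cpt(qc, n_units=3)
print(units[:5], units[-5:])
```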
Digital description of borehole logs
For input in databases and GIS, digital borehole data are needed. The Netherlands Research Center for Civil Engineering (CUR) Committee E18 produced the GEF-BORE Report in 2002 (CUR, 2002). The
standard digital borehole description is designed to be connectable to existing software in a large variety of
organisations via a simple interface. The data that are included are of course layer depth, and a detailed soil
classification, according to NEN 5104, the standard for geotechnical soil description used in The
Netherlands.
Additional information will be stored, like the method of execution of the borehole, contamination data, information about piezometers that were placed, details about the method of backfilling, (weather) conditions during the execution, the name of the drillmaster and/or the laboratory technician who did the soil classification, and the moisture condition during the classification.
The system is structured according to the Parent/Child principle. The Child files contain the secondary data, which are not important for all users. The design is such that in the future also the data collected at the borehole later, during the execution stage of the work, can be stored in GEF format. This can be the case for early measurements of settlements, which can be valuable for the planning of future maintenance work on embankments, or on housing areas on very soft soil where the ground level has to be raised by filling.
Figure 2. Automated field data acquisition
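The Parent/Child structure described above can be illustrated with a minimal sketch. The class and field names below are assumptions made for illustration only; the actual keyword set and file layout are defined in the GEF-BORE Report (CUR, 2002).

```python
# Illustrative sketch of the Parent/Child idea: primary layer data in the parent record,
# secondary data (piezometers, backfilling, settlement readings, ...) in child records.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SoilLayer:
    top_m: float            # depth of layer top below surface
    bottom_m: float         # depth of layer base below surface
    nen5104_code: str       # soil classification according to NEN 5104 (example codes)

@dataclass
class ChildRecord:
    kind: str               # e.g. "piezometer", "backfill", "settlement"
    data: dict              # secondary data, not relevant to every user

@dataclass
class BoreholeParent:
    borehole_id: str
    x: float                # coordinates (here assumed in the Dutch RD system)
    y: float
    layers: List[SoilLayer] = field(default_factory=list)
    children: List[ChildRecord] = field(default_factory=list)

bh = BoreholeParent("B001", 121000.0, 487000.0,
                    layers=[SoilLayer(0.0, 3.5, "Kz1"), SoilLayer(3.5, 8.0, "Vk1")],
                    children=[ChildRecord("piezometer", {"depth_m": 6.0})])
print(bh.borehole_id, len(bh.layers), "layers,", len(bh.children), "child record(s)")
```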
Digital field book for the collection of (engineering) geological data
In many engineering geological investigations an important part of the data on the earth's surface is collected during field visits. Traditionally the data acquired during such field visits are noted down in field books and sometimes photographs are made to support the observations. Only a fraction of all field observations is actually captured as data in field books, and it is usually very difficult to reconstruct where exactly the observations were made. Thus real data that are fit to be entered into a database are very often lost forever after a project is finished.
At the ITC since 1974 concepts of field data capture methodologies have been developed and
implemented in geological field mapping exercises with students. These exercises started with the application of Fieldlog, developed at the Ontario Geological Survey in Canada (Brodaric and Fyon, 1989), on laptop computers. In 2001 personal digital assistants (PDA's) were introduced (Woldai and Schetselaar, 2002), for which palmtop computers were equipped with RDBMS (Relational Database Management Systems) software for direct data acquisition in the field. Incorporation of a Global Positioning System (GPS) enables automatic reception of the field coordinates and storage of this information in the digital field book as well.
The acquired data are daily synchronized to laptop GIS systems at the base camp (Figure 2). As the
database is populated at the source, and intimately used during the field mapping, the digital data represents
the most accurate data repository of the field survey and thus is ideal for integration into larger corporate GIS
databases, without requiring duplicate data entry or error-prone data transcription sessions in the office.
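A minimal sketch of such a geo-referenced field observation record is given below. The table layout and field names are assumed purely for illustration; they do not reproduce the database design used at ITC.

```python
# Illustrative sketch: store a geo-referenced field observation in a local SQLite database
# on the field computer, ready for daily synchronisation with the base-camp GIS database.
import sqlite3

conn = sqlite3.connect("fieldbook.db")
conn.execute("""CREATE TABLE IF NOT EXISTS observation (
                  obs_id     INTEGER PRIMARY KEY,
                  utc_time   TEXT,
                  lat        REAL,      -- from the GPS receiver
                  lon        REAL,
                  lithology  TEXT,
                  remarks    TEXT)""")
conn.execute("INSERT INTO observation (utc_time, lat, lon, lithology, remarks) "
             "VALUES (?, ?, ?, ?, ?)",
             ("2002-04-15T10:32:00Z", 41.145, 0.820,
              "dolomite", "jointed, slightly weathered"))
conn.commit()
conn.close()
```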
Obviously, if applied to engineering geological field studies, the database for the data to be acquired in
the field has to be specially designed for this purpose. An example of a specific engineering geological
database is given in part section 4.2.
Three-dimensional digital outcrop mapping
Traditionally samples as well as outcrops of soil and rock have been photographed to give a visual
impression of the site conditions in engineering geological reports and archives. Terrestrial stereophotography has been used to record the exact form of outcrops in a visual 3D model. Measurements of
location coordinates of points in the outcrops, but also of orientations of planes can be extracted from the
stereo-photo model using photogrammetrical methods.
The latest method in this field, that has become available through the increased data handling capacity of
computers, is the use of lidar. This is a laser-based remote sensing technique that is applied in air-borne as
well as in ground-based surveys. With ground-based laser scanning, the geometry of virtually any object can
be measured in great detail and accuracy. The location of each point on the surface of the object in 3D space
is calculated by determining the “time of flight” of the reflected laser beam, which is proportional to the
distance from the scanner. Combined with the directional parameters of the scanner (azimuth and angle) this
will give the location relative to the scanner’s position.
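The sketch below illustrates this geometric principle. The function and symbol names are our own and do not correspond to the software of any particular scanner: the two-way travel time of the laser pulse gives the range, which together with the scanner's azimuth and vertical angle fixes the point in 3D relative to the scanner position.

```python
# Illustrative sketch of the time-of-flight principle described above.
import numpy as np

C = 299_792_458.0  # speed of light in m/s

def point_from_pulse(t_two_way_s, azimuth_rad, elevation_rad):
    r = 0.5 * C * t_two_way_s                        # one-way range from time of flight
    x = r * np.cos(elevation_rad) * np.sin(azimuth_rad)
    y = r * np.cos(elevation_rad) * np.cos(azimuth_rad)
    z = r * np.sin(elevation_rad)
    return np.array([x, y, z])

# e.g. a reflection after 200 ns, at 30 deg azimuth and 5 deg above the horizontal:
p = point_from_pulse(200e-9, np.radians(30.0), np.radians(5.0))
print(p)   # a point roughly 30 m from the scanner
```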
Through the scanning of rock surfaces this technique generates high-resolution 3D geometrical
information of natural outcrops or excavations in rock. Depending on the distance from the scanner to the
object, scanning resolutions can be reached in the order of centimetres to millimetres. The subject of the
research currently being undertaken by ITC (Slob et al., 2002) is to obtain discontinuity information (joints,
bedding planes, fractures) from the point cloud data set, and to determine statistically the orientation of joint
sets. For this purpose, a laser scan data set of a rock outcrop is used where the discontinuity sets are clearly
recognisable (Figure 3).
Figure 3. Point cloud data set of a rock surface, generated with a laser scanner
As a 3D triangulated surface, the scanned rock face is represented by a large number of triangles. The orientation and surface area of each triangle is computed using simple geometrical rules. Subsequently, the orientations of all triangles are plotted in a stereo net, and contoured in terms of density. From the density plot the orientation of the different discontinuity sets can be identified without difficulty.
If this approach can be further developed and fully automated, this would give the site engineer or geologist, in real-time, evidence on the internal structure of a discontinuous rock mass. Particularly in areas where there is difficult access to rock exposures, where visibility is poor and/or measurements have to be done rapidly, application of this technique is very promising. Fields of application may include tunnelling, quarrying and mining.
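As an illustration of the "simple geometrical rules" mentioned above, the sketch below derives the dip direction, dip and area of a single triangular facet from its three corner points. It is our own formulation for illustration, not the implementation of Slob et al. (2002).

```python
# Illustrative sketch: orientation of one triangle of a meshed point cloud.
import numpy as np

def facet_orientation(p1, p2, p3):
    """Dip direction, dip (degrees) and area of the triangle through three 3D points."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    cr = np.cross(p2 - p1, p3 - p1)                  # normal vector (not yet unit length)
    area = 0.5 * np.linalg.norm(cr)
    n = cr / np.linalg.norm(cr)
    if n[2] < 0:                                     # make the normal point upward
        n = -n
    dip = np.degrees(np.arccos(np.clip(n[2], -1.0, 1.0)))        # angle with the vertical
    dip_dir = np.degrees(np.arctan2(n[0], n[1])) % 360.0         # azimuth of steepest descent
    return dip_dir, dip, area

# a facet dipping about 45 degrees towards the east:
print(facet_orientation([0, 0, 0], [0, 1, 0], [1, 0, -1]))
```

Repeating this for every triangle and contouring the resulting poles in a stereo net gives the density plot from which the discontinuity sets are identified.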
3. THE (2D-, 3D-, 4D-) SPATIAL MODEL DESCRIBING THE ENGINEERING GEOLOGICAL
ENVIRONMENT
What started already decades ago in the mining industry is now becoming everyday practice in larger
civil engineering projects. All information on the spatial model and on the geotechnical characteristics of the
ground layers below and around civil engineering works such as tunnels and deep excavations for
foundations is introduced into a 3D digital ground model (3DGM).
Recently developed digital ground models allow for a number of special data handling facilities such as
automatic definition of (sharp or fuzzy) boundaries, interactive input of true and simulated (virtual) data to
work out a variety of scenarios, and most importantly powerful visualization tools and possibilities to export
layer configurations and their geotechnical parameters into a geomechanical model.
However, it is good to be aware that the virtual model starts to play an important role by itself, and one of the major questions now is "how likely or how reliable" the model is. In the past, with a "hand-drawn" model, nobody asked that question because it was clear that it was unanswerable. From a digital model, however, an answer is expected. To be able to answer the question we touch upon very basic aspects: what are the data, why do we have the data, and why have they been acquired. All these aspects influence the accuracy of the data and their reliability, but will also influence how the data are to be linked and modelled. Such problems are far from being solved and will need much attention.
North-South tunnel project Amsterdam
A large urban infrastructure project such as the construction of a new metro line below the city of
Amsterdam, generates large amounts of data. Proper information management is one of the key elements for
the success of such a project. Witteveen+Bos Consulting Engineers has developed a 3DGM in a GIS (ArcInfo) for this project. The main engineering geological content of this 3D ground model is the three-dimensional layer model and the geotechnical parameters of each individual layer.
All the site investigation data can be easily stored and extracted through user-friendly interfaces. One of the most important site investigation tools in the Dutch soft ground is the Dutch Cone Penetration Test. These DCPT records are ideal to store in a digital data system (see above in section 2). The 3DGM automatically creates a three-dimensional layer model of the subsurface. From this 3D layer model a geotechnical profile can be drawn automatically along any given section. Apart from the visualisation of layers in the underground, the 3DGM is used for a number of other GIS-based analyses, calculations, and data handling operations:
• Contour maps of the thickness, top or base of layers
• Volume calculations, quantifying the amount of excavated ground, subdivided per soil type and excavation phase (the principle is sketched in the code example after this list)
• Vertical stability calculations of the deep excavation floor
• Analyses of the sensitivity of the available site investigation data or the necessity for additional
site investigation
• Export of automatically generated geotechnical profiles to geotechnical numerical model
calculations
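A minimal sketch of the volume-calculation principle is given below. The regular grid, cell size and soil-type coding are assumptions for illustration only and do not represent the actual Witteveen+Bos 3DGM data structure.

```python
# Illustrative sketch: with the subsurface stored as a regular 3D grid of cells labelled by
# soil type, the excavated volume per soil type for one excavation phase is a cell count.
import numpy as np

rng = np.random.default_rng(0)
dx = dy = dz = 0.5                                    # cell size in metres
soil = rng.integers(0, 3, size=(200, 120, 60))        # 0=sand, 1=clay, 2=peat (dummy model)
excavated = np.zeros_like(soil, dtype=bool)
excavated[60:140, 40:80, :30] = True                  # cells inside this excavation phase

cell_volume = dx * dy * dz
for soil_type, name in enumerate(("sand", "clay", "peat")):
    vol = np.count_nonzero(excavated & (soil == soil_type)) * cell_volume
    print(f"{name}: {vol:.0f} m3 to be excavated")
```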
An important example where the three-dimensional character of the information matters is the design of station Rokin, because the model shows that at this site there is a large variation in the subsurface. Consequently, it was decided to model the whole station in a finite element model and not just half of the station, as is common for this type of calculation (Figure 4).
A special feature of the described ground model is that it is connected with the Integrated Boring Control System, a system that assists the Tunnel Boring Machine pilot in causing a minimum of disturbance to the surroundings. It can be understood as a simplified form of an automatic pilot for the TBM (Netzel and Kaalberg, 1999). Using this type of information technology, the input geotechnical data will maintain their accessibility and their value throughout the entire project.
Figure 4. Station Rokin on Amsterdam Metro N-S line
Øresund Tunnel Link project
In the context of the Øresund tunnel and bridge project to link Denmark and Sweden, a subsurface model for excavatability and dredging spill has been made, consisting of a 3D GIS and a Decision Support System (DSS) (Brugman et al. 1999, Ozmutlu and Hack, 2001). Figure 5 shows the potential spill values in the area to be dredged.
Heinenoord tunnel
In the scope of a research program a comparison was made of the modelling of a bored tunnel in three different (true) three-dimensional Geographic Information Systems (Hack et al., 2000, Veldkamp et al., 2001). The three programs used were Lynx GMS, GoCAD and Intergraph. The modelling incorporated all available subsurface data and the geometrical dimensions of the tunnel. The data were three-dimensionally interpolated and these interpolated data were subsequently exported to a two- and three-dimensional numerical program. The calculated deformations of the tunnel and the subsequent settlement of the surface were returned to and visualized in the 3D GIS. The research program showed that the various programs all had particular advantages and disadvantages, but that in general the Lynx GMS system worked out to be very suitable and easy to handle for engineering geological models.
Figure 5. Volume model of the subsurface of the Øresund Tunnel Link showing the potential spill due to the materials in the subsurface. The ground surrounding the channel to be dredged has been removed and the potential spill values are shown.
Various data types were used and incorporated in the GIS. Zero-dimensional sample data (sample descriptions or laboratory test results) were simply attached to a three-dimensional coordinate. Borehole and other line data such as CPT and SPT were introduced as one-dimensional data. Two-dimensional data such as maps and geophysical sections were introduced as planes with the correct orientation in space. Figure 6 shows the available topographic data in a geotechnical database that was imported in the 3D GIS. A three-dimensional grid data model had to be created to export the data to an external numerical program.
The final subsurface volume model of the geotechnical units is shown in Figure 7. The detail of the volume models is related to the size of the modelled area and the available data. Over a large area only scarce data was available, hence the model is less detailed there; near the location of the planned tunnel site investigation has been done, which resulted in a more detailed volume model.
Figure 6. Example of GeoGis database for geotechnical data
Figure 8 shows how the volume model at each location in space is linked to the database containing both the original data and the interpreted data.
Figure 9 shows how the displacements (the results of numerical calculations on the deformations around
the tunnel based on the data from the 3D GIS) are re-imported into the 3D GIS and projected onto the tunnel
tube and thus visualised.
DATABASES FOR LARGE ENGINEERING GEOLOGICAL DATA SETS
One of the great new challenges for national geological survey organisations is the collection and storage
as well as the organisation of the accessibility of all geological and geotechnical data and information that is
available on the territory of a country. These organisations have changed their role from publishing
geological maps in a systematic coverage of the territory to being a main provider of access to the geological
and geotechnical basic data that are available. The form in which these data are available is progressively
changing from analogue to digital. All new information is being added in a digital form and procedures of
access to external users are developed (Culshaw, 2001, Groot, 2001). For more information on these aspects
see chapter 8 of this keynote. Here two practical examples of large databases developed in The Netherlands
are mentioned.
Figure 7. Volume model: from top to bottom smaller areas, but with more detail
Database on the Underground of The Netherlands Geological Survey (DINO)
TNO-NITG (The Netherlands Institute of Applied Geosciences TNO – Dutch Geological Survey) has
been assigned as a main task to act as national custodian of all geo-scientific data and information of the
subsurface of The Netherlands. This assignment prompted the development of DINO (Data and Information on The Netherlands Underground). The DINO design challenge was to provide uniform access to data and information. An unequivocal interface implies, however, an even more unequivocal approach to the data: how the various tables in the database are interlinked and how the data are presented. The main aspects of DINO are the following:
• The data acquired and the data collected from third parties are entered into the database
• A quality control process is applied to the newly entered data
• From DINO the data can be selected and subsequently exported to an analysis and interpretation environment
• Results of analysis can be entered again in DINO
Figure 8. Interactive modelling process
Figure 9. Vertical displacements at the location of the tunnel tube projected onto the tunnel tube, blue small
displacements, red large displacements
Apart from lithological descriptions of boreholes (400,000 entries), the introduction of data on groundwater, Dutch Cone Penetration Tests (DCPT), deep borings (generally deeper than 1000 m) and offshore borings will be completed in the course of 2002. For more information please contact: dinoloket@nitg.tno.nl
Database for an engineering geological mapping program
At the ITC a large digital database was designed to store engineering geological data and information
collected in a program of engineering geological field mapping near Falset (province of Tarragona, Spain)
with large numbers of students from the ITC and Delft University of Technology over a period of almost 11
years.
The main purpose of this database is to facilitate the input process of engineering geological rock and soil
data that is collected during each annual four-week fieldwork period. In addition, this database will be the
core data source for monitoring the temporal and spatial variation of rock and soil properties for different
geotechnical units. Since the database contains thousands of records of very detailed standardised rock and soil outcrop descriptions over a relatively small area, the database will be a very important source of information for all kinds of research topics.
Figure 10. General outcrop information form
Figure 11. Geotechnical unit description form
The Falset database is implemented as a Microsoft Access application. Several input forms and queries are designed to support the data input and retrieval processes (Figures 10 to 12).
The Falset database provides the following utilities for the end-users:
• Graphical user interface (GUI) forms
• Drop-down lists for most predefined values, in order to reduce typing errors during the data input process
• Calculation of the slope stability of each rock outcrop based on the SSPC system (Hack, 1998)
• Ability to import or export data for further data processing, such as integration with engineering geological maps or different thematic maps stored as digital information in a GIS
Figure 12. The database also allows for inclusion of digital photographs
GEOTECHNICAL (NUMERICAL AND STATISTICAL) MODELLING
In the Seventies and Eighties there was strong growth in calculation speed and in the handling capacity of
large data volumes (as well as the increased accessibility of calculation power in the form of PC’s). This
enabled the development of new methods of geotechnical analysis (finite element, finite difference, distinct
element, and particle flow methods), allowing for models of irregular forms with the third dimension,
consisting of materials with more complicated stress-strain relationships. The fourth dimension could be introduced where progressive stages of excavation and/or loading, and/or time-dependent deformation processes introduce time as an important factor into the analysis. Departing from numerical methods developed for applications in the aeroplane design industry, ITASCA (ITASCA, 2002) developed software packages like FLAC and UDEC (www.itascacg.com) that are at present standard packages and still state-of-the-art. A new development is formed by the programs PFC2D and PFC3D (Particle Flow Code in two and three dimensions - ITASCA, 2002), which model rock blocks as bonded clusters of balls; this is particularly suitable for situations where many blocks of rock are splitting up into smaller blocks that fall and roll, for example in a draw point of a mine. These packages also allow for 2D and 3D continuous and discontinuous numerical analysis of geotechnical models.
The bottleneck in numerical modelling has shifted from limitations in the hardware (restricting the quantity of data sets and model elements) towards issues dealing with the quality of the input data. This new bottleneck concerns the detail with which the engineering geological environment can be defined. There is still too much uncertainty as to the level of detail with which the subsurface should be modelled. In particular, a proper quantification of the spatial variation in geotechnical properties is very difficult to achieve. This leads to a paradox in the use of modern IT technologies: "by knowing more, we actually learn that we do not understand a great deal at all". This may subsequently result in a more conservative instead of less conservative design of major engineering projects, in order to be certain that all uncertainties and risks are excluded.
In order to properly quantify those uncertainties and deal with them accordingly, rather than using an
overly conservative design, an appropriate statistical analysis is called for. A promising development in this
respect is the "bootstrap method": a data-based simulation method for statistical inference, representing in
the words of Efron and Tibshirani (1998): "the mathematical equivalent to pulling oneself up by one’s
bootstrap". Starting from a limited amount of true data, a larger set of hypothetical but still realistic data is
generated using simulation techniques such as the Monte Carlo or Latin Hypercube methods (Hammersley and Handscomb, 1964). Based on the distribution of the simulated data, bootstrap analysis enables us to
quantify statistical parameters that can only be determined with difficulty, or not at all, from the original
samples (Efron and Tibshirani, 1998 and Hack, 1998). It should be noted that the validity of the results
largely depends on the assumptions that are made in the data simulation, and these should be made with care.
If done properly, bootstrap analysis can be applied to sparse data sets, which are typical in the geosciences.
Its use to date seems however to be restricted mainly to bootstrap percentiles, which are an accepted and
effective method to determine reliability intervals (e.g. Chernick, 1999).
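A minimal sketch of a bootstrap percentile calculation on a sparse data set is given below. The friction-angle values and the use of the sample mean are illustrative assumptions only; they are not part of the SSPC implementation.

```python
# Illustrative sketch: resample a small set of measured friction angles with replacement
# and derive a 90 % bootstrap confidence interval for the mean, which the original sample
# alone is too small to support.
import numpy as np

rng = np.random.default_rng(0)
phi_measured = np.array([31.0, 28.5, 35.0, 30.0, 33.5, 29.0])   # sparse field data (deg)

boot_means = np.array([rng.choice(phi_measured, size=phi_measured.size, replace=True).mean()
                       for _ in range(10_000)])
low, high = np.percentile(boot_means, [5, 95])
print(f"mean = {phi_measured.mean():.1f} deg, "
      f"90% bootstrap interval = [{low:.1f}, {high:.1f}] deg")
```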
An approach used to reduce the need for infinite detail in the input data is to determine, with the help of "sensitivity analysis", the amount of influence of each of the input parameters on the final outcome of the numerical calculation. Where the traditional calculations only resulted in an average or a pessimistic value, now more information is found about the possible variation in the results.
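The sketch below illustrates such a one-at-a-time sensitivity analysis. The model function and the parameter ranges are placeholders standing in for any numerical calculation; they are assumptions for illustration only.

```python
# Illustrative sketch: vary each input parameter over its plausible range while holding the
# others at their best estimate, and record the spread in the outcome.
def model(cohesion_kpa, friction_deg, unit_weight):
    # placeholder for e.g. a slope stability or settlement calculation
    return cohesion_kpa / unit_weight + friction_deg / 45.0

best = {"cohesion_kpa": 10.0, "friction_deg": 30.0, "unit_weight": 18.0}
ranges = {"cohesion_kpa": (5.0, 20.0), "friction_deg": (25.0, 35.0), "unit_weight": (16.0, 20.0)}

for name, (lo_val, hi_val) in ranges.items():
    outcomes = [model(**{**best, name: v}) for v in (lo_val, hi_val)]
    print(f"{name}: outcome varies between {min(outcomes):.2f} and {max(outcomes):.2f}")
```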
The trend in IT is to make software packages more and more user-friendly. However, the negative effect is that they develop more in the direction of black box models, in which a large number of assumptions have been introduced that are no longer apparent to the user. The basic assumptions are usually well described in the manuals, but users often do not read the manuals from A to Z, or simply lack the appropriate education to understand them. A larger problem is that for more elaborate calculations values of parameters are needed which are not directly available from traditional field or laboratory tests. Their value may therefore be based on a rule of thumb of which it is not always clear whether it is valid for the specific site that is investigated.
Much can be gained from a proper integration of monitoring data, during and after construction, into a
back-analysis of the numerical model. The experience obtained in this way is very valuable for the
development of knowledge on numerical geotechnical modelling. Unfortunately, this is not systematically
done for the simple reason that all parties involved are satisfied when monitoring shows less deformation
than predicted. The money that has been spent for the (probably too conservative) design, cannot be
recovered and nobody wants to be reminded of it.
Numerical packages developed in The Netherlands
In the Seventies geotechnical software packages were developed jointly by Rijkswaterstaat (Ministry of
Traffic, Public Works and Water Management) and the Laboratorium voor Grondmechanica (LGM) in Delft
(presently known as GeoDelft or Delft Geotechnics).
These computer programs were concerned with slope stability using the method of Bishop, settlements using the method of Koppejan, and sheet-piled walls and diaphragm walls, as well as road pavement design based on an elastic girder supported by linear elastic - fully plastic springs. The most important improvements in the slope stability programs involved a number of different methods to achieve the best possible representation of the pore water pressure conditions. These play a role during execution, when high excess pore water pressures occur in the soft soil layers, but are also of vital importance during the design stage, where a dike body has to withstand a water level that occurs with a frequency of, for example, 1:2500 per year.
The research and consulting institute TNO-Bouw developed the finite element program DIANA for the design of concrete structures (DIANA, 1998). The user-friendliness of the program was gradually improved. In the course of the years this program was extended with a soil-mechanical component, which is suitable to study the interaction between foundation structures and soil, but also for continuous rock models. With DIANA also dynamic effects can be studied. Leenders (2000) used this software package to model the highly complex influence of terrain surface morphology on the ground motion characteristics for the Quindío earthquake of 1999 in Colombia.
An important numerical modelling package developed at Delft University of Technology (Brinkgreve,
1999) is the finite element program called PLAXIS. The earliest version of PLAXIS was based on linear
elastic-fully plastic stress-strain behaviour of the soil layers. In later years more elaborate models for soil
behaviour were incorporated. With the introduction of interfaces, also sheet-piled walls and diaphragm walls
could be added to the soil models. The latest development is the modelling of the 3D stress/strain condition
around a tunnel-boring machine.
An 8 m high embankment for the new High Speed Railway Line Amsterdam-Brussels will be built immediately alongside the existing railway. The deformations of the existing railway due to the new embankment are very important for the design of the project. The soil below the existing embankment is made up of approximately 5 m of soft clay and peat. For the calculation of the deformation of these layers the Soft Soil model and the Soft Soil Creep model of PLAXIS were used (Figure 13).
Figure 13. Due to the construction of the future HS-Railway embankment (on the right), deformations occur in the existing railway embankment (on the left). With the finite element program PLAXIS the deformations of the two embankments in the final situation have been calculated (from: Spierenburg and van der Velden, 2002).
The calculations indicated that the new embankment will settle approximately 1.8 m., and that the
deformations of the existing tracks would be 0.26 m. in horizontal direction, and 0.28 m. in vertical direction.
These values are sufficiently small, so that they can be corrected by extra maintenance work on the existing
railroad. In particular in situations where horizontal as well as vertical deformations are important finite
element programs are superior to any other, more traditional, calculation method.
A general problem with finite element programs in soils is that they require input parameters that are not
directly available from traditional field and laboratory tests. When input data have to be based on
correlations and estimations, the accuracy of the output may obviously decrease. Many soil parameters are in
fact stress-level dependent. In finite element programs, a Poisson's ratio is often needed for each soil layer.
This parameter is usually not available as an immediate test result. One can run the calculation with high and
with low estimated Poisson’s ratio values, and judge by the different results how important the parameter is
for the problem that is studied. There remains a risk that for complicated calculations only poorly known
parameter values are used as input. There is still much work to be done in the field of development of
appropriate site investigation and laboratory techniques.
The Slope Stability Probability Classification (SSPC)
Computers allow us to use large amounts of data to optimise very complicated and non-linear relations.
An example is the development of the Slope Stability Probability Classification (SSPC) system (Hack, 1998,
Hack et al., 2002). The SSPC system is based on a three-step approach and on the probabilistic assessment of
independently different failure mechanisms in a slope. First, the scheme classifies rock mass parameters in
one or more exposures. In the second step, these are compensated for weathering and excavation disturbance
in the exposures. This gives values to the parameters of importance to the mechanical behaviour of a slope in
an imaginary unweathered and undisturbed ‘reference’ rock mass. The third step is the assessment of the
stability of the existing or any new slope in the reference rock mass, with allowance for the influence of
excavation method and future weathering.
A large quantity of data obtained in the field has allowed the development of a classification system based
on the probabilities of different failure mechanisms. This has resulted in a classification system based on a
probability approach: the "Slope Stability Probability Classification" (SSPC). The SSPC-system has been
developed during four years of research in Falset, Tarragona province, Spain. Recently the system has been
used with good results in Austria, South Africa, New Zealand, and The Netherlands Antilles.
In order to develop the system, data on the stability, geometry and physical properties of the rock masses in which the slopes were made have been assembled from some 184 slopes and stored in a digital database (see also section 4.2). Subsequently, various relations were defined between the stability of the slopes and the physical and geometrical properties measured in the field. The relations are non-linear (example: Equation 1). An optimisation routine was used to find the unknown factors in Equation 1 such that the stability was correctly forecast for the maximum number of slopes. This optimisation has only been possible because of the number-crunching capabilities of modern computers. By hand such an optimisation would have been virtually impossible.
Equation 1. The various non-linear relations between the stability of the slopes and the physical and geometrical properties measured in the field, used for determining the SSPC system.

$$dip_{slope} \ge \varphi_{mass} \;\rightarrow\; H_{max} = 4\,\frac{coh_{mass}}{UW}\cdot\frac{\sin(dip_{slope})\,\cos(\varphi_{mass})}{1-\cos(dip_{slope}-\varphi_{mass})}$$

$$dip_{slope} < \varphi_{mass} \;\rightarrow\; H_{max} = \text{unlimited}$$

$$coh_{mass} = a_0\,\frac{IRS}{100} + a_1\,SPA + a_2\,CD$$

$$\varphi_{mass} = \frac{a_3\,\frac{IRS}{100} + a_4\,SPA + a_5\,CD}{a_3\,a_6 + a_4 + a_5\cdot 1.10165}\cdot\frac{\pi}{2}$$

$$\text{if }\;\frac{IRS}{100}\le a_6 \;\rightarrow\; IRS = \text{intact rock strength (in MPa)};\qquad \text{if }\;\frac{IRS}{100} > a_6 \;\rightarrow\; IRS = a_6\cdot 100$$

with: $a_0$ through $a_6$ = unknown factors, $dip_{slope}$ = dip of the slope, $H_{max}$ = maximum possible slope height, $H_{slope}$ = height of the slope, and $UW$ = unit weight of the rock mass.

For each slope $j$ an error contribution $er_j$ is determined, with $ER = \sum_j er_j$:

visually estimated stability = class 1:
$$\frac{\varphi_{mass}}{dip_{slope}} \ge 1 \;(\text{stable}) \rightarrow er = 1;\qquad \frac{\varphi_{mass}}{dip_{slope}} < 1 \rightarrow \begin{cases}\dfrac{H_{max}}{H_{slope}} \ge 1 \;(\text{stable}) \rightarrow er = 1\\[6pt]\dfrac{H_{max}}{H_{slope}} < 1 \;(\text{unstable}) \rightarrow er = \dfrac{H_{slope}}{H_{max}}\end{cases}$$

visually estimated stability = class 2 or 3:
$$\frac{\varphi_{mass}}{dip_{slope}} \ge 1 \;(\text{stable}) \rightarrow er = \frac{\varphi_{mass}}{dip_{slope}};\qquad \frac{\varphi_{mass}}{dip_{slope}} < 1 \rightarrow \begin{cases}\dfrac{H_{max}}{H_{slope}} \le 1 \;(\text{unstable}) \rightarrow er = 1\\[6pt]\dfrac{H_{max}}{H_{slope}} > 1 \;(\text{stable}) \rightarrow er = \dfrac{H_{max}}{H_{slope}}\end{cases}$$
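The orientation-independent part of Equation 1 can be illustrated with the sketch below. It assumes that coh_mass and φ_mass have already been derived from the classified rock mass parameters; the input values are illustrative only and the calibrated a0..a6 factors are not reproduced here.

```python
# Illustrative sketch of the maximum-slope-height relation of Equation 1.
import math

def max_slope_height(coh_mass_pa, phi_mass_deg, dip_slope_deg, unit_weight_n_m3):
    """H_max according to Equation 1; returns math.inf when dip_slope < phi_mass."""
    phi = math.radians(phi_mass_deg)
    dip = math.radians(dip_slope_deg)
    if dip < phi:
        return math.inf
    return 4.0 * coh_mass_pa / unit_weight_n_m3 * \
           (math.sin(dip) * math.cos(phi)) / (1.0 - math.cos(dip - phi))

# Example: coh_mass = 20 kPa, phi_mass = 35 deg, slope dip 60 deg, unit weight 26 kN/m3
h_max = max_slope_height(20e3, 35.0, 60.0, 26e3)
h_slope = 15.0
print(f"H_max = {h_max:.1f} m -> H_max/H_slope = {h_max / h_slope:.2f}")
```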
A secondary computer-aided feature of the SSPC system is that the results are presented in the form of probabilities, rather than the point ratings that are the standard in most classification systems. To determine the probabilities, enormous numbers of calculations also had to be done, using Monte Carlo techniques to simulate data out of the distributions of the data used for developing the SSPC system.
The bootstrap method (see the introduction of this chapter) has proven to be useful in the SSPC method of slope stability analysis. An example of the application of such bootstrap percentiles is presented in Figure 14, which shows the slope stability probability for orientation-independent failure (i.e. failure not related to discontinuity and slope orientation), according to the SSPC system (Hack, 1998).
Figure 14. Bootstrap method applied to the SSPC method of slope stability analysis. The graph plots H_max/H_slope (vertical axis, logarithmic scale from 0.1 to 10) against φ'_mass/slope dip (horizontal axis, 0.0 to 1.0), with lines for the probability to be stable at 5, 10, 30, 50, 70, 90 and 95 %; above the 95 % line the probability to be stable is > 95 %, below the 5 % line it is < 5 %. Dashed probability lines indicate that the number of slopes used for the development of the SSPC system for these sections of the graph is limited, and that these probability lines may not be as certain as the probability lines drawn with a continuous line.
Using Monte Carlo simulation, bootstrap percentiles were calculated relating the height of the slope (as the ratio of the maximum slope height calculated in the SSPC to the actual slope height) to the ratio of the rock mass friction angle to the slope dip. This approach gives the slope stability not as a deterministically estimated value (such as a Factor of Safety), but as a probability of being stable or unstable. By calculating the height ratio and the friction angle to slope dip ratio, the probability to be stable can be found; the example shown in the graph falls on the 5 % percentile, meaning that out of every 100 slopes constructed with the same height and dip angle in the same rock mass, only 5 would be stable and 95 would fail.
SPATIAL DATA INTEGRATION AND 2D MODELLING WITH GIS
Geographic Information Systems are computer-based systems for handling map information in a digital way (Bonham-Carter, 1994, Burrough and McDonnell, 1998). GIS enables the introduction, processing, storage, and retrieval of geo-referenced data sets (digital maps, digitised analogue maps as well as point and line information), and offers procedures to achieve the interoperability of such data sets by modification of projection systems and data formats. The integration of different data sets by overlaying and map calculations is possible in GIS for spatial modelling processes in which various factors play a role. At ITC the Integrated Land and Water Information System (ILWIS) GIS has been developed in the Eighties and Nineties (ILWIS, 1997). The ILWIS package also has well-developed Remote Sensing digital image processing functionalities.
In fact, 2D GIS has already achieved a solid position as a standard IT tool in many businesses and
industries and is therefore not a real novel development. The general public are making use of GIS data
almost every day, whether they are driving in their car (with a GPS navigation system), or whether they are
looking up a street address on the Internet. Obviously, in Engineering Geology, 2D GIS has an established
position as a very useful tool for storing, manipulating and visualising geo-referenced geotechnical data.
However, like many other standard IT packages, such as word processors or spreadsheets, novice users are
often not aware of the advanced functionalities GIS packages have. If an organisation merely uses a GIS for
making "nice maps", it does not recognise the true value of the system. The real added value of using a GIS lies in the fact that the GIS functionalities allow the user to create new information by combining and manipulating existing information. Through the recent advances in 3D visualising techniques, the GIS user can also look at his/her data from a wide variety of perspectives, shedding a different light on the data, thus also creating different types of information. Presently, the main development in the field of 2D GIS remains with the end-user, whose only limitation is his or her own imagination.
Figures 15 to 17 illustrate various GIS thematic information shown in 3D perspective view, using a
Digital Terrain Model. These examples are created from actual data gathered by ITC and Delft University
students for an Engineering Geological field mapping exercise in the Baix-Camp area in Spain.
Figure 15. Digital ortho-airphoto draped on top of a Digital Terrain Model
Figure 16. Digital Terrain Model shown in perspective view as a "hillshaded relief model"
Figure 17. Engineering Geological rock units mapped out and displayed with different colours
A special training package, Geographic Information Systems in Slope Instability Zonation (GISSIZ), has been developed for educational purposes (GISSIZ, 2002). Slope instability hazard zonation aims at mapping areas with an equal probability of occurrence of landslides within a specified period of time. For the analysis of the influence and interaction of causative factors such as terrain steepness, lithology, land use, vegetation, etc., GIS is indispensable because of the large data handling capacity needed. Figures 18 and 19 show the principle of spatial data analysis developed in the GISSIZ package.
Figure 18. Principle of preparation of a landslide density map for different
lithological units from a landslide occurrence map
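The map-overlay principle of Figure 18 can be illustrated with the sketch below, in which simple arrays stand in for the GIS raster maps; it is an illustration of the principle only, not the GISSIZ/ILWIS implementation.

```python
# Illustrative sketch: cross the lithology raster with the landslide occurrence raster and
# express landslide density per lithological unit.
import numpy as np

rng = np.random.default_rng(1)
lithology = rng.integers(0, 3, size=(500, 500))        # 0, 1, 2 = three lithological units
landslide = rng.random((500, 500)) < 0.02              # True where a landslide was mapped

for unit in np.unique(lithology):
    mask = lithology == unit
    density = 100.0 * np.count_nonzero(landslide & mask) / np.count_nonzero(mask)
    print(f"lithological unit {unit}: landslide density = {density:.1f} % of the unit area")
```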
An example of the use of spatial modelling in GIS for flood management is described by Scholten et al.
(1998): GIS showed its great value for flood control measures to be taken by decision makers at the
provincial level during the 1995 floods in the rivers Meuse and Waal. Superimposing different data layers,
GIS experts could detect the weak parts in the dikes and generate an evacuation plan. Shortcomings within
the organization were experienced as well: important data could not be retrieved, and the coordination of
efforts by the different governmental departments and institutions was not optimal. To overcome these
shortcomings and to optimise the modelling process, Scholten et al. (1998) proposed the development of a
spatial decision support system (SDSS), aggregating several models, heuristic and procedural, into integrated
software tools. The system proposed was developed by Simonovic (1993). It combines the use of
optimisation techniques and other numerical methods with GIS, an expert system (ES) assisting in the input
of data into the models to be used in flood damage analysis, engineering expertise and database management
software. The architectural design of the Decision Support System (DSS) for flood control management
basically follows the architecture for a Spatial Decision Support System (SDSS) as proposed by Armstrong
et al. (1986) (Figure 20).
The experience with two cases of floods has shown that a new Spatial Information Infrastructure (SII)
needs to be developed. This requires both technical and organizational solutions and standards. Proper
documentation of the data will allow the GIS experts to more quickly find, store, update, and re-use the data.
Figure 19. Principle of the preparation of a landslide activity map from landslide occurrence maps at different dates
Figure 20. Diagram showing the architectural design for a flood management Decision Support System
(Simonovic, 1993)
VISUALISATION TECHNIQUES
Traditionally the visualization of engineering geological information was done in the form of maps,
schematic block diagrams and in the form of cross sections, construction site plans, etc. (Dearman, 1991).
IT has opened a whole new spectrum of computer-assisted visualization possibilities that surpass those of the pre-IT era. Efforts here should not be directed towards producing the best possible imitation of the conventional map product, but towards developing a better product. In this case "better" products are maps that are more tailored to the needs of the customer. What (engineering) geologists show on a map is an inextricable
mixture of hard fact and interpretation. With computer support, the aspects of the geometry could be sampled
more rigorously in the field and the three-dimensional structure could be recorded and tested for consistency.
The same is valid for computer-assisted processes of interpolation and other aspects of spatial variation of
parameters in the engineering geological model, which cannot be reached when working in a conventional
way. The selection of features and data to be shown in a map, a 3D model or a vertical, horizontal or inclined cross section (number of contour lines, details of existing infrastructure, drainage lines), but also the detail or the generalization of the spatial and/or thematic (engineering geological) information, can vary for the different users.
We generally know more than we are able to express and share with others. The visualization of
(engineering geological) information is subject to cartographic constraints. Scale is the most obvious factor
that influences the detail with which the information in the form of a conceptual model in the mind of the
engineering geologist must be simplified before it can be shown on a screen or printed on paper. The
complexity of nature must be reduced to a few significant and characteristic mapping units.
Furthermore there is obviously a large variation in the information density on the ground and even more
below the surface. This is not only due to the variation in accessibility and exposure of the ground materials
in the terrain and the density of subsurface information sources such as boreholes and soundings, but the
information density is also influenced by the relevance of different parts of the terrain or of the stratigraphic
sequence for the specific geotechnical problem that has to be solved.
The process of moving from observation in the field to visualisation involves generalization: showing only the most important forms and removing unnecessary details (Buttenfield and McMaster, 1991). This process of generalization leads to choices as to which type of information will be reduced, a question that would be answered differently by different types of end users, but that must receive an unequivocal answer when the map is printed in the conventional way. This printed product is a permanent snapshot of the author's ideas at a particular time. Revision is costly and therefore infrequent, and in the case of engineering geological mapping it practically never occurs. This means that very often at the time of printing the information is already out of date. A solution to this problem is to print on demand extracts of the existing information selected from a Geographic Information System (GIS), using the scale and the instructions for generalization that are as appropriate as possible for the requirements of the user involved.
GIS offers perspectives for a type of approach that frees the users from the constraints of scale,
dimensionality, and cartographic representation of the engineering geological (spatial) model imposed by the
paper map. This creates more freedom to develop multiple conceptual models and scenarios and to express
these more freely in a record that can be shared. Thus modern IT can extend the means of expression and
communication (Kraak, 1999).
This keynote does not cover in detail the aspect of 3D visualisation techniques, which has become a very specialised industry by itself. 3D visualising has undergone tremendous progress; the multi-billion dollar movie and gaming industries have been the driving forces in this. The widespread development, application, acceptance and distribution of 3D visualising programs (often browser-based, such as VRML) via the Internet also creates a very positive spin-off, in the sense that the end-user is at the moment presented with affordable, often freely available, advanced 3D visualising programs. Commercial (geo-scientific) software that caters mostly to a relatively small and specialised group of end-users can obviously gain from these developments. This, in turn, will benefit the geotechnical end-user, who is presented with advanced visualising techniques at affordable prices.
PUBLIC DATA EXCHANGE, DATA QUALITY ASSURANCE AND DATA ACCESSIBILITY
The development of IT has had an enormous impact on the availability, access, and interchange of
engineering geological data and information between data providers, the general public, and engineering
geological experts. Two aspects will be treated briefly in this chapter:
The changing role of the traditional survey organizations,
The improved possibilities of data and information exchange between team members within one
organisation or in different organisations.
The changing role of the traditional survey organizations
National Surveys, including Geological Surveys, have been under considerable pressure to redefine their
role in government and to adapt their mandates to an ICT-dominated world. They have had to respond to
serious budgetary pressures, implement new technologies and build the associated staff capacity, establish
new and often very different relationships with their client communities, and redesign their organizations to
meet the demands for product and service diversity. Groot (2001) describes an economic model for efficiency
in the pricing and distribution of national geological survey products and services.
The data repositories should provide safe and long-term custody of information, with ready access to
comprehensive, appropriate, current, coherent and testable records (Loudon, 2000). This means that
well-structured metadata and a network of links and cross-references must be provided. Furthermore, the
integrity of the repository is of the greatest importance: it must cope with past, present and future knowledge
and offer the possibility to freeze versions as necessary for historical reasons, preferably with linkages that
show their relationship to the metadata of the time.
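A minimal sketch of what such a repository record could look like is given below; the field names, the
freezing mechanism and the example values are hypothetical and are not taken from any existing survey
database:

    # Minimal sketch of a repository record with metadata, cross-references and
    # frozen (immutable) versions; all field names and values are hypothetical.
    from dataclasses import dataclass, field
    from datetime import date

    @dataclass(frozen=True)
    class FrozenVersion:
        version: int
        frozen_on: date
        data_uri: str          # where the snapshot of the data itself is archived
        metadata: dict         # metadata as it was at the time of freezing

    @dataclass
    class RepositoryRecord:
        record_id: str
        title: str
        metadata: dict                                  # current, editable metadata
        cross_references: list = field(default_factory=list)
        versions: list = field(default_factory=list)    # immutable history

        def freeze(self, data_uri: str) -> FrozenVersion:
            """Freeze the current state of the record as a new, immutable version."""
            snapshot = FrozenVersion(len(self.versions) + 1, date.today(),
                                     data_uri, dict(self.metadata))
            self.versions.append(snapshot)
            return snapshot

    # Hypothetical usage
    rec = RepositoryRecord("BH-001", "Borehole logs, area X",
                           {"coordinate_system": "RD New", "depth_unit": "m"})
    rec.cross_references.append("MAP-017")
    rec.freeze("https://example.org/archive/BH-001/v1")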
The client community is no longer solely the professional engineering geologist. Today everybody can be
a user because now we are far more concerned with the consequences of our actions on our environment and
vice-versa. We are concerned about the influence of contamination on our health, about whether ground
conditions and/or future earthquakes will affect the stability of our homes and the cost of insuring them, and
about whether climate change will increase the risk from geo-hazards (Culshaw, 2001). Traditional
geological maps and academic publications do not meet these new needs. However, the increasing
availability of information in digital form and the improving access to it have revolutionized what can be
provided to a wide range of users.
Who will be the users of engineering geological surface and subsurface data and information, and what
are their user demands? We must provide a family of distinctive delivery products, each targeted to a distinct
class of users. We should distinguish between the needs and abilities of sophisticated users, for example most
"researchers", and those of less sophisticated users (Turner, 2001). While some users can reanalyse or
reprocess original data, the general public usually want an answer to their questions and not the original data,
which they probably cannot process and may not understand. How can we help users understand data
limitations and data quality? Experience suggests that the design of the user interface is critical. The expert
wants to enter his or her request rapidly and precisely, and wants shortcuts. In contrast, the novice user needs
assistance in understanding the options, perhaps through on-line help or tutorials. The novice user would be
mystified by the expert interface, while the expert would become exasperated with the novice interface.
IT as support for interaction in teams
Communication between participants in a project can be improved by IT, and the closer coordination
improves productivity. Where it is impossible or undesirable for all participants to be accommodated at the
same location, IT can offer good links for an increased mobility of data and information within and between
organizations (Loudon, 2000). IT offers a variety of communications methods, from telephone, fax, email,
voicemail, teleconferencing, file transfer, data sharing/exchanging/networking, to project management. The
interoperability of data provided by the different participants is in this case an issue of the highest
importance.
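As a simple illustration of such interoperability, the following sketch exchanges a borehole record between
two partners as a JSON message with an agreed set of fields; the schema is hypothetical and deliberately
much simpler than real exchange formats such as the CUR format for borehole data (CUR, 2002):

    # Minimal sketch of exchanging a borehole record between project partners as
    # JSON; the schema and values are hypothetical, not an existing standard.
    import json

    borehole = {
        "id": "BH-042",
        "x": 85000.0, "y": 447000.0,          # coordinates (hypothetical CRS)
        "ground_level": 1.2,                   # m above datum
        "layers": [
            {"top": 0.0, "bottom": 3.5, "soil_type": "clay"},
            {"top": 3.5, "bottom": 8.0, "soil_type": "sand"},
        ],
    }

    # Partner A exports the record ...
    message = json.dumps(borehole, indent=2)
    # ... partner B imports it and checks the fields both sides agreed on
    received = json.loads(message)
    assert {"id", "x", "y", "layers"}.issubset(received)
    print(received["layers"][0]["soil_type"])

Agreeing beforehand on such a shared structure (and on units and coordinate systems) is precisely the
interoperability issue referred to above.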
ACCESS TO NEWLY DEVELOPED KNOWLEDGE
The heart of science is the efficient exchange of ideas and the formal dialogue between producers and
consumers of research. For over half a millennium, the printed page has mediated this dialogue. The advent
of electronic publications can improve this dialogue and improve the efficiency of research and transfer of
scientific results. The role of traditional publication as the primary method of communication is rapidly
changing. Electronic publications and network technology are radically altering the relationship between
interpretative results and the underlying data. At the same time, the importance of journals in scientific
communication is declining rapidly, amongst other reasons because of the numerous inadequacies of the
traditional editorial and peer-review system.
Research institutions can concentrate on assuring broad access to research and data, while questions about
the physical location of the primary research materials and final research products become secondary.
Information technology also makes it possible to improve the quality of, and the accessibility to, what might
be called "non-traditional" research products (such as digital geographic information, unpublished archival
material, etc.).
Electronic publication provides the means to create dynamic forms of communication that can only be
displayed in an electronic environment - forms of communication that use hypertext and relational database
functions to provide text and graphics with which the reader can interact. Many institutes are experimenting
with new forms of on-line publication that assure broad access to research and data and improve the
application of research results to societal problems.
Electronic publication permits reproducibility of the research and allows continued manipulation and
enhancement of the research product, so that scientific results can be transferred in a form usable to society
or applied in unforeseen ways. Access to new knowledge available in the scientific literature is also quicker
and easier with IT, which allows rapid inventories of contents through keyword approaches and text
scanning, with material downloaded only when it suits the user.
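A trivial sketch of such a keyword approach is shown below; the titles and abstracts are hypothetical:

    # Minimal sketch of a keyword-based inventory of document abstracts;
    # the titles and abstract texts are hypothetical.
    abstracts = {
        "Paper A": "GIS based landslide hazard zonation in mountainous terrain",
        "Paper B": "Finite element modelling of tunnel-induced settlements",
        "Paper C": "Slope stability probability classification for rock slopes",
    }

    def search(keywords, corpus):
        """Return titles whose abstract contains all keywords (case-insensitive)."""
        wanted = [kw.lower() for kw in keywords]
        return [title for title, text in corpus.items()
                if all(kw in text.lower() for kw in wanted)]

    print(search(["slope", "stability"], abstracts))   # -> ['Paper C']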
The scientific publishing industry will probably very soon make a drastic change in the way it provides
information to the end user. According to Loudon (2000): "Computer-mediated communication is likely to
replace much print publication because of the lower publication and library costs (storing and managing
books likely accounts for up to 50% of library costs), and because of the improved representation of scientific
findings, increased flexibility and ease of use." IT promises quicker access to more appropriate and up-to-date
information at lower cost, with hyperlinks and forward links through references in future publications.
CONCLUSIONS
The digital revolution and the development of widely different applications of digital techniques have been
strongly enhanced by the ever-increasing calculation power of PCs and mainframe computers in the last
decades of the 20th century. The developments in this field are far from complete, and it is likely that the IT
revolution will have an effect on all aspects of society comparable to that of the industrial revolution, which
made machine-produced power available for industrial activities.
In engineering geology, IT applications are partly improving work with existing methodologies, but
completely new ways of working have also been developed that were not possible in the pre-IT era. To be
able to use IT to its full extent we will have to rethink the way we define and solve the problems we are
facing.
A further important conclusion is that, for most numerical geotechnical modelling, the most important
bottleneck for further improvement is not the calculation power of computers but the detail with which the
input parameters can be determined in the field and underground.
ACKNOWLEDGEMENTS
Several persons inside and outside ITC have assisted in the collection of materials for this keynote. We
thank Jacco Haasnoot for his contribution on the N-S metro line in Amsterdam, Richard Rijkers for
providing information on the DINO database from TNO-NITG, Ernst Schetselaar for contributing to the
section on digital field books, Cees van Westen for information and figures on the GISSIZ training package,
Le Minh Son for information and figures on the Falset database and Marga Koelen for her comments on the
influence of IT on the access to knowledge.
REFERENCES
Armstrong, M.P., Densham, P.J., and Rushton, G. 1986. Architecture for a micro-computer based decision
support system. Proceedings of the 2nd International Symposium on Spatial Data Handling, pp. 120-131.
Int. Geographical Union, Williamsville, New York, NY.
Bonham-Carter, G.F., 1994. Geographic Information Systems for Geoscientists: Modelling with GIS,
Pergamon, Oxford, 398 pp.
Brinkgreve, B.J., 1999. Beyond 2000 in computational geotechnics: 10 years of PLAXIS International.
International Symposium Beyond 2000 in Computational Geotechnics 1 Amsterdam 1999. Balkema,
Rotterdam, The Netherlands, 313 pp.
Brodaric, B. and Fyon, J.A., 1989. OGS Fieldlog: a microcomputer-based methodology to store, process, and
display map-related data. Ontario Geological Survey, Open File Report 5709, 73 pp.
Brugman, M.H.A., Hack, H.R.G.K., and Dirks, W.G., 1999. Een drie-dimensionaal systeem voor de raming
van baggerprodukties in rots (A three-dimensional system for estimating dredging production in rock; in
Dutch). Geotechniek, April 1999, pp. 22-26.
Burrough, P.A. and McDonnell, R.A., 1998. Principles of Geographical Information Systems. Oxford
University Press, Oxford, U.K.
Buttenfield, B.B. and McMaster, R.B. (Eds.), 1991. Map Generalization: Making Rules for Knowledge
Representation (Symposium Papers). Wiley, New York, 245 pp.
Carr, T.R., Buchanan, R.C., Adkins-Heljeson, D., Mettille, T.D. and Sorensen, J., 1997. The future of
scientific communication in the earth sciences: the impact of the internet. Computers and Geosciences,
Vol. 23, No. 5, pp. 503-512.
Chernick, M.R., 1999. Bootstrap methods: a practitioner’s guide. Wiley, New York, USA.
Coerts, A., 1996. Analysis of the static cone penetration test data for subsurface modelling: A methodology.
Ph.D. Thesis Utrecht University, The Netherlands (ISBN 90-6266-136-X).
Culshaw, M.G., 2001. Geoscience supply and demand: bridging the gap between the old provider and the
new user. In: New Paradigms for the Prediction of Subsurface Conditions. Euroconference on
Characterisation of the Shallow Subsurface Implications for Urban Infrastructure and Environmental
Assessment, Spa, Belgium, July 2001
(http://construction.ntu.ac.uk/graduate_school/Conference/ESF_01/).
CUR, 2002. Geotechnisch Uitwisselingsformaat voor boor-data (Geotechnical exchange format for borehole
data; in Dutch only). CUR, Gouda, 66 pp. (www.geonet.nl/cgi-bin/geo.pl).
Dearman, W.R., 1991. Engineering Geological Mapping. Butterworth-Heinemann Ltd, Oxford, U.K., 387
pp.
DIANA, 1998. (Witte, F.C. et al. Ed.). DIANA Finite Element Analysis, User’s Manual Release 7, TNO
Building and Construction Research, Delft, The Netherlands.
Efron, B. and Tibshirani, R. 1998. An introduction to the bootstrap. Chapman and Hall/CRC, Boca Raton,
USA.
GISSIZ, 2002. Introduction to GISSIZ. Manual. ITC, Enschede, The Netherlands (in reprint).
Groot, R.E., 2001. An economic model for efficiency in the pricing and distribution of national geological
survey products and services. In: New Paradigms for the Prediction of Subsurface Conditions.
Euroconference on Characterisation of the Shallow Subsurface Implications for Urban Infrastructure and
Environmental Assessment, Spa, Belgium, July 2001
(http://construction.ntu.ac.uk/graduate_school/Conference/ESF_01/).
Hack, H.R.G.K., 1998. Slope Stability Probability Classification. Ph.D. thesis (2nd edition). ITC -
International Institute for Geoinformation Science and Earth Observation, Delft, The Netherlands.
Hack, H.R.G.K. et al., 2000. 3D Modellering Tweede Heinenoordtunnel (3D modelling of the Second
Heinenoord tunnel; working report). Centrum Ondergronds Bouwen (COB-L300) and Land Water en
Informatie Technologie (LWI). CUR, Gouda, The Netherlands, 70 pp.
(http://www.itc.nl/enggeo/consulting/lwi/heinenoord/).
Hack, H.R.G.K., Price, D. and Rengers, N., 2002. A new approach to rock slope stability - a probability
classification (SSPC). Bulletin of the IAEG, (accepted for publication, in print).
Hammersley, J.M. and Handscomb, D.C., 1964. Monte Carlo Methods. Methuen, London, 178 pp.
Huijzer, G.P., 1992. Quantitative penetrostratigraphic classification. Ph.D. thesis, Amsterdam Free
University 1993, 201 pp.
ILWIS Department, 1997. ILWIS 2.1 for Windows-User’s Guide. ITC, Enschede, The Netherlands
ITASCA, 2002. Numerical programs : FLAC, UDEC, 3DEC, PFC2D, and PCF3D. ITASCA Consulting
Group, Inc., Minneapolis, Minnesota, USA (http://www.itascacg.com).
ITC, 2001. Principles of Geographic Information Systems. ITC Educational Textbook Series; 1 (2nd ed.),
ITC, Enschede, The Netherlands. 232 pp.
ITC, 2001. Principles of Remote Sensing, ITC Educational Textbook Series; 2 (2nd ed.), ITC, Enschede, The
Netherlands. 180 pp.
Kainz, W., 2002. Personal communication.
Kraak, M.J., 1999. Visualization for exploration of spatial data. International Journal of Geographical
Information Science 13 (4), pp. 285-288.
Leenders, N., 2000. Three-dimensional dynamic modelling of earthquake tremors. ISSN: 1386-5072, August
2000. Memoirs of the Centre of Engineering Geology in The Netherlands, No 200. Faculty of Civil
Engineering and Geoscience, Department of Applied Earth Sciences, Division Engineering Geology.
Loudon, T.V., 2000. Geoscience after IT, a view of the present and future impact of information technology
on geoscience. Volume 17: Computer methods in the Geosciences, Pergamon, Oxford.
Netzel, H. and Kaalberg, F. J., 1999. Settlement Risk Management with GIS for the Amsterdam North /
South-Line, Proceedings of ITA World Tunnel Congress, Oslo.
POSC, 1993. Petrotechnical Open Software Corporation Software Integration Platform Specification.
Epicentre Data Model, version 1, vol. 1. Prentice-Hall, Englewood Cliffs, New Jersey.
Scholten, H.J., LoCashio, A.A., and Overduin, T., 1998. Towards a spatial information infrastructure for
flood management in The Netherlands. Journal of Coastal Conservation 4, pp. 151-160.
Simonovic, S.P., 1993. Flood control management by integrating GIS with expert systems: Winnipeg city
case study. In: Application of geographic information systems in hydrology and water resources
management, April 1993, Vienna.
Slob, S., Hack, H.R.G.K., and Turner, A.K. 2002. An approach to automate discontinuity measurements of
rock faces using laser scanning techniques. Proceedings EUROCK 2002 (in print).
Spierenburg, S.E.J. and van der Velden, W.H.J., 2002. Ophogingen HSL trace onder controle (ground work
for High Speed Railway Line under control). Land en Water, Volume 42, February 2002.
Turner, A.K., 2001. Putting the user first: implications for subsurface characterisation, In: New Paradigms
for the Prediction of Subsurface Conditions. Euroconference on Characterisation of the Shallow
Subsurface Implications for Urban Infrastructure and Environmental Assessment, Spa, Belgium, July
2001 (http://construction.ntu.ac.uk/graduate_school/Conference/ESF_01/index.html).
Veldkamp, J.G., Hack, H.R.G.K., Ozmutlu, S., Hendriks, M.A.N., Kronieger, R., and Van Deen, J.K., 2001.
Combination of 3D-GIS and FEM modelling in complex infrastructure projects. Proc. Int. Symp.
Engineering Geological Problems of Urban Areas, EngGeolCity-2001. Ekaterinenburg, Russia.
(http://www.itc.nl/enggeo/consulting/lwi/heinenoord/).
Woldai, T., and Schetselaar, E., 2002. Design and implementation of “factual” databases to support GIS/RS
analyses of earth systems for civil society. The International Archives of the Photogrammetry, Remote
Sensing and Spatial Information Sciences, Vol.34, Part XXX, 8 pp.