OpendTect User Documentation - 6.4
All rights reserved. No part of this publication may be reproduced and/or published
by print, photo print, microfilm or any other means without the written consent of
dGB Beheer B.V.
Under the terms and conditions of any of the following three license types, license
holders are permitted to make hard copies for internal use:
- GNU GPL
- Commercial License
- Academic License
Table of Contents

1 Preface
1.4 Copyright
1.5 Acknowledgements
2 Getting Started
2.2 Toolbars
2.5 General Selection Window
2.6 2D Viewer
3.1 Scene
3.3 Volume
3.7.1 Add Attribute
3.8.3 Tools
4 Survey
4.3.3 Import Cross-Plot Data
4.3.10.1.2.3 SEG-Y Import
4.3.13.4.4 Bulk Well Time-Depth Model Import
4.5 Manage
4.5.12.1 Manage 3D Seismics
4.5.16.1 Import Wavelet
5 Analysis
5.1.1 Attribute Set Window
5.3.1.3 Location Filters
5.5.1.2.3 Ray Tracing
6 Processing
6.1.3.1 Advanced Angle Mute Parameters
6.5.1 Madagascar Installation
7 Scenes
8 View
9 Utilities
9.1.2 Mouse Controls
9.3.4 Plugins
10 Help
11.14 Log
16.2 MATLAB Versions/Platforms
16.4 Option 2: Usage Through the MEX Files in MATLAB Itself
Glossary
1 Preface
1.1 About this Manual
This manual is the user documentation of the open source part of the OpendTect
seismic interpretation system. It describes windows and parameter settings in
detail. The layout follows the organization of the software menus. Information
about attributes and filters is given in Appendix A.
This document was written using an online documentation tool. Two versions are
published: an HTML manual for online use and a PDF version for printing. The
HTML manual comes with the OpendTect software when it is downloaded from the
OpendTect Installer. Both the HTML and PDF manuals can be separately
downloaded from the documentation page of the OpendTect website.
While every precaution has been taken in the preparation of this manual, it is
possible that last-minute changes to the user interface are not reflected in the
manual, or are not described accurately. Please help us improve future editions
by reporting any errors, inaccuracies, bugs, misleading or confusing statements
you encounter.
Please note that apart from this user manual, the following additional
documentation manuals exist that can be accessed from the Help menu or
downloaded from the same documentation page on the OpendTect website:

Other Manuals:

- How-To Instructions - a new document in version 6 that describes how to apply the software effectively.
- Training Manual - comes with a 3D data set for self-training. Download from here
- Application manager.
- Software development.
1.2 Release Notes
This is the user documentation for release OpendTect v6.4 - an open source
post-processing and seismic interpretation system created by dGB.
OpendTect is released via the internet. Users can download the software from the
OpendTect website. It will run without license protection.
Under the GNU GPL license OpendTect is completely free-of-charge, including for
commercial use.
The OpendTect Pro license gives commercial users access to OpendTect Pro, the
commercial version of OpendTect. OpendTect Pro offers extra functionality and
allows commercial users to extend the system with additional (closed source)
commercial plugins that can either be purchased or leased. The commercial parts
of OpendTect are protected by FlexNet license managing software. To obtain a
license key for OpendTect Pro and the plugins please contact dGB at
info@dgbes.com.
Under the Academic license agreement universities can get free licenses for
OpendTect Pro and commercial plugins for R&D and educational purposes.
Supported platforms:

- PC-Linux 64-bit
- Windows 7, 8 and 10 (32/64-bit)
- Mac OS X 10.5
1.3 About OpendTect
The OpendTect suite of software products consists of two parts: an open source
part and a closed source part.
The open source part is called OpendTect. This is a seismic interpretation software
system for post-processing, visualizing and interpreting multi-volume seismic data,
and for fast-track development of innovative interpretation tools.
The closed source part is called OpendTect Pro. OpendTect Pro offers extra func-
tionality for commercial users. This system can optionally be extended with a set of
commercial plugins that are for sale and rent. The plugins offer unique seismic
interpretation workflows for specialist work.
Commercial users can create their own system by picking and choosing from an
extensive list of commercial plugins. Plugins have also been combined into
bundles that are made available at discounted prices. The following bundles
exist:
- Geoscience
- Attributes & Filters
- Sequence Stratigraphy
- Inversion & Rock Physics
1.4 Copyright
The information contained in this manual and the accompanying software
programs is copyrighted and all rights are reserved by dGB Beheer B.V.,
hereinafter dGB. dGB reserves the right to make periodic modifications to this
product without obligation to notify any person or entity of such revision.
Copying, duplicating, selling, or otherwise distributing any part of this product
without the prior consent of an authorized representative of dGB is prohibited.
OpendTect license holders are permitted to print and copy this manual for internal
use.
1.5 Acknowledgements
The OpendTect system is developed around concepts and ideas originating from a
long-term collaboration between dGB and Statoil. Most of the system was and is
developed through sponsored projects. We are indebted to all past, present and
future sponsors. To name a few:
- Addax
- ARKCLS
- BG Group
- Chevron
- ConocoPhillips
- Detnor
- DNO
- ENI
- GDF Suez
- Geokinetics
- JGI
- Marathon Oil
- MOL
- OMV
- RocOil
- Saudi Aramco
- Shell
- Statoil
- Talisman
- Tetrale
- The Dutch Government
- Thrust Belt Imaging
- Wintershall
- Woodside
2 Getting Started
2.1 System Overview
OpendTect v6.4 is a project-oriented seismic interpretation system. Projects are
organized in Surveys - geographic areas with a defined grid that links X,Y
coordinates to inline, crossline positions. 3D seismic volumes must lie within the
defined survey boundaries. 2D lines and wells are allowed to stick outside the
survey box. It is possible to load multiple 3D volumes with different orientations,
bin-sizes and temporal sampling rates into one survey. Volumes that do not
match the defined inline, crossline survey grid parameters are rotated and
re-sampled onto the grid.

Multiple OpendTect surveys can be stored in the OpendTect Survey Data Root
directory that is created at installation time. Survey Data Root directories can
also be created from the Survey Setup Window (under the Survey -> Select /
Setup menu).
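Conceptually, such a survey grid definition is an affine transform between inline/crossline numbers and X,Y coordinates. The sketch below illustrates the idea only; the origin and step vectors are made-up example values, not taken from any real survey definition:

```python
# Illustrative sketch: mapping inline/crossline grid positions to X,Y.
# All numbers below are hypothetical example values.

def grid_to_xy(inline, crossline,
               origin=(605000.0, 6073000.0),  # X,Y at (inline 0, crossline 0)
               inl_step=(0.0, 25.0),          # X,Y change per inline increment
               crl_step=(25.0, 0.0)):         # X,Y change per crossline increment
    x = origin[0] + inline * inl_step[0] + crossline * crl_step[0]
    y = origin[1] + inline * inl_step[1] + crossline * crl_step[1]
    return x, y

print(grid_to_xy(0, 0))    # the grid origin
print(grid_to_xy(10, 20))  # 10 inlines and 20 crosslines from the origin
```

A volume whose implied transform differs from the survey's is the kind of volume that gets rotated and re-sampled onto the survey grid.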
OpendTect Applications
Attributes
OpendTect started life as an attribute and pattern recognition system. Attribute
analysis remains one of the core competencies of the system. Attributes (and
filters) are evaluated interactively by applying the attribute to a display element.
Calculations are done on-the-fly and it is possible to evaluate attribute (filter)
parameters movie-style. OpendTect supports an extensive range of standard and
unique attributes and filters that can be combined in any way by computing
attributes from attributes. You can also create your own set of attributes and
filters by using maths and logic (IF..THEN..ELSE, ..AND.., ..OR.. etc.).
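As an illustration of the concept only (this is a NumPy sketch, not OpendTect's actual expression syntax; the attribute values and threshold are invented):

```python
import numpy as np

# Two hypothetical attribute traces computed earlier in a chain.
energy = np.array([0.2, 5.0, 9.0, 1.0])
similarity = np.array([0.9, 0.4, 0.2, 0.8])

# "IF energy > 1 THEN 1 - similarity ELSE 0":
# an attribute computed from other attributes with maths and logic.
discontinuity = np.where(energy > 1.0, 1.0 - similarity, 0.0)
print(discontinuity)
```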
OpendTect works with an active attribute set. (You can auto-load an attribute set
which will be active the next time you open the survey: open an attribute set
window, then select File > Autoload Attribute set.) Only attributes in the current
active set can be used to make displays. You must select an existing attribute
set or create a new one before you can apply attributes. If you wish to test a
different attribute, or simply wish to change the parameters of an attribute, you
can do so by modifying the current attribute set.
Attribute computations can be time-consuming, which is why on-the-fly
computation is not always efficient. Retrieving data from a stored volume is much
faster. Attribute volumes can be processed in batch mode on a single computer,
or on all computers and clusters OpendTect has access to. Multi-machine
processing requires a bit of extra work in the installation, but it is highly
recommended in a professional setting. How to prepare OpendTect for
multi-machine processing is described in the System Administrator manual.
Memory
OpendTect does not load all data from current projects into memory (RAM),
unlike other seismic interpretation software. Only the necessary data is loaded in
RAM and released as soon as possible. For instance, when displaying a stored
3D dataset along an inline, only the traces of that inline are read from disk,
converted and sent to the visualization. When a seismic attribute is displayed
along that inline, all inputs are read from disk before the computation starts. Only
the required inline and possibly some extra traces (if the attribute uses a lateral
stepout) are loaded. As a result, browsing inlines, crosslines and Z-slices can be
slow, as data is systematically read from disk.
Some modules that require large amounts of memory to be allocated (like
pre-loading) will check available memory before allocating the required space. If
requirements exceed available free memory, an error message is returned. The
software only checks the physical memory (RAM). However, most operating
systems also support virtual memory, which extends the physical memory with
space reserved on attached devices (hard disks and/or removable disks). SSDs
exhibit very high input/output speeds, to the point that SSD I/O speeds approach
RAM I/O speeds. Consequently, SSDs offer a cost-effective solution to increase a
system's virtual memory to several terabytes (TB). OpendTect supports using
virtual memory through an environment variable, OD_USE_VIRTUALMEM, that
needs to be set to "Yes".
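For example, on Linux the variable can be set in the shell from which OpendTect is started (a generic bash sketch; on Windows you would set it via the System environment variables dialog instead):

```shell
# Enable OpendTect's virtual memory support for this shell session
# before launching the application.
export OD_USE_VIRTUALMEM=Yes
echo "$OD_USE_VIRTUALMEM"
```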
Please note that it is strongly recommended to exclusively assign SSDs for
virtual memory usage. OpendTect may become unresponsive if you assign
traditional magnetic hard disk drives. A short tutorial for Windows is given here.
Finally, please note that available and free memory amounts are displayed in the
lower-right corner of the main OpendTect window. The returned figures are for
the total memory (physical + virtual) that is available / free.
User Interface
OpendTect's user interface is centered on an area that holds one or more 3D
visualization scenes. Each 3D scene has its own display tree from which the
user controls and manipulates the content of the 3D scene. The tree is NOT a
reflection of the project database! Instead it reflects which data is shown in each
display element. Data in a display element can be retrieved from the project
database, or it can be computed on-the-fly (attributes, volume builder processing
flows, neural network outputs). Display elements can hold up to eight layers of
information. This feature allows co-rendering of attributes and is especially
useful when used in combination with (semi-)transparent color bars.
To populate the 3D scene, the user should right-click on the tree element and
select the data. Attributes (neural networks) can only be selected if an attribute
set (neural network) is defined via the respective icons. Many display elements
(inlines, crosslines, Z-slices, random lines, horizons, 2D lines) can also be
displayed and interpreted using flat (2D) viewers. OpendTect Pro users can
populate a 3D scene and open 2D viewers via the basemap icon.
At the top of the OpendTect main window you will find a series of menus from
which various processes are started. Many processes can also be started from
icons, which is faster and therefore more convenient.
2.1.1 Multi-Machine & Parallelization
To speed up processing time, OpendTect supports batch processing of attribute
volumes and other compute-intensive processes. The user has the option to run
batch processes on all computers (Linux, Mac or Windows operated
workstations and clusters) in the network. Multi-machine distributed computing is
highly recommended in operational settings. Corporate users are advised to
contact their system administrator to enable multi-machine processing. Please
see the administrator's manual for details.

Multi-threading allows multiple threads to exist within the context of a single
process. These threads share the process' resources (such as memory) but are
able to execute independently.
Dip-Steering Algorithms

- Compute a Steering Cube with BG Steering (completely re-designed in Nov. 2015)
- Compute a Steering Cube with FFT (completely re-designed in Nov. 2015)
- Apply Full Steering (dip-steering in attributes and filters)
- Apply Central Steering (in attributes and filters)
Attributes (multi-threaded)

- Convolve (all except Wavelet option)
- Curvature
- Dip Angle
- Velocity Fan Filter (=DipFilter)
- Energy (all except Gradient option)
- Event
- Frequency Filter (multi-threading implemented in March 2014)
- Frequency (multi-threading implemented in March 2014)
- HorizonCube Data
- HorizonCube Density
- HorizonCube Layer
- Hilbert
- Instantaneous (multi-threading improved in March 2014)
- Local Fluid Contact Finder
- Maths (except when expression is recursive)
- Polar Dip
- Position
- PreStack
- Reference
- Scaling (all except scaling type AGC and stats type = detrend)
- Semblance
- Similarity
- Spectral Decomposition (multi-threading implemented in March 2014)
- Texture
- Tutorial
- Volume Statistics
Attributes (not multi-threaded)

- Constant Steering
- Convolve (Wavelet option)
- DeltaResample
- Energy (Gradient option)
- Event Steering
- FaultDip
- FingerPrint
- GapDecon
- HorizonCube Curvature
- HorizonCube Dip
- HorizonCube Spacing
- Horizon
- Match Delta
- Maths (recursive expression)
- Perpendicular Dip Extractor
- SampleValue
- Scaling (scaling type AGC and stats type = detrend)
- Shift
2.2 Toolbars
Many OpendTect processes can be launched most easily by clicking the
corresponding icon. We distinguish two types of icons:

- Icons that perform one specific action, e.g. moving the current display element (inline, crossline, Z-slice) N steps forward.
- Icons that open a menu with multiple options, e.g. launching either the 2D attributes or the 3D attributes definition window.

Icons are organized in toolbars. Some toolbars are always present, others only
appear when their action is appropriate. For example, the display element
toolbar (upper left corner) only appears when a display element in the tree is
selected, and its content varies with the type of display element.
Please note that if you are missing icons in your version of OpendTect then you
are probably running a version without (some of) the commercial plugins.
2.2.1 OpendTect Toolbar
The OpendTect toolbar contains icons to launch OpendTect specific modules:
The crossplot icon starts an Attribute vs. Attribute crossplot or an Attribute vs.
Well Data crossplot.
2.2.2 Manage Toolbar
From the manage toolbar you can start management utilities to copy, delete, and
rename various data objects in your OpendTect project.
Opens the Well Manager. This is also the place to create new logs using
OpendTect's rock physics library.
2.2.3 Graphics Toolbar
From the graphics toolbar you can start processes to manipulate the 3D graphics
window. The following options are available:
When this icon is visible, you are in Position mode. Click on the display element
you wish to move or edit. If the element is an inline, crossline or Z-slice, a frame
with handles (green squares) appears around the clicked element. The handles
are used to re-size the frame. Clicking and dragging inside the element moves
the entire element in the in-plane direction. Tip: for accurate positioning of a data
element, use the Position option from the right-hand mouse button pop-up menu
in the tree, or press the position icon.
When this icon is visible, you are in View mode. In this mode, you can rotate,
pan and zoom using Left-Mouse operations.
View Mode is a relic from the past. It is expected to disappear in future releases
because, as of v6, rotate, pan and zoom are available in all modes (Position,
View, Interpretation) as Middle-Mouse operations; see: Mouse Controls.
Resets the view to the position that was saved when you pressed the "save
home position" icon, described below.
Saves the current view as the home position that can be recalled with the
previous icon.
Allows you to reset the view so that all data are visible.
Use this icon to set the scene orientation to a particular 'standard' direction.
Options include 'View Inline', 'View Crossline', 'View Z', 'View North' and 'View
North Z'.
The display rotation axis is used to show/hide the N-E-Z (North-East-Z)
orientation arrows.
Opens the snapshot window, so that the user can grab pictures of the scene,
window, and/or the desktop.
Sets the selection to polygon or rectangular mode and allows the user to select
an area (or elements within an area, e.g. fault sticks).
Displays selected element only. When more than one element is displayed in
the tree, one can quickly view a single element and toggle between elements by
clicking the different elements in the tree.
2.2.3.1 Take Snapshots
It is possible to take different kinds of snapshots in OpendTect. Three options
are available: Scene, Window, and Desktop.
The Scene option allows the user to grab the displayed (selected) scene within
OpendTect.The OSG (OpenSceneGraph) 3D library is used, allowing the output
picture to have a higher resolution than the screen resolution. Every element dis-
played in the scene will be in the output picture, including the annotations (color
bar, orientation etc) if displayed. Any overlapping windows will be neglected.
You can change the image properties e.g. height, width, resolution etc. If you save
the settings, they will appear by default in all sessions next time you grab new
snapshots. The 'Screen' parameters correspond to the parameters of the picture as
displayed on your screen.
The Window option grabs the whole window, including the sidebars. It can be
either the main window or any 2D viewer opened when the snapshot button was
pressed.
The Desktop option is similar and will snap the entire user desktop. Both options
use the Qt library for grabbing the picture. As a result, the output is limited to the
actual screen resolution, and overlapping windows will appear on the snapshot.
2.2.3.2 Directional Lighting
The directional lighting feature is used to illuminate the objects (displayed data)
at a specific inclination (or dip angle) and azimuth. The feature controls the main
headlight i.e. the intensity of the camera light and the intensity of the directional
light. The dialog is launched by clicking the icon shown above.
Directional light Dialog
The directional light dialog updates the scene instantly to reflect the changes
made to the properties. If the OK button is clicked, the changes are retained,
whereas the Cancel button discards all changes.
Apply light to: The directional lighting is independent for each scene, i.e. the
selected scene will be illuminated. However, selecting the option All in the
drop-down list will illuminate all scenes that are currently open.
Camera mounted light: Use this slider to change the percentage of intensity of
the camera light or the head light. 0% corresponds to total darkness while 100%
corresponds to full intensity. Similar to the directional light dialog, the changes
made to the azimuth and dip are instantly reflected in the selected scene(s).
Intensity: Sets the percentage of the intensity of the additional directional light. 0%
corresponds to total darkness while 100% corresponds to full intensity.
Dip: This slider is used to set the dip value (in degrees) of the directional light. The
directional dip is limited from 0 to 90 degrees.
Azimuth: This slider is used to set the azimuth (in degrees) value of the directional
light. It can be any value from 0 to 360 degrees.
Show polar diagram: The azimuth and dip can be visualized using this diagram.
This diagram can be used in combination with the sliders of the main dialog in
order to position the directional light around the scene.
Polar diagram dialog
This is a dialog that displays the polar diagram for setting the azimuth and the dip
values of the additional directional light. The location of the pointer (the red dot)
determines the properties of the directional light. The pointer can be moved around
by using the mouse within the polar diagram. The azimuth value can be read off
the circumference of the outermost circle, while the dip value is given by the
location of the pointer along the radius of the circle.
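For illustration, an azimuth/dip pair can be thought of as a unit direction vector. The sketch below uses an assumed convention (azimuth measured clockwise from North, dip measured down from horizontal) that may differ from OpendTect's internal one:

```python
import math

def light_direction(azimuth_deg, dip_deg):
    # Convert an azimuth (0-360 deg) / dip (0-90 deg) pair into a
    # unit vector with North, East and Down components.
    az = math.radians(azimuth_deg)
    dip = math.radians(dip_deg)
    north = math.cos(az) * math.cos(dip)
    east = math.sin(az) * math.cos(dip)
    down = math.sin(dip)
    return north, east, down

n, e, d = light_direction(90.0, 45.0)  # due East, dipping 45 degrees
print(round(n, 3), round(e, 3), round(d, 3))
```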
2.2.4 Slice Position Tool
The slice position toolbar is used to position a display element (inline, crossline
or timeslice) and to step through a 3D volume with a user-defined step. The slice
position and number of steps are manually entered in the fields. The forward and
backward arrows of this toolbar are used to move the selected slice in
increasing or decreasing directions. The workflow is very simple: add an inline in
a scene. By default, the slice is positioned in the middle of the survey box. Use
the arrows to move the inline position according to the given step.
A progress bar appears by default every time a user moves the display element.
This can be avoided by changing the Personal Settings (Utilities, Settings, Look
and Feel). Provided the computer memory is sufficient, it is highly recommended
to load the seismic data into memory (Survey, Preload). Progress messages are
switched off for preloaded volumes!
Keyboard shortcuts can be used to ease the sliding (see Keyboard Shortcuts).
The step used will be the one specified in the slice position tool.
2.3 Mouse Controls - Scenes and Graphical Interaction
3D scene interaction
There are three interaction modes for 3D scenes, each with its own cursor:

- Position mode for positioning and moving display elements (arrow cursor)
- View mode for zoom, rotate and pan (hand cursor)
- Interpretation mode for picking and editing data points (cross cursor)
Zooming and rotating views is also possible using the wheels that appear in the
upper and lower left corners of the graphical area when you hover over these
corners with your cursor.
Positioning, zooming, panning, rotating and interpreting data in the 3D scene are
core interactions in OpendTect which are done with mouse clicks and drags. Some
operations require holding a Shift or Control key while clicking. OpendTect sup-
ports a range of short-keys to speed up various interactions. The default settings of
the most important short-keys and mouse-controls are given in the tables below:
These tables assume a mouse with left and right buttons and a wheel in the
middle. Touch-pads and other input devices will support similar functionality, but
the actions might be mapped differently. Mouse controls and a number of
short-keys can be modified from the Utilities menu (Settings -> Look and Feel).
2.4 Color Tables
A colortable is a predefined group of color settings that can readily be applied to
any attribute. This group includes items such as the primary colorbar, undefined
color settings, color segmentation, and opacity. Changes made to the colortable
are applied universally to any item that uses that colorbar.
The colorbar is composed of four elements: the color display itself, the minimum
and the maximum value of the range for the colortable (as defined for the
currently selected item), and a set of colortables.
This drop-down list of predefined colortables appears when the user clicks on
the name of the colortable being used in the colorbar (e.g. Channels). If no item
is selected in the tree, the colorbar will not show any value/range, although it can
be manipulated. The colortable is manipulated by right-clicking on the colorbar.
The pop-up sub-menu contains several manipulation options. These are
described in the following:
Flip causes the scale to be flipped. (The color assigned to the high value, now
becomes the color assigned to the low value, etc.)
Ranges/Clipping allows the user to change the range of the color scale, or clip a
certain percentage of the scale. Please be aware that, to limit display time, only
2000 randomly sampled points are used by default to compute the clip values.
The clip values thus change from one data set to another. (An alternative method
for clipping is described in the Inline, Crossline & Z-slice sub-chapter.)
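The sampling-based clipping described above can be sketched as follows (an illustrative NumPy sketch; the data, percentages and seed are invented, and the exact scheme OpendTect uses may differ):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(0.0, 1000.0, size=1_000_000)  # stand-in for an attribute volume

# Estimate clip values from 2000 random samples instead of the full
# dataset; a different random subset gives slightly different clips.
sample = rng.choice(data, size=2000, replace=False)
lo, hi = np.percentile(sample, [2.5, 97.5])     # clip 5% in total
clipped = np.clip(data, lo, hi)

print(lo < hi, clipped.min() >= lo, clipped.max() <= hi)
```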
Set as default sets the current color settings as the default color scheme for all ele-
ments in the tree.
Manage is used to modify the current colortable and to create new colortables
with the current one as a starting point. Colortables are modified by adding,
removing, or changing colors, varying opacity, and defining the colorbar to be
gradational or segmented. The effect of the changes on your displayed element
can be seen directly. Colortables can be removed from the list by pressing the
Remove button. (OpendTect default colortables cannot be removed.) Moreover,
the user can import user-defined colortables by pressing the Import button.
The Colortable Manager window opens when the user selects the Manage option
described above.
A marker is the color you see in the colorbar. The black lines, in the white field
above the colorbar, are the marker boundaries. The marker boundaries are
where the settings for the markers are defined. Right-clicking on a marker
boundary shows the following options: Remove color, Change color, and Edit
Markers.
Import colortable file: The colortables can also be imported by pressing the From
other user... button. The default colortables are stored in a file (ColoTabs) that is
located in the OpendTect installation directory (/root/opendtect-4.6/data/).
Moreover, the colortables saved by a user are stored in a file
(settings_coltabs.user) that is located in the user home directory ($HOME/.od/),
where user is the OpendTect username. These files can be modified or imported
using the import color table window (see below).
Marker color brings up a standard color definition window, where this defined
color can be changed.
Edit Markers opens the Manage Marker window that displays all markers: marker
ID, position, and current color. The marker's position, in relation to low and high
values, can be specified by number. The standard color definition window can
be opened from here too, by double-clicking the marker color.
Opacity: A thin red line, capped on each end by small red nodes, is visible at the
bottom of the histogram located in the top panel. By moving these nodes, or
adding additional nodes, the user can vary the opacity of the colors below. One
can add opacity nodes by double-clicking in this area. These opacity nodes can
be dragged up or down to increase or decrease, respectively, the transparency
of the color directly below them in the colorbar. A hatched area (visible in the
color toolbar in the main window of OpendTect) indicates the part of the color bar
that will display with some level of transparency. The darker the color of the
hatch marks, the higher the level of transparency.
Transparency performance depends on the graphics card. When displaying two
elements in exactly the same position, transparency may not work as you expect. It
may help to set transparency values to the maximum to get the sort of display you
desire. In addition, it may help to change the transparency of the element as a
whole by right-clicking the element in the tree, and selecting Properties.
In the background of the opacity panel, a histogram is shown in light grey. This
histogram shows the distribution of attribute values in the selected element. This
helps you to tune the colorbar to the value range you may want to highlight. To
alter the histogram, see Show Histogram in the Inline, Crossline & Z-slice
sub-chapter.
Segmentation allows the user to segment the colorbar into a user-defined discrete
number of colors. This can be done in a Fixed or Variable manner. Fixed allows
the user to define the number of segments they would like to have, but does not
allow the marker boundaries to be moved. Variable allows the user to both define
the number of segments, and move the marker boundaries to suit specific needs.
Fixed is good for purposes such as velocity and contour lines, while Variable is
good for use with waveform segmentation.
Undefined color specifies the color that will be used to display undefined values
in the data.
2.5 General Selection Window
Toggles the object to Read only. Use this option to protect any object from
overwriting.

Turns the selected object into the default object.
2.6 2D Viewer
The 2D Viewer is a flat viewer for visualizing and interpreting 2D and 3D seismic
data. When zoomed in, use middle-mouse click and drag to pan through the
entire section. 2D viewers can be launched in different ways:

- From the main OpendTect window using the View menu, option 2D Viewer.
- From the variable density and wiggle display icons that appear when the display element is selected in the 3D window.
- From the Basemap utility (only if you have access to OpendTect Pro).

Tree. Each 2D viewer features a tree to populate and manipulate the graphics
window. Right-click on an object to open the corresponding menu. The 2D
Viewer supports two modes of visualization: variable density and wiggles +
variable area. By default the same attribute is used in both modes, but the user is
free to select different attributes per mode (Select attribute menu).
2D Graphics area. For 2D seismic lines you can open a crossing 2D line using
right-click on the annotated cross-line. When the 2D viewer displays an inline,
you can open a crossline or a time-slice at any position using right-click. It is
also possible to change the display properties by right-clicking in the graphics
area, or alternatively via the corresponding icon. For details, see Display
Parameters below.

Position and Color toolbars. To interpret every Nth line, set the step to N and use
the arrows (or keyboard short-keys) to jump to the next line. For details see Slice
Position Tool and Graphics Toolbar.
Display toolbar.
Rubberband zoom.
Zoom in with a fixed step (same zoom for horizontal and vertical).
Zoom out with a fixed step (same zoom for horizontal and vertical).
Zoom in vertically with a fixed step (horizontal zoom does not change).
Zoom out vertically with a fixed step (horizontal zoom does not change).
Sets the home zoom level. There are three different home zoom levels:
Grab the image and save as a picture.
Launch Help.
Please note that under the 'Annotation' tab, the user may choose between
different values to be displayed on the axes:

- Trace number (only for 2D): displays the trace number on the X axis
- Reference position*: displays the SP (shot point) number that is available in the shot point header field of the original SEG-Y
- X-Coordinate: shows the X coordinate on the axis
- Y-Coordinate: displays the Y coordinate on the axis
Horizon tracking toolbar.
Interpretation mode toggle. If the toggle is on you can add seed positions to the
horizon you are tracking. In this mode the cursor is a cross. When you toggle this
mode off you are back in Position mode. The cursor is an arrow.
Polygon selection tool. Use this to select areas that need to be removed.
Alternatively, when you are in manual draw mode, use Control - Left Click +
Drag to remove parts of a horizon.
2.6.1 Pre-Stack 2D Viewer
In this window, one or more prestack datasets can be viewed simultaneously.
You can set the gathers to be displayed from this window. The top part is used to set a grid of regular positions from the provided ranges. Keep in mind that you can open another 2D prestack viewer if you wish to have data from several inlines. From this regular grid, you set the positions where gathers should be displayed. You can also manually change a crossline number. Press Apply to reload the view.
Multiple gathers can also be added together in the 2D panels by pressing this
icon.
Display of angle gathers
In this window you can set the parameters for creating the angle gather that corresponds to the prestack datastore. For each sample of the input prestack seismic gather, the incidence angle in degrees is computed and color-coded with the rainbow colorbar. The seismic data then switches automatically to wiggle display.
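As a rough sketch of what such an angle computation involves, a straight-ray approximation relates offset and depth to the incidence angle. This is illustrative only: the function name is invented, and OpendTect's actual ray tracing supports proper velocity models rather than the constant-velocity assumption made here.

```python
import numpy as np

def incidence_angle_deg(offset, depth):
    """Straight-ray approximation of the incidence angle (in degrees)
    at a reflector at `depth` for a given source-receiver `offset`.
    The reflection point is assumed midway, so the ray covers a
    horizontal distance of offset/2 over the vertical distance `depth`."""
    return np.degrees(np.arctan2(offset / 2.0, depth))

# One angle panel: rows are depth samples, columns are offsets
offsets = np.arange(100.0, 3100.0, 200.0)  # m, 15 offsets
depths = np.arange(50.0, 2050.0, 50.0)     # m, 40 samples
angles = incidence_angle_deg(offsets[None, :], depths[:, None])
```

Angles increase with offset and decrease with depth, which is why the color-coded angle overlay fans out towards larger offsets.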
Example of prestack gathers (wiggle) with a mute function, and the corresponding angle gather in the background:
3 OpendTect Trees and Elements
Each scene has a corresponding detachable tree to control the elements to be displayed. A tree consists of several elements of different types, which are described in detail below. Tree elements containing elements one level down can be expanded by clicking on the markers (+ or -) left of each main element. The order in which items appear in the tree can be changed by selecting the item that you want to move, then pressing and holding Shift while pressing the up or down arrow keys on your keyboard.
The tree is a utility window that can be moved outside of the main OpendTect window. To do so, first click once on the small squares at the top-right corner of the tree scene, then drag it by its title bar (left-click) to place it anywhere. Some operating systems hide utility windows when the main window is inactive.
A common issue with certain Linux distributions is that decoupling the tree causes it to disappear when using, for example, the attribute engine or when the progress bar is showing. To resolve this, please do the following:
l Make sure the option 'Hide utility windows for inactive applications' is not checked.
l Click on the Apply button.
3.1 Scene
In OpendTect, a Scene is a main working window associated with a tree. In a scene, it is possible to work in several separate domains: Time, Depth, Flattened and Wheeler. A time/depth-domain scene can be inserted via the Scenes menu. Each scene has its own tree elements, so that the elements and scene settings can be modified per scene.
In this window, the following options can be set for the specific scene:
l Survey box: If checked, a 3D survey box will be shown in this specific scene.
l Annotation text: If checked, the survey box annotations (inline, crossline, TWT) will be displayed in the scene.
l Annotation scale: If checked, the numeric values of inlines, crosslines and Z-values will be displayed.
l Background Color: The user can specify his/her own background color for a particular scene. By default it is black.
l Mouse Marker Size: This option is used to increase the size of the Mouse Marker (a marker pointing at the mouse location across windows) in a multi-scene view.
l Mouse Marker Color: The color of the Mouse Marker can be changed from here. The default color is white.
l Annotation color: The color of the annotations can be changed from here. The default color is white.
l Line/Surface separation: The user can change the setting between Line and Surface separation as shown below.
This option allows you to display images at the top and base of the survey area, with or without transparency. The picture will be stretched to fit the TopLeft and BottomRight corners of the survey (in a north view). The proposed coordinates are the survey ranges.
This feature is especially useful for adding a reference map of the survey, to better understand the geographical position of a seismic profile.
You can use the Google export tool to convert the survey boundaries to latitude and longitude. These can then be imported into a mapping program (like Google Earth) in order to take the appropriate screenshot.
The Scene Color Bar Settings can be used to set the position and size of the col-
orbar in the scene:
The Export scene option is intended for debugging purposes only. It dumps the scene information into a .osg file (OpenSceneGraph), which can be sent to OpendTect for bug analysis.
3.2 Inline, Crossline & Z-Slice
Inline, Crossline and Z-slice elements can be added to the tree by clicking on the element name and selecting Add/Add default data/Add color blended from the pop-up menu. The Add option will insert an inline/crossline/Z-slice with a blank line in the middle of the survey (ready for an attribute or stored cube to be loaded). If a well has been loaded in the project and is within the survey box, it is possible to use Add at Well location to add an inline or crossline. The inline/crossline loaded corresponds to the surface coordinates found in the summary section of the well manager information. Once added, a selection window will pop up to select the attribute to display: Stored volume, Attribute from the active attribute set,... If Cancel is clicked, the attribute line will stay blank in the scene and the tree entry will read <right click>.
Prestack stored cubes will appear in the list surrounded by curly brackets {}. They can be displayed on the slices of the 3D scene as common-offset volumes, in a similar way to the components of multi-component volumes. The prefix "O=" is then shown with the offset value in XY units (meters or feet). Attributes may be computed on common-offset volumes, as for components of multi-component volumes. Any desired cube can be selected from the list. The selected cube will be added as an attribute for the displayed inline/crossline/Z-slice number. The added attribute can also be replaced at any time by right-clicking on it (see the figure below).
The checkboxes are used to show/hide the corresponding sub-element of the inline/crossline/Z-slice.
Add
Display
The green lines do not appear on histograms displayed for single attributes.
Positions: Change an inline/crossline/Z-slice number. This option is used to manip-
ulate (sub-select a range of traces/time) a line or to quickly scroll through the data
for visualization.
(L to R) Manipulate or scroll the inline; Manual Scroll and Auto Scroll
By pressing the Scroll button, elements are moved either manually (select Control Manual), or automatically (select Control Auto). Scroll in the inline/crossline direction by specifying a fixed Scroll step. In manual mode, the line/Z-slice is stepped to the new position after each subsequent click on the Advance button. In automatic mode, the line/Z-slice is updated movie-style with a fixed time interval (in seconds) - Time between updates. The auto-scrolling can be paused by pressing the Pause button. To resume the auto-scrolling, press the Go button.
Gridlines: Enables displaying grid lines on the particular element. A new menu
appears where the grid line spacing, style, color and width can be set.
Resolution: Edit the graphical resolution of the element. Default does not involve any rescaling before the data is sent to the graphics card. The options Moderate and High do some pre-interpolation before the data is sent to the graphics card and generally result in a cleaner picture. If the memory of your graphics card does not allow high resolution, the element becomes black.
If Shading is on, the resolution option is no longer available (except for the horizon element).
Reset Manipulation: Reset changes made to the position of the line/Z-slice and restore the original configuration.
Note: It is also possible to display the offset of each CDP gather similarly to any poststack data. The prestack data is available in the list of stored cubes, marked with curly brackets { }, at the end of the list.
Correlate with wells: This will correlate the line with well data.
Lock: Lock the selected object. Prevents accidental removing, moving or dis-
playing data on the object. After clicking Unlock, all editing is again enabled.
Remove: This removes the element from the OpendTect tree and the graphics
area.
The options available for attribute pop-up menu list are briefly described here:
Select Attribute: When selected, data can be displayed from stored cubes or an
attribute from the current attribute set (if available). To display an attribute, select or
create an attribute set first.
Save Colour Settings: Save color settings for a specific stored volume and make
them available for future use.
Move: Move the attribute up, down, to top of the list, or to bottom of the list.
Display
Show Histogram: Display data statistics (selected attribute) of the defined volume
as a histogram in a pop up window.
Show Amplitude Spectrum: An amplitude vs frequency plot will be shown in a pop-up window. Moving the mouse over the spectrum displays the values.
Show F-K Spectrum: A two-dimensional Fourier transform over time and space, where F is the frequency (Fourier transform over time) and K refers to the wavenumber (Fourier transform over space).
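Conceptually, the F-K spectrum is the amplitude of a 2D FFT of the seismic section. The sketch below shows the idea with NumPy; the function name and the synthetic test signal are illustrative, not OpendTect code.

```python
import numpy as np

def fk_spectrum(section, dt, dx):
    """Amplitude of the 2D Fourier transform of a seismic section.
    section : 2D array of shape (nt, nx), time along axis 0
    dt      : time sample interval in seconds
    dx      : trace spacing in meters
    Returns (freqs, wavenumbers, amplitude), with zero frequency and
    zero wavenumber shifted to the centre of the spectrum."""
    nt, nx = section.shape
    amplitude = np.abs(np.fft.fftshift(np.fft.fft2(section)))
    freqs = np.fft.fftshift(np.fft.fftfreq(nt, d=dt))        # F axis, Hz
    wavenumbers = np.fft.fftshift(np.fft.fftfreq(nx, d=dx))  # K axis, 1/m
    return freqs, wavenumbers, amplitude

# A flat (zero-dip) 25 Hz event maps to a point at K = 0 in the F-K plane
t = np.arange(256) * 0.004                 # 4 ms sampling
x = np.arange(64) * 25.0                   # 25 m trace spacing
section = np.sin(2 * np.pi * 25.0 * t)[:, None] * np.ones((1, x.size))
freqs, wavenumbers, amp = fk_spectrum(section, dt=0.004, dx=25.0)
```

Dipping events plot along lines through the origin whose slope relates to the apparent velocity, which is what makes the F-K display useful for recognizing dip and noise trends.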
Pressing Ctrl+P in either the Histogram, Amplitude- or F-K Spectrum windows
pops up a settings window where you may define parameters for a snapshot:
l Change transparency: Change the transparency of the attribute item to view one or
more overlaying attributes simultaneously.
l 2D Viewer - VD / Wiggles: Display an attribute in the 2D viewer as "Wiggle" or "VD"
(Variable Density). For more details, please refer to: 2D viewer
3.3 Volume
A volume can be added by clicking on the Volume element in the tree and selecting the Add option. A small volume box with a blank attribute is added to the scene. An attribute in the newly inserted volume can be displayed by right-clicking on the volume and selecting the Select Attribute option. This works similarly to inline/crossline/Z-slice.
You can display either stored volumes or attributes calculated within the sub-volume. For faster response times, pre-load the data you wish to visualize with this tree element.
The pop-up menu for the Volume element resembles that described in the previous
section for inline/crossline/Z-slice:
Display
Properties: Change display parameters such as transparency and ambient reflectivity:
Add isosurface: Compute arbitrary iso-value surfaces and convert them into bodies.
If the option Add > Iso surface is selected, the following window pops up with
choice of modes:
The window displays the histogram of the data collected within the loaded volume
(left), or from seeds only that are stored in a pointset (right). "Update" will update
the display in the 3D scene (requires some computation time) while leaving the
window open. OK will accept the currently selected (or updated) value and dismiss
the selection window.
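Picking a value on this histogram is essentially choosing an amplitude threshold. A minimal sketch of the idea (the function names and the quantile-based choice are illustrative only, not how OpendTect stores the selection):

```python
import numpy as np

def isovalue_from_quantile(volume, fraction=0.95):
    """Pick an iso value so that `fraction` of the samples fall below it;
    the resulting surface then encloses the strongest 1 - `fraction`
    part of the volume."""
    return float(np.quantile(volume, fraction))

def voxels_inside(volume, iso):
    """Count the voxels that would end up inside the iso-surface body."""
    return int(np.count_nonzero(volume >= iso))

vol = np.random.default_rng(1).normal(size=(32, 32, 32))  # stand-in volume
iso = isovalue_from_quantile(vol, fraction=0.99)
```

Moving the green line on the histogram corresponds to changing `fraction`: a higher value yields a smaller, tighter body around the amplitude extremes.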
Right-click to display the iso surface menu and convert the iso surface into a stored
body that in turn can be retrieved:
In Interact mode, (see Mouse Controls) the cursor will return the position (inline,
cross-line and X,Y,Z) and the data value at that position in the horizontal status bar
of the OpendTect window.
3.4 Random Line
If you click on the Random line in the tree, four options will be available: Add
Empty, Add Stored, Add Color blended and New.
Add Empty: Right-click on Random line and select Add Empty. The new line will be added as a sub-element of the random line. By default, this is the centre inline of the cube. To give the random line a new, arbitrary direction, the user can edit or insert nodes:
With multiple nodes, the random line can also consist of multiple flat sections. The sections of one single random line may intersect one another. In interact mode, the little plane of a node can be used to drag the node laterally, and the vertical tube can be used to shift the edge of the random line vertically. Nodes can be added from the pop-up menu by right-clicking on the random line in Interact mode.
Add Stored: Select from a list of (previously stored) random lines to display it in
the scene.
Add Color blended: A color blended Random Line may be added. This may be
either a color blended version of a previously-stored random line, or an 'Empty'
color blended random line:
RGB(A*) color-blended attribute display is used to create a normalized color-blended display that often shows features with greater clarity and enhances detail in map view. Though it is traditionally used to blend iso-frequency responses (Spectral Decomposition), RGB(A) can also be used to blend three or four different attributes that span a comparable spectrum. For instance, spectral decomposition outputs the amplitude at discrete frequencies, so it renders the same kind of output (unit = amplitude). Depending upon the geological setting or the objective, FFT with a short window or CWT (continuous wavelet transform) can be chosen.
* Once you have your inputs selected for the appropriate color attributes, it is also possible to add a fourth attribute (the 'A' or 'Alpha' channel) to highlight structural features such as faults/fractures (i.e. add a similarity attribute).
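The per-channel normalization that makes such a blend comparable can be sketched as follows. This is a minimal NumPy illustration under the assumption of simple min-max scaling; the function name is invented and the actual normalization in OpendTect may differ.

```python
import numpy as np

def rgba_blend(red, green, blue, alpha=None):
    """Normalize each attribute map to [0, 1] and stack them as RGBA.
    red/green/blue are typically iso-frequency responses; alpha could
    be a similarity attribute that darkens faults. All inputs must be
    2D arrays of identical shape."""
    def norm(a):
        a = np.asarray(a, dtype=float)
        lo, hi = np.nanmin(a), np.nanmax(a)
        return (a - lo) / (hi - lo) if hi > lo else np.zeros_like(a)

    channels = [norm(red), norm(green), norm(blue)]
    channels.append(np.ones_like(channels[0]) if alpha is None else norm(alpha))
    return np.stack(channels, axis=-1)

rng = np.random.default_rng(0)
img = rgba_blend(rng.random((10, 10)), rng.random((10, 10)), rng.random((10, 10)))
```

Because every channel ends up on the same [0, 1] scale, no single attribute dominates the blend purely through its amplitude range.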
New: There are several ways to create a new random line:
l Interactive: When creating a random line from interactive mode, a horizon or Z slice
must be loaded in the scene first, a random line can then be created by picking nodes
on the displayed horizon/Z-slice.
l Along Contours: Create random lines between specified contour ranges. Note that an interpreted horizon grid will be required to provide the contours.
l From Existing: Generate random line(s) from existing random line(s). There is an option to generate a random line at some distance away from the existing geometry and store it as a new random line geometry.
l From Polygon: Create a random line from a saved polygon.
l From Table: Create random line from table. The input will be X/Y coordinates,
Inline/Crossline and Z ranges.
l Create From Wells: Connect several wells by a random line. The line follows the deviated well paths (optionally). By right-clicking on Random line in the tree and selecting Create from wells, a dialog box appears with a list of wells that can be selected in order to set up the random line path.
When right-clicking on the newly created random line, the following options are
available in a pop-up menu:
Add
l Add attribute: When selected, choose to display data from stored cubes, from an attrib-
ute from the current attribute set or from an output node of the current neural network.
To display an attribute or neural network, select or create an attribute set or neural net-
work first.
l Add Volume processing attribute: Display a volume created with the volume builder.
l Add HorizonCube display: Display a stored HorizonCube.
l Add System tracts display: This option will add a systems tract interpretation.
Display:
l Histogram: Displays histograms for the random line. If more than one attribute is displayed, it will show the histograms of each in a pop-up view.
l Resolution: Choose the resolution between standard/higher/highest.
l Position: Used to manipulate the nodes/position of a random line. To read more, please go to the Manual mode sub-section of this chapter.
l Insert node: Insert a new node before the selected node.
l Properties: This option refers to display parameters such as Ambient reflectivity, Dif-
fuse reflectivity, Transparency.
Duplicate: Duplicate the line as an empty element in the tree. This option allows displaying different attributes on the duplicated line whilst keeping the original data.
Reset Manipulation: This will reset any change in the position of the random line
(or its nodes) that you have applied and it will set line to its original position. This
option is only available if changes have been made to the position of the element.
Create 2D Grid: The random lines (with two nodes only) can be used to create a
2D grid with a fixed grid spacing. When selected, the Create 2D Grid window is
launched (see below). Here, specify the input 3D seismic volume and the output
data set name. The output grid is generated according to the dip (parallel) and strike (perpendicular) directions of the selected random line. The prefix labels are used as prefixes for the output line names, stored in the specified new data set. The grid spacing is the constant spacing between two adjacent lines. At the bottom, the total numbers of parallel and perpendicular lines are updated according to the grid spacing. By pressing OK, a batch process will start to generate the 2D grid. When the batch program has finished, the data can be displayed in the scene (see the 2D Seismic section for details).
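The layout of such a grid follows directly from the two nodes: dip lines are parallel offsets of the node pair, and strike lines run along the perpendicular. A geometric sketch follows; the function name, the centering of the parallel lines and the extent of the perpendicular lines are assumptions for illustration, not OpendTect's exact scheme.

```python
import numpy as np

def grid_from_random_line(p0, p1, spacing, n_parallel, n_perp):
    """Lay out a 2D grid from a two-node random line.
    p0, p1  : (x, y) coordinates of the two nodes
    spacing : constant distance between adjacent lines
    Returns two lists of (start, end) segments: dip-parallel lines
    and strike-perpendicular lines."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    direction = (p1 - p0) / np.linalg.norm(p1 - p0)   # dip (parallel) direction
    normal = np.array([-direction[1], direction[0]])  # strike (perpendicular)

    half = (n_parallel // 2) * spacing
    parallels = [(p0 + i * spacing * normal, p1 + i * spacing * normal)
                 for i in range(-(n_parallel // 2), n_parallel // 2 + 1)]
    perpendiculars = [(p0 + j * spacing * direction - half * normal,
                       p0 + j * spacing * direction + half * normal)
                      for j in range(n_perp)]
    return parallels, perpendiculars
```

Decreasing the spacing increases the line counts shown at the bottom of the dialog, since more lines fit within the same footprint.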
Save As: Save the random line under a new name or overwrite the existing one.
Save As 2D: Creates a 2D line from a random line. Right-click on the random line in the tree and select Save As 2D. A window will pop up, as shown below. Select the input cube, the output line and the line name. The first trace number of the line is also required.
The survey type should be 2D as well if you want to view the 2D line created from a random line.
Correlate with wells: This option is used to correlate a random line with wells.
Well - seismic correlation is normally done in the Well Correlation Plugin (WCP),
which requires a commercial license.
Lock: Locks the selected object. This will prevent accidental removing, moving or
displaying data on the object. After clicking lock again, editing is again enabled.
Export to Google KML: Export the selected random line to a Google KML file. Specify the KML file parameters in the pop-up dialog.
Annotate the start and end of the random line with a user defined line annotation in
the output file settings.
Remove: Remove the random line from the tree and the scene. Do ensure to first Save (any changes to) the random line before removing it.
The options available for attribute pop-up menu list are briefly described here:
Select Attribute: When selected, data can be displayed from stored cubes or an
attribute from the current attribute set (if available). To display an attribute, select or
create an attribute set first.
Save Colour Settings: Save color settings for a specific stored volume and make
them available for later use.
Move: Move the attribute up, down, to top of the list, or to bottom of the list.
Display: There are several display settings / features that are briefly explained
below:
l Show Histogram: Display data statistics (selected attribute) of the random line as a histogram in a pop-up window.
l Show Amplitude Spectrum: Amplitude vs frequency plot will be shown in pop up win-
dow.
l Change transparency: Change the transparency of the attribute item to view one or
more overlaying attributes simultaneously.
l 2D Viewer - VD / Wiggles: Display the selected attribute in the 2D viewer as "Wiggle"
or "VD" (Variable Density). For more details, please refer to: 2D viewer
3.4.1 Manual Mode (Empty)
Manual Mode. In manual mode, the random line will first be displayed in the 3D scene. Nodes may be added and their positions changed interactively in a second step. This starting random line will have two nodes, one at each end of the central inline. More nodes can also be inserted via the right-click menu of the random line in the tree (see figure below). Please note that the same menu is available by right-clicking on the random line in the scene.
The node on the left-hand side of the newly created random line is designated node 0, and the one on the right-hand side node 1. It is possible to insert a node before node 0, before node 1, and after node 1. The node will be created half-way between the two surrounding nodes. In order to move a node to a desired position, click on the random line to make the nodes visible/editable. In interact mode, click on the node plane (horizontal/vertical) to move the node location. A purple surface appears around the node and the node can be moved in any direction inside the survey area.
The node can be moved in two directions (horizontal and vertical). The node's ori-
entation can be changed by placing the mouse pointer over the node and pressing
the Ctrl key.
Editing or modifying the position of the nodes is also possible by clicking the option Edit nodes.... The following window will pop up with the nodes editable; modifying or inserting new nodes is also enabled. In this table, each node is defined by its inline/crossline or X/Y position. Nodes can also be removed by right-clicking on the desired cell and selecting the 'remove node' option. Similarly, from the pop-up menu, more nodes can be inserted before/after the selected cell (node).
3.4.2 Create from Existing
This option allows the user to generate a random line offset from an existing random line: a new line at some distance away from the existing geometry, stored as a new random line geometry.
Create Random line from existing line geometry in left/right or both directions. The
direction is defined by the path described by the nodes, in the order seen in the
table.
The first generation parameter is the input random line, to be chosen among the already existing random lines. Then define the distance from the input in meters and the direction in which the new line will be added. There are three directions: left, right, and both. The final step is to name the output random line.
Check the Display Random line on creation box to immediately visualize the random line.
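What this option does geometrically is a sideways (parallel) offset of the node path. A sketch of that operation follows; the function name and the corner handling (averaged segment normals) are illustrative assumptions, not OpendTect's actual implementation.

```python
import numpy as np

def offset_polyline(nodes, distance, direction="right"):
    """Shift a random-line node path sideways by `distance`.
    'right'/'left' are taken relative to the order of the nodes, as in
    the generation dialog; each node moves along the averaged normal of
    its adjacent segments."""
    nodes = np.asarray(nodes, dtype=float)
    seg = np.diff(nodes, axis=0)
    seg /= np.linalg.norm(seg, axis=1, keepdims=True)
    # Right-hand normal of each segment (rotate direction by -90 degrees)
    normals = np.column_stack([seg[:, 1], -seg[:, 0]])
    # Average the normals of the segments meeting at each interior node
    node_normals = np.vstack([normals[:1],
                              (normals[:-1] + normals[1:]) / 2.0,
                              normals[-1:]])
    node_normals /= np.linalg.norm(node_normals, axis=1, keepdims=True)
    sign = 1.0 if direction == "right" else -1.0
    return nodes + sign * distance * node_normals
```

The 'both' choice in the dialog would amount to applying this twice, once per direction, producing two offset lines.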
3.4.3 Create from Polygons
This option allows the user to create a random line definition from an already created polygon. In the parameters, select the existing polygon and sub-select the Z-range for the random line to be generated. Write an output name for this random line and, optionally, check Display Random Line on creation so that it will be displayed in the scene/tree after creation. Press OK to proceed.
An example random line generated along the white polygon. The polygon approximates the closure of a gas anomaly.
3.4.4 Create From Wells
A random line can be created in such a way that it follows the well paths. By right-clicking on Random line in the tree, and selecting Generate > From Wells ..., a dialog box appears with a list of wells that can be selected in order to set up the random line path.
Use the arrows to add and/or remove wells. Use the second set of arrows to set up a well sequence. Specify whether you want to use only the well top position or not. When you use all well points, you can specify the order by clicking the Change Order arrows.
The Extend outward option allows extending the random line on both sides, away from the wells.
Press the preview button to see a top view of the random line that will be created. If the preview does not show exactly the desired random line, change the parameters (the wells involved or the order in which they are listed). You can save the newly created random line by specifying a name in the Output Random line(s) field. If you want to display the random line on creation, check the box Display Random Line on creation.
The following picture is an example of a random line created from wells.
In this picture, a random line goes through four wells following a random path
between these wells (which are used as constraints).
3.4.5 Create From Table
This is launched from: Random line > right-click > New > From Table.
This allows the user to create a random line from a table. The input is either X/Y coordinates or inlines/crosslines, plus a Z range.
3.4.6 Interactive Mode
This option is launched by right-clicking on Random line > New > Interactive...
This allows the user to create a random line interactively. A horizon or Z-slice is first loaded in the scene; a random line can then be created by picking nodes.
The user can now pick nodes on Z-slices or horizons, as shown below:
An attribute can then be displayed:
3.5 2D Seismic
By clicking on the 2D Line entry in the tree, it is possible to add 2D seismic lines, create a 2D grid from 3D data, create new lines from 3D data, or generate a 3D cube from a 2D data set (see picture below).
Choose how you would like the 2D line(s) to appear in the scene from the three options shown below when clicking Ok:
l Display projection lines only: shows only the position of the 2D line(s) at the top of
the survey.
l Load default data: the 2D line(s) are loaded into the 3D scene and displayed with
the data selected as default in the 'Manage 2D Seismics' window.
l Select Attribute: loads the projection lines into the scene and brings up the 'Select
Dataset' window to choose what to display: stored, steering (if present) or attributes
from the active attribute set (if present).
l Create 2D Grid from 3D...: This option can be used to create a 2D grid with a fixed
grid spacing. When selected, the Create 2D Grid window is launched. Here, specify
the input 3D seismic volume and the output data set name. The output grid is gen-
erated according to the dip (parallel) and strike (perpendicular) direction of the selec-
ted volume. For more detail, go to 6.1.2.1 Create 2D Grid.
l Extract from 3D...: Extract 3D data onto selected 2D lines. Input data is required in
the form of a stored 3D volume. One or more 2D lines can be selected for the 3D data
to be extracted onto. (Note: If just one line is selected, you may also sub-select a trace
range.) The output data set needs to be given a name.
Once several lines displaying data are loaded, additional actions are available
(see picture below). Selections can be made for all displayed lines.
l Add attribute: Select an (additional) attribute to be added to the lines. (See above for
details).
l Replace attribute: Select an attribute from those displayed on the line(s). Once selec-
ted, this will launch the 'Select Dataset' window and a replacement can be chosen.
l Display attribute: Choose which of the available attributes to display.
l Remove attribute: Remove one of the loaded attributes (only available if lines have more than one attribute loaded in the tree).
l Display Attribute: Checks the selected attributes on, to display them on the lines showing in the scene.
l Hide attribute: Checks the selected attributes off, to stop displaying them on the lines showing in the scene.
l Edit Color Settings: Select an attribute and set the color bar and ranges:
l Display all/Hide All: Display or hide the line names, 2D planes and line geometry
(projection lines).
l Show/Hide all items: Shows or hides all lines in the scene and checks/unchecks the
line names in the tree.
l Remove all items: Removes all lines from the scene and the tree.
For each individual 2D line loaded in the tree, the right-clicking menu gives the fol-
lowing options:
l Add: Add either an attribute from the selection pop-up window, a HorizonCube display (if the plugin is available) or a System tracts display (if the plugin is available).
l Display: Allows the following:
l Histogram: The histograms of all attributes in the tree can be displayed using the
right-click option of the parent element (inline number, surface name...). It is a useful
tool to clip the ranges of an attribute. The vertical green lines show the current amp-
litude range and can be moved left or right using the left mouse-click and drag. The
display is updated when the mouse click is released. Please note that this will toggle
off the automatic clipping. Histograms can also be displayed for each attribute inde-
pendently.
l Properties: Use this option to set the Material, Texture and Line Style properties.
o Material: Set the base color for the projection line (2D geometry) and set the
l Duplicate: Duplicates the 2D line as displayed in the scene, including its displayed
attributes. Can be a useful option to compare colorbar settings, or be used to 'extend'
the eight-per-line attribute limit by replacing existing attributes with others on the
duplicate.
l Lock/Unlock Treeitem: Lock the selected object. It prevents accidental removing,
moving or displaying data on the object. Clicking "Unlock" enables editing again.
l Remove from Tree: remove that 2D line from the tree and thus from the 3D scene.
3.6 Pointset & Polygon
A pointset is a set of locations. Pointsets have multiple uses in OpendTect, such as data extraction in crossplot or neural network workflows.
The drop-down menu gives the option to Add… an existing pointset or to create a New one. The new pointset can either be picked manually in the 3D scene or generated automatically.
A Polygon is a closed line defined by connected points. It can be used, for example, to define an area of sub-selection. The drop-down menu gives the option to either Add… an existing polygon or to create a New… one by manually picking on a loaded surface (either a horizon or a Z-slice) in the 3D scene.
3.6.1 Manual & Empty Pointsets
When an empty pointset is added to the scene, the locations (of an object) can be picked manually. This type of pointset is generally used for supervised neural network training (see the dGB plugins help documentation).
Start picking locations in the 3D scene on data displayed in the 3D scene (inline, crossline, Z-slice or horizon). To start picking, please make the pointset active by clicking on it in the scene or in the tree.
Example of manual picking. These points will be used in Neural Network training.
3.6.2 Generate Random Points (3D)
Randomly generated points are very useful, especially for property prediction or object detection. This type of pointset was defined in the first place for unsupervised neural network training (see the dGB plugin documentation for more details, specifically: Unsupervised waveform segmentation (UVQ)).
A user-defined number of random points is created depending on the specified location type. If selecting:
l Range: specify the inline, crossline and time range (or depth range if in a depth sur-
vey)
l Well: select one or more well(s) and specify a time range (or depth if in a depth sur-
vey). Optionally add traces in the inline and/or crossline direction around the well
track for the location selection.
l Polygon: select a stored polygon and specify the time range (or depth range if in a
depth survey).
l Table: select either an already saved pointset or a Table file. The table file needs to
be X-Y-Z with no header.
l Horizon: select a horizon, and select whether you want the points to be extracted along the selected horizon or within the interval from that horizon down to a second horizon to be selected.
l Body: select if you want the extraction to be inside or outside the body that you selec-
ted. If outside, you need to specify the inline, crossline and z ranges for the extraction.
By default the outside box is the full survey box.
The number of random locations to be extracted needs to be specified.
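For the 'Range' location type, the generation amounts to uniform sampling within the given inline, crossline and Z ranges. A sketch follows; the function and parameter names, and the integer/continuous split, are illustrative rather than OpendTect's actual code.

```python
import numpy as np

def random_points_in_range(n, inl_range, crl_range, z_range, seed=None):
    """Generate `n` random pointset locations within the given ranges.
    Inline/crossline numbers are drawn as integers, Z as a continuous
    value; returns an (n, 3) array of (inline, crossline, z) rows."""
    rng = np.random.default_rng(seed)
    inl = rng.integers(inl_range[0], inl_range[1] + 1, size=n)
    crl = rng.integers(crl_range[0], crl_range[1] + 1, size=n)
    z = rng.uniform(z_range[0], z_range[1], size=n)
    return np.column_stack([inl, crl, z])

pts = random_points_in_range(500, (100, 750), (300, 1250), (0.0, 1.848), seed=42)
```

The other location types (Well, Polygon, Horizon, Body) differ only in the region the samples are restricted to, not in the sampling idea itself.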
The name can always be changed later from the pointset manager. The
color can also be modified later on from the right-click menu on the pointset
name > Display > Properties.
3.6.3 Generate Random Points (2D)
Random points for 2D data can be used for the same purposes and workflows as random 3D picks.
The 2D pointset creation window (see below) is launched by clicking in the tree on pointset/Polygon and selecting, in the drop-down menu, New pointset > Generate 2D...
The number of points in the set needs to be specified. As this extraction is done in 2D, the 2D line(s) from which the locations will be extracted need to be selected. The locations can be restricted with the Geometry selection (see below), depending on the purpose/objective. For instance, if the objective is to detect facies by using random vectors (points) on a surface, then the horizon geometry shall be provided.
If selecting:
l Z Range: specify the inline, crossline and time range (or depth range if in a depth survey).
l On Horizon: select a horizon along which the positions will be picked.
l Between Horizons: select two horizons in between which the points will be extracted.
The name can always be changed later from the pointset manager. The
color can also be modified later on from the right-click menu on the pointset
name > Display > Properties.
3.6.4 Polygon
A polygon is a closed line connecting locations.
The name can always be changed later from the pointset/Polygon man-
ager. The color can also be modified later on from the right-click menu on
the polygon name > Display > Properties.
Double-clicking will close the polygon. Save the polygon by right-clicking on its name in the tree > Save. When the polygon is active in the tree, each click results in a new point. To remove a point, press Ctrl and click on the point to delete. Move a point by clicking on it and dragging it.
The following picture shows two examples of polygon pointsets: a closed polygon (deltaic facies belt) and a non-closed polygon (fault pointset).
3.6.5 Pop-Up Menus
pointset/Polygon Element Pop-up Menu
This is the right-click menu accessed from the main pointset/Polygon element when at least one pointset/polygon is already loaded (see pictures below).
When more than one pointset/polygon is loaded in the tree, the menu has additional entries:
l Show all items: Display all the pointsets/polygons from the tree in the 3D scene.
l Hide all items: Unselect all the pointsets/polygons from the tree. They are no longer displayed in the 3D scene.
l Remove all items: Remove all the pointsets/polygons from the tree.
If at least one pointset or polygon is loaded in the tree, then the following options
are available from the right-click menu (see picture below):
l Calculate Volume: In OpendTect, an estimated volume can be computed between a polygon and a given surface. The velocity default is set to 3000 m/s. Negative thicknesses can either be discarded or taken into account.
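The volume estimate can be sketched as follows. This assumes two-way-time differences converted with thickness = velocity × Δt / 2; the function and the bin/TWT values are illustrative, not the actual OpendTect implementation:

```python
def estimated_volume(twt_diff_s, bin_area_m2, velocity=3000.0,
                     discard_negative=True):
    """Bulk-volume estimate between a polygon and a surface: each
    two-way-time difference (s) becomes a thickness via
    thickness = velocity * dt / 2, summed over the bin area.
    The 3000 m/s default mirrors the dialog default."""
    total = 0.0
    for dt in twt_diff_s:
        thickness = velocity * dt / 2.0            # meters
        if thickness < 0 and discard_negative:
            continue                               # drop negative thickness
        total += thickness * bin_area_m2           # cubic meters
    return total

# four 25 m x 25 m bins: thicknesses 15, 30, (discarded), 45 m
v = estimated_volume([0.010, 0.020, -0.005, 0.030], bin_area_m2=625.0)
```

Whether negative thicknesses are discarded or summed can change the estimate noticeably, which is why the dialog exposes the choice.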
l Convert to body: (for pointset only) Convert the pointset into a body.
l Create Body: (for polygon only) Create a body using the polygon as a constraining area. It requires a top and bottom horizon between which the body will be created. This feature is only enabled if the SSIS plugin is loaded (or licensed).
l Close polygon: (only available when interpreting a new polygon) Closes the polygon. During, and at the end of, a picking session, pointsets should be stored.
l Display:
o Display only at sections: Display only the points intersecting the displayed elements (plane(s) and/or horizon(s)) in the 3D scene. This mode allows picking new locations without being distracted by previously picked points throughout the survey volume.
o Properties: In this window the Type, Size and Color of the point markers in the 3D scene can be set. The Arrow type is also automatically used when the points are given directional information via the Set direction option in the pointset pop-up menu.
o Set direction: (for pointset only) Display direction, guided by the SteeringCube/attribute. This helps to understand geological dips and fluid flow. A direction is assigned to each point based on dip and azimuth information (attributes). In the pop-up window (see below), specify either a SteeringCube or two attributes providing the polar dip and azimuth in degrees. A velocity of 2000 m/s will be used in time surveys to convert the dip from degrees to µs/m if the dip angle data is read from a stored cube instead of the dip angle attribute. Do not forget, after setting the directions, to save your pointset and change the display type to "Arrow".
When setting the direction for a given pointset, you can choose to get the direction from a SteeringCube or from azimuth and dip attributes (stored or calculated on the fly) (see picture below).
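The degrees-to-µs/m conversion quoted above can be reproduced as below; the factor 2 for two-way time is my reading of the conversion, not stated explicitly in the text, and the function is purely illustrative:

```python
import math

def dip_deg_to_us_per_m(dip_deg, velocity=2000.0):
    """Convert a polar dip angle (degrees) to a time dip in µs/m.
    Assumes two-way time: dt/dx = 2 * tan(dip) / velocity, scaled to
    microseconds. With the 2000 m/s default, 45 degrees -> 1000 µs/m."""
    return math.tan(math.radians(dip_deg)) * 2.0e6 / velocity
```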
l Save/Save As: Either overwrite the stored input using the Save option, or store as a new pointset/polygon using the Save As option.
l Lock / Unlock: Lock the selected object. This prevents accidentally removing or moving the object, or displaying data on it. Clicking "Unlock" enables editing again.
l Remove: Remove pointset/polygon from tree.
l Export to Google KML: Export the selected polygon to a Google KML file. When selected, the following export window is launched. Fill in the output KML parameters and select the output file location. Press the 'Ok' button to export the polygon to the selected location. The feature will prompt an additional conversion dialog if the conversion settings for the survey are not defined. For further information, please refer to the Survey Selection section.
3.7 Horizon
An existing horizon can be displayed in the scene by selecting the Add option from the pop-up menu (see above). This launches a horizon selector window from which multiple horizons can be selected. See also Add color blended.
Once at least one horizon is displayed, there is an addition to the pop-up menu, Display All, which contains several options: only at sections, in full, or both at sections and in full. Only at sections results in a horizon display (as a line) on the inline/crossline/timeslice. In full displays the complete horizon in 3D space.
The Track new sub-menu is used to start a new horizon interpretation.
The pop-up menu from a displayed horizon has several options, which are covered in the following sections:
3.7.1 Add Attribute
This allows choosing the data to display on the horizon: stored cubes, an attribute calculated from the current attribute set, or horizon data already included with the horizon. For Horizon data a dialog will pop up where you can select multiple data files. After loading, you can browse through the data by pressing the 'Page Up' and 'Page Down' buttons on your keyboard.
For PgUp and PgDn to work, the mouse pointer must be in the scene.
Once a horizon is added (with its Z-values displayed in the scene), it is possible to
also right-click on 'Z-values' in the tree to give you other options:
Furthermore, once a horizon has an attribute displayed, it is also possible to 'Save as Horizon Data...'; the result will be visible in the 'Manage 3D Horizons' window:
3.7.2 Color-Blended Display
The load color blended sub-menu displays RGBA (red-green-blue and alpha) blended horizon(s) in the scene. This is used to blend multiple attributes with similar spectral outputs, and is an interactive tool especially suited to color-blending iso-frequency grids (or attributes).
A color-blended map view (image on the right) of the spectral decomposition (red: 10 Hz, green: 20 Hz, blue: 40 Hz). Compare the results with the coherency map (image on the left). Note that the yellowish-colored fault boundary region is thicker compared to the surrounding regions. The fault throws (red color) are also clearly observable. Semblance/similarity together with color-blended spectral images can reveal better geological information.
3.7.3 Tools
Several processing algorithms may be applied to a horizon; they are described here:
3.7.3.1 Grid
This utility is used to grid/interpolate a horizon having gaps/holes, or to filter (average/median) a horizon grid. Several gridding algorithms are supported in OpendTect.
Gridding Parameters:
l Geometry: There are different types of geometries that can be used for the interpolation. Full survey interpolates (in-/outwards) the horizon Z-values within the entire survey box. Bounding Box defines the rectangle fitting the horizon geometry, which is generally smaller than the survey box. Convex hull also restricts the gridding geometry to within the horizon boundaries. To grid only the gaps or holes in a horizon, the Only holes gridding geometry is used.
l Inl/Crl step: The default steps correspond to the sampling rate of the input horizon. The step can be decreased down to the survey sampling rate to get a higher-resolution horizon.
l Algorithm(s): The Inverse distance algorithm uses an inverse-distance method of interpolation. It requires a search radius, with optional parameters (step size and number of steps). A step size of '1' means that one bin is used in all directions to interpolate the horizon Z-values, whereas the number of steps defines the number of concentric circles for the inverse-distance computation. For these steps, the grid computation can be set to use the corner points of the defined radius or not (the default).
l Continuous Curvature (GMT) is a continuous-curvature interpolation algorithm, which is part of the GMT plugin of OpendTect. Please check the GMT website for further details. This algorithm only requires the tension parameter (ranging from 0 to 1), which controls the smoothing. A tension of 0 gives a minimum-curvature type of surface interpolation, while a tension of 1 gives a harmonic surface.
l Nearest Neighbour (GMT) is another interpolation algorithm coming from the GMT plugin of OpendTect. This algorithm requires a search radius to be defined. It is mostly useful for regularly spaced grid data. Please check the GMT website for further details.
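As an illustration of the Inverse distance option, a minimal inverse-distance-weighted interpolation might look like the sketch below (hypothetical helper, not OpendTect code; real gridding adds the step-size/steps refinements described above):

```python
import math

def idw(known, target, search_radius):
    """Inverse-distance-weighted Z at `target` from known
    {(inl, crl): z} values within `search_radius` (in bins)."""
    num = den = 0.0
    for (inl, crl), z in known.items():
        d = math.hypot(inl - target[0], crl - target[1])
        if d > search_radius:
            continue                       # outside the search radius
        if d == 0.0:
            return z                       # exactly on a known point
        w = 1.0 / d
        num += w * z
        den += w
    return num / den if den else None      # None: hole cannot be filled

known = {(0, 0): 1.000, (0, 2): 1.020, (2, 0): 1.040, (2, 2): 1.060}
z = idw(known, (1, 1), search_radius=3.0)  # equidistant -> mean, 1.030
```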
3.7.3.2 Filter
The "filter" utility enables filtering of the horizon using either a median or an average filter. The inline and crossline step-out should be defined; the larger the step-out, the smoother the filtered result.
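The step-out filtering can be sketched as follows; the dict-based grid and the function name are assumptions for illustration only:

```python
import statistics

def filter_horizon(z, stepout, method="median"):
    """Filter a gridded horizon (dict {(inl, crl): z_value}) with the
    given inline/crossline step-out. A step-out of 1 uses a 3x3 window;
    larger step-outs smooth more."""
    out = {}
    for (inl, crl) in z:
        vals = [z[(i, c)]
                for i in range(inl - stepout, inl + stepout + 1)
                for c in range(crl - stepout, crl + stepout + 1)
                if (i, c) in z]
        out[(inl, crl)] = (statistics.median(vals) if method == "median"
                           else statistics.fmean(vals))
    return out
```

A median filter with step-out 1 removes single-bin spikes completely, while the average filter only dampens them.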
3.7.3.3 Snap-To Event
If the horizon is not correctly snapped to a seismic event, this option can be used. The user should define the input data, the event type (peak, trough, zero-crossing, etc.), the search gate relative to the original horizon, and whether the snapped horizon should be saved as new or overwrite the original horizon.
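A minimal sketch of the snapping logic, assuming a sampled trace and a symmetric search gate in samples (illustrative only, not the actual implementation):

```python
def snap_to_event(trace, z_idx, event="max", gate=5):
    """Move a horizon sample index to the strongest matching extremum
    within +/- gate samples of the original position."""
    lo = max(0, z_idx - gate)
    hi = min(len(trace) - 1, z_idx + gate)
    window = range(lo, hi + 1)
    if event == "max":
        return max(window, key=lambda i: trace[i])
    return min(window, key=lambda i: trace[i])

trace = [0.0, 0.2, 0.9, 0.4, -0.1, -0.8, -0.3, 0.1]
# peak within +/-2 samples of index 4 is at index 2
```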
3.7.3.4 Variogram
For any horizon data displayed on a horizon, a horizontal variogram can be com-
puted:
The variogram describes spatial continuity, here in the horizontal direction, but it can also be computed vertically from the crossplot tool. It is commonly represented as a graph showing the variance of a measure against the distance between all pairs of sampled locations. Modeling the relationship among sample locations, to indicate how the measure varies with separation distance, is called semivariogram or variogram modeling. Variograms are important when doing inversion, as they allow predicting a value at a location where it has not been measured.
Once the variogram has been created, the analysis consists of finding the model that best fits the measured data by changing the variogram type, the sill, and the range:
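The experimental variogram described above can be computed as in this sketch; binning pairs by a single lag distance with a tolerance is a simplification of what a full variogram tool does:

```python
import math

def empirical_variogram(samples, lag, tol):
    """Semivariance at one lag distance from irregular (x, y, value)
    samples: gamma(h) = mean((v_i - v_j)^2) / 2 over all pairs whose
    separation falls within lag +/- tol."""
    sq_diffs = []
    n = len(samples)
    for i in range(n):
        for j in range(i + 1, n):
            xi, yi, vi = samples[i]
            xj, yj, vj = samples[j]
            if abs(math.hypot(xi - xj, yi - yj) - lag) <= tol:
                sq_diffs.append((vi - vj) ** 2)
    return sum(sq_diffs) / (2 * len(sq_diffs)) if sq_diffs else None
```

Evaluating this over a series of lags gives the experimental curve to which a model (with its sill and range) is then fitted.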
3.7.4 Display Contours
Add Contour Display: This option displays contours on the horizon. The contour step (interval) is automatically calculated but can be edited at any time. The input for the contour display can be either a reference Z or any surface attribute, like Similarity, Energy, Dip, etc.
The images below show a horizon with both reference Z and Similarity contours,
respectively:
3.7.5 Calculate Isopach
Calculate isopach: This option computes the time or depth difference between two horizons. The computed grid is displayed as a new layer on this horizon and may be stored as surface data. The output will always be in seconds, meters, or feet.
3.7.6 Flattening
Write Flattened Cube: Creates a flattened seismic volume in which the horizon is shifted to a specified time value. The output is stored as a new flattened cube.
Create flattened scene: This option enables the user to create a second scene in which the data is displayed relative to the flattened horizon. This can be a very useful tool in specific situations. By flattening a horizon, the user gets an idea of the approximate section at the time of deposition of this horizon. The tectonic history can be derived from the difference between the original section and the "restored" section. Another advantage of flattening the horizon is that it becomes easier to evaluate the depositional environments.
Unflattening the cube: Should you need to unflatten the cube then please refer to
the following: Delta Resample Attribute
3.7.7 New Horizon
The 3D Horizon > New right-click menu in the tree allows the user to create new 3D horizons using one of the following:
3.7.7.1 Auto and Manual Horizon Tracking
3D horizon interpretation in OpendTect can be conveniently started from the right-
click menu of 3D Horizon > New > Auto and Manual Tracking in the Tree of either
3D scene or 2D viewer.
3D Auto-tracking
3D auto-tracking is the primary, highly interactive workflow for horizon interpretation in OpendTect. The user starts with a few picked seeds, auto-tracks in the volume, interactively QCs, re-tracks as needed with updated parameters and/or edits, and locks the QC-ed interpretation. Interpretation then continues by iteratively repeating these steps. The advantage of this workflow is that the horizon is QC-ed while interpreting, which saves time on editing. Any remaining holes can be filled at a later stage using one of the gridding algorithms.
The workflow operates in the 3D scene via the Horizon Tracking Settings window, the Ctrl + right-click menu, and/or keyboard shortcuts (Shift + ? to see all).
A seismic event in a 3D volume is tracked starting from user-picked seed locations, following the user-set rules in the Horizon Tracking Settings window.
Section Auto-tracking
Manual drawing
This option is used to manually pick horizons in areas where auto-tracking is not
feasible.
3.7.7.1.1 Horizon Tracking Settings
The tracker supports the following modes:
l Section Auto-tracking
l Volume Auto-tracking
l Manual Drawing with and without snapping
1. Auto-tracking mode:
l Recommended: Left-click to add seeds (seeds are stored with a horizon).
l Optional: hold Left-click and draw along the section, seeds will be automatically
added.
l Ctrl + Left-click to remove seeds.
2. Mouse draw:
l Recommended: hold Left-click and draw along the section to create an individual horizon segment.
l Hold Ctrl + Left-click and drag to erase interpretation along the line.
Right-click 3D Horizon in the tree and select New > Auto and Manual Tracking. This launches the Horizon Tracking Settings window, which contains several tabs. The dialog can also be retrieved later by right-clicking the horizon in the tree and choosing Tracking > Change Settings.
Mode Tab
Choose the tracking mode:
l Adjacent parents utilizes the last known trace positions to compare amplitudes during auto-tracking.
Event Tab
The Event tab contains the defining parameters for Section / Volume auto-tracking.
l Input data: The input data is automatically selected when you pick on a section. This can be the original seismic volume, a filtered seismic volume (preferred), or any other attribute. The horizon is linked to this input seismic. You can change the input seismic at any time: it won't change your saved interpretation. If you change the data during interpretation, the next click on the new data may show a warning message. If you continue, the new data will be used for auto-tracking.
l Event type: Specify the event type you want to pick. The tracker can track negative reflectors (Min), positive reflectors (Max), a Z-type zero-crossing (0+-), or an S-type zero-crossing (0-+). If the tracking does not seem to work, check that the event type corresponds to the event on which you actually picked the seed(s). If the Seed Trace method is chosen, a mixed-phase horizon can be interpreted.
l Threshold type:
l Cut-off amplitude: Here, an absolute amplitude is used as the stopping criterion for the tracker. When the tracker encounters a value beyond this threshold it stops tracking (for a max event the tracker stops if the value is below the threshold; for a min event, when it is above it). Tip: point your mouse at the event and the amplitude value is displayed at the bottom of your screen.
l Relative difference: The tracker compares the amplitude of the last tracked point to the amplitude of the candidate point. If the difference exceeds the chosen percentage, the tracker stops tracking. Further explanation on Steps is given below.
l Search Window: The tracker searches for the chosen event type based on amplitude
in a time window relative to the last tracked sample.
l Display Window: This controls the display of WVA/VD of the surrounding trace seg-
ments near the last picked seeds. One can control the display window and number of
traces. This gives an overview of the picked location. In the display, one can also
change the search window by moving the green lines up/down.
At the bottom of this tab, a status note shows the picked/current data on which interpretation is being performed.
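The two threshold types can be sketched as a single stop/continue test; this is an illustration of the idea, not the actual tracker code:

```python
def keep_tracking(prev_amp, cand_amp, threshold, mode="cutoff", event="max"):
    """Decide whether the tracker accepts a candidate amplitude.
    'cutoff': stop when the amplitude passes an absolute threshold
    (below it for a max event, above it for a min event).
    'relative': stop when the percentage difference between the last
    tracked amplitude and the candidate exceeds the threshold."""
    if mode == "cutoff":
        return cand_amp >= threshold if event == "max" else cand_amp <= threshold
    if prev_amp == 0:
        return False                       # relative difference undefined
    return abs(cand_amp - prev_amp) / abs(prev_amp) * 100.0 <= threshold
```

For example, with a 20% relative-difference threshold, a drop from 1.0 to 0.85 (15%) continues tracking, while a drop to 0.70 (30%) stops it.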
Correlation Tab
A trace segment around the last tracked point is compared to all trace segments on the neighboring traces around the points that lie within the Search window (see figure below).
In this tab, auto-tracking with correlation is enabled by turning the correlation ON. The correlation window should be small enough to isolate a single waveform. The threshold is generally strict (80% or higher) to track only high-quality amplitudes. The correlation window size can be changed by moving the green lines in the display window.
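The segment comparison is, in essence, a normalized cross-correlation; a minimal sketch (illustrative, not the actual implementation):

```python
import math

def ncc(seg_a, seg_b):
    """Normalized cross-correlation of two equal-length trace segments;
    candidates are accepted only above the threshold (e.g. 0.80)."""
    n = len(seg_a)
    ma, mb = sum(seg_a) / n, sum(seg_b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(seg_a, seg_b))
    den = math.sqrt(sum((x - ma) ** 2 for x in seg_a)
                    * sum((y - mb) ** 2 for y in seg_b))
    return num / den if den else 0.0

a = [0.0, 1.0, 0.0, -1.0]                  # same shape, different gain
# ncc(a, [2*x for x in a]) -> 1.0 (amplitude-scaling invariant)
```

Because the measure is normalized, it compares waveform shape rather than absolute amplitude, which is what makes it a stricter criterion than the amplitude thresholds of the Event tab.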
Properties Tab
This tab is used to adjust the display properties of the horizon and the corresponding seeds. Optionally, you can also change the color codes for various parts of the horizon (parent paths, the selection of a horizon if you intend to remove parts through a polygonal selection, and locked tracking areas).
A SteeringCube can be added as a constraint for horizon interpretation. This will improve the horizon tracking, especially in areas of dipping reflectors. Dip steering provides structural information.
3D Auto-Tracking Menu
Using this menu, you can control several tracking features.
Tracking Workflow
After adjusting the parameters in the tracker setup (which can remain open during
tracking), start picking seeds on a displayed inline/crossline.
Auto-tracking in a 3D volume
7. If the tracking result is good, you may lock (Ctrl + right-click) the results so that they are not changed. You may unlock the old tracking paths if you want to change them during interpretation.
8. Otherwise, add more seeds and choose Re-track from seeds.
9. Optionally, remove wrong paths at a mouse location using the Ctrl + right-click menu (select parent/children paths and relaunch the menu to remove the selection).
10. Another option is to launch a 2D Viewer of parent paths to edit auto-tracking errors. If you do that, you may have to re-track from seeds again.
3.7.7.2 Create Horizon With Constant Z
A 3D horizon with a constant Z-value can be created from the right-click menu of 3D Horizon > New > With constant Z.
3.7.8 Shift
Shift: The scrollbar allows the user to scroll the 3D horizon vertically. The shift range allows the user to define the upper and lower boundaries of the scrollbar range. The step size defines the distance between each possible horizon position (e.g. a range of -100 to +100 with a step of 10 allows the user to scroll through 21 possible horizon positions, centered about the original position). Different attributes can be calculated for the horizon in this user-defined shift range. The user can then use the scrollbar to move up and down and view the attribute as it would appear on that horizon at the various shift positions. The shifted horizon can be saved as surface data to be viewed later.
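The position count in the example can be checked with a small sketch (hypothetical helper; note that both endpoints and the zero-shift position are included):

```python
def shift_positions(lo, hi, step):
    """All horizon shift positions generated by a range and step."""
    n = int(round((hi - lo) / step)) + 1
    return [lo + i * step for i in range(n)]

positions = shift_positions(-100, 100, 10)   # 21 positions, -100..100
```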
3.7.9 Calculate Volume
Calculate volume: This option is used to calculate the volume between two horizons. The volume is calculated within an existing polygon. Select the polygon and press the Estimate Volume button to calculate the volume within the polygon. To read more about this, please go to the pointset chapter: Pop-up Menus.
3.7.10 Other Options
Properties: The Material window allows changing of the graphical settings like
transparency, line style, and thickness.
Horizon default resolution and colortable settings can be defined under the 'Horizons' tab via Utilities > Settings > Look and Feel.
Quick UVQ: This option is related to the Neural Network plugin license, if it is available. It is used to create a quick unsupervised facies map. For further information please refer to the plugin documentation.
Use single color: When this option is selected, the horizon is displayed in a single
color, which can be chosen from a standard color selection window.
Tracking: Horizons can be edited and tracked through the survey. The various tracking options are described here.
Save: The save option gets highlighted when changes are made to the surface
geometry. Save saves the new geometry of the horizon. If a horizon consists of
patches, you can save a sub-selection of these patches.
Save as: Save a sub-area or the complete horizon under another name.
Position: Used to re-position (sub-select the inline/crossline range of) the displayed horizon. In the position dialog, set the inline and crossline ranges to sub-select the horizon display.
Lock: This will lock the selected object. It prevents accidentally removing or moving the object, or displaying data on it. After clicking Unlock, all manipulations are possible again.
Remove: This option removes the horizon from the tree and the graphics area.
3.7.11 Store Z as Attribute
This option gives the possibility to store the 'Z' values as Horizon Data for a horizon. Subsequently, this newly created attribute can be used to change the 'Z' values of another horizon by means of Set Z values.
The name of the new 'Z' attribute and the units in which it will be saved need to be specified.
3.7.12 Set Z Values
A 'Z' value surface attribute (see Store Z as attribute) can be used to shift a horizon or completely change its 'Z' positions using the Set Z values option.
Specifying values as Relative (deltas) will shift the horizon; in fact, the software adds the attribute's 'Z' values to the 'Z' values of the horizon to achieve this shift. Absolute is used when completely replacing the 'Z' values of the horizon with the 'Z' values of the surface attribute. Specification of the units of the 'Z' values (i.e. 'milliseconds' or 'seconds') is also required.
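The Relative/Absolute behavior can be sketched as follows; the dict-based horizon and the millisecond-to-second scale factor are illustrative assumptions, not OpendTect internals:

```python
def set_z(horizon_z, attr_z, mode="relative", scale=0.001):
    """'relative' adds the attribute values (deltas) to the horizon Z;
    'absolute' replaces the Z values outright. `scale` converts the
    attribute units, e.g. 0.001 for milliseconds -> seconds."""
    out = {}
    for pos, z in horizon_z.items():
        a = attr_z.get(pos)
        if a is None:
            out[pos] = z                   # no attribute value: keep as-is
        elif mode == "relative":
            out[pos] = z + a * scale
        else:
            out[pos] = a * scale
    return out

hz = {(100, 300): 1.200}                   # horizon Z in seconds
shifted = set_z(hz, {(100, 300): 10.0})    # +10 ms delta, about 1.210 s
```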
3.8 2D Horizon
Once you have selected a 2D horizon, two other options become available:
The selected horizon(s) will be displayed in the scene. To start a new 2D horizon
interpretation, read the chapter How to interpret Horizons.
3.8.1 Display
Properties: Change the display settings for a horizon (color, reflectivity, line style).
Only at sections: Display the tracked horizon at sections only. This is especially useful for QC purposes, to check whether the tracked horizon lies on the expected reflector. It can be toggled back to In full.
3.8.2 Tracking
3.8.3 Tools
Snapping: This option allows for the selected 2D horizon to be 'snapped' to the
nearest event defined in the Event option (see below).
Horizon before snapping (red), after snapping (green).
Keep holes larger than: By checking this box, the gridding interpolation area can be limited by defining a threshold, e.g. 2500 m. With this value set, the gridding is restricted: gaps/holes up to a radius of 2500 meters will be filled, while larger holes are kept.
Output horizon: Overwrite or create a new horizon from the selected horizon.
3.8.4 Workflows
Create Flattened Scene: The 2D line(s) can also be flattened using this option.
Once clicked, it will create a new flattened scene based on the selected 2D Hori-
zon. (see also: Flattened Horizon Scenes)
3.9 Fault
The Fault option enables interpretation of a new fault or loading an existing one.
Add: Adds selected faults into the tree and displays them in the scene:
New: Adds an empty fault in the scene (New fault 1) that needs to be named and saved once the interpretation is completed.
Display all: If more than one fault has already been displayed or added in the tree, this option is available. It is used to display all faults in full, only at sections, at horizons, or both. It is also used to toggle on/off the fault-plane, fault-stick, and combined displays.
Show all items: Checks all items, meaning that all items will be displayed in the scene.
Hide all items: Hides (unchecks) all the displayed faults.
Remove all items: Removes all faults that were added to the tree.
An example of a picked fault line on a seismic section:
Once a fault has been added, right-clicking pops up the following menus:
Add
Attribute: Add a new attribute to the fault element. Right-click and choose 'Select Attribute' to select the desired seismic volume. The attribute will be displayed along the fault planes. An example line with interpreted faults in a 3D volume is shown below. Note that the faults have seismic data displayed as an attribute along their planes.
Volume Processing Attribute: Adds a special sub-layer to the fault that belongs to a volume processing attribute. To read more about this, please go to the Volume Builder Setup chapter.
Display
Histograms: Shows multiple histograms of the displayed data along the selected fault plane.
Only at horizons: Toggles on/off the display of a fault plane on a horizon as a fault trace.
Fault planes: If a fault has been displayed either on a section or a horizon, it can be switched back to a 3D fault-plane display. This option toggles on the fault-plane display.
Fault sticks: To see only the fault sticks in 3D, toggle this option on.
Use single color: Sets a single color for the fault-plane display. Any attribute displayed along the fault plane becomes hidden and only the fault color is displayed.
Properties: Set the Type, Size, and Color of the point markers in the graphics area.
Save As: Saves the selected fault under a new name.
Lock: Locks the selected object. This prevents accidentally removing or moving the object, or displaying data on it. Clicking Lock again (i.e. unlocking) re-enables editing.
Remove: Removes the selected fault from the scene.
Select Attribute: Used to select and display various types of data (see below).
Stored Cubes: Any stored volume can be displayed along the fault plane in 3D.
Attributes: Any attribute defined in the Attribute Set window can be displayed. This requires a pre-defined attribute set; the option is inactive if no attribute is defined in that window.
Save Color Settings: The active color table can be stored permanently or updated for the displayed stored attribute. For instance, if you do not like the color bar for a particular seismic dataset (say PSTM) that is Red-white-blue and want to change it to Magic, you can set that here. It will save the color settings for this specific stored volume (PSTM).
Move: To change the display level of an attribute, it can be moved up/down or placed at top/bottom.
Display: To make a fault semi-transparent, use the transparency. One can also visualize the histogram of the attribute.
For fault interpretation, please see the Interpret Faults chapter.
3.10 FaultStickSet
A FaultStickSet is a set of sticks for fault interpretation. Sticks are segments created by connecting two or more nodes.
The FaultStickSet tree item allows the user to create a new FaultStickSet or to load an existing one.
A new FaultStickSet is inserted by selecting the New option in the tree. A blank fault, New sticks 1, is inserted as a sub-element of FaultStickSet.
The user can then interpret fault sticks on inlines/crosslines/Z-slices and/or on 2D lines.
To create and edit a faultstickset, first check that the faultstickset is active in the tree, then do the following:
1. Click along the fault to create your first fault stick for one specific section.
2. A second fault stick in the same section is created with Shift + left-click for the first point, then just left-click for the next node(s).
3. To remove a fault stick node, Ctrl + left-click on the already picked node.
4. Once you are done with one section, move to another inline/crossline/timeslice or 2D line to create new fault sticks. A simple click will start the fault stick creation.
5. If you want to edit one stick while being busy with another, just click on one of its nodes to make it active. While editing, you can click and drag a node to another position.
After interpreting the FaultStickSet, use the Save option to save your set with an appropriate name.
An example of a picked FaultStickSet with nodes; the active stick is the second from the right (its node-connecting line is thicker).
3.10.1 FaultStickSet to Fault
In OpendTect, newly interpreted faultsticksets (or a selection) can be transformed into 3D faults and vice versa; from 3D faults the user can output faultsticks.
In the toolbar, there are two modes, the Edit mode and the Selection mode:
Edit Mode: In this mode nodes are yellow; the user can add nodes (click) and remove nodes (Ctrl + click). Nodes can be dragged from one location to another. New sticks are created with Shift + click for the first node, then just click for the other nodes.
Space + click duplicates the node(s), so new sticks can be added in the user-defined direction.
Nodes are yellow in Edit Mode
Selection mode: When this mode is active, faults/faultsticks can be selected and copied (or moved) to a new or already existing fault/stick group. The output options are: create a new group, merge into an existing one, or replace (overwrite) the existing group.
When converting faultsticks into faults, please keep in mind that OpendTect doesn't support files that contain (1) crossing fault sticks, or (2) fault sticks interpreted on vertical (e.g. inline) as well as horizontal (e.g. Z-slice) planes. If the input file contains such stick sorting, you might encounter problems in OpendTect getting a regular fault plane.
Clicking the icon allows you to set the transfer (or conversion) settings that will be applied when the copy or move is put into action:
3.11 Body
Bodies are displayed and created from this tree item. Using the option "New polygon body", bodies can be drawn by picking on vertical and horizontal slices. The body will always be the convex envelope around the picked locations.
Bodies may be used for display, but also for the creation of volumes using the volume builder: the inner and/or outer parts of the body are filled with constant value(s).
3.12 Well
Clicking the Well element in the tree pops up a menu with 3 options: Add, Tie Well to Seismic, and New Well Track.
Add: Wells are added and displayed in the scene using Add option.
Tie Well to Seismic: Access the seismic-to-well tie module. Generally, three parameters are needed for a successful well-seismic tie: a sonic/velocity log, a density log, and a reference wavelet. The wavelet can be either imported or extracted in OpendTect. Logs can also be created in the Well Manager.
New Well Track: Create new well tracks interactively in the 3D scene. After select-
ing this option, the system prompts for a well track name. After specifying the well
track name, display an element (inline/crossline/2D line) in the scene; drawing the
well track on the selected element is then enabled. After drawing the well track,
right-click on the well track name and select the Save option. Note that drawing a
new well track works similarly to editing an existing well track: well track nodes can
be picked on the active elements displayed in the scene. Also note that a display
with a Z-scale (View > Z-scale) other than 1 distorts the appearance of distance in
the 3D view.
After loading new wells, items are added to the right-click menu as follows:
Well popup menu (multiple) and menu for individual wells.
Multiple Well Options
These options are available only when more than one well is loaded in the tree,
and can be accessed by right-clicking Well in the tree. The options below become
available when multiple wells are loaded; items described in the previous sections
are not described again here.
Create Attribute Log: creates selected seismic data as a log for multiple wells.
Constant Log Size: keeps the well log display width relative to the scene zoom
ratio, i.e. the log display width increases when zooming in and vice versa. This
option can be toggled off by clicking on the sub-menu item (Basic Well Pop-up
Menus). In the latter case, the log display width is adjusted opposite to the zoom,
i.e. if a scene is zoomed in, the log display width is reduced relative to the zoom-in
ratio, and vice versa.
Show all: allows the user to toggle on all well names (top), well names (bottom),
markers, marker names, and logs.
Hide all: allows the user to toggle off all well names (top), well names (bottom),
markers, marker names, and logs.
Show all items: allows the user to toggle on all wells currently loaded and visible
in the tree.
Hide all items: allows the user to toggle off all wells currently loaded and visible in
the tree.
Remove all items: allows the user to remove all wells currently loaded and visible
in the tree. This only removes the wells from the scene, it does not delete them
from the disk.
Once a well has been loaded into the scene and is visible in the tree, right-clicking
an individual well pops-up a window with the following options:
Create attribute log: allows you to create a new log by calculating an attribute along
the well track. A new window pops up where the attribute, log name, and the depth
range should be provided. The Depth range is defined as start depth, stop depth,
and sample distance.
Create log cube: enables you to create a volume of a selected log. The log is
duplicated on a user-defined number of traces around the well location. More than one
log can be selected at once and one volume for each log will be generated. This
allows easier comparison between well logs and seismic data.
Properties: sets various display settings of the well track, the logs, and the markers.
The properties can be set for each well, and can also be updated for all wells dis-
played in a scene. The latter can be done using the Apply to all wells button avail-
able in the Well Display Properties window.
Log display, Markers display and Track display properties
Well Log Properties: In a scene, the logs are displayed using the Left Log and
Right Log tabs. The logs are displayed on the left and/or on the right of a well track
according to the current view. The log properties include the log selection, log range,
fill color and the thickness of the log line. None refers to no log selec-
tion/display. If logs have already been imported, the Select log field contains the
names of the logs in the drop-down list. The data ranges and the color ranges are
updated automatically from the selected log. However, both fields are editable.
Two log display styles are supported. For a standard log trace display style, the
Well log radio button is selected. For a wiggle display, the Seismic radio button is
selected. The well logs can be filled with any selected color table. The color
ranges can also be manually set/clipped. The seismic style, however, has dif-
ferent settings. The synthetic seismic traces can be displayed by toggling the Seis-
mic radio button ON. The seismic traces can be repeated by specifying the repetition
number in the spin box adjacent to the Repeat text. The Overlap field refers to the
overlap percentage of the repeated traces. Optionally, two logs can be dis-
played together on the same side by displaying one log as a trace and filling the
color with another log (Fill with Log).
Well Track Properties: The track properties are modified in this tab. The track line
thickness is changed by scrolling the Line Thickness spin box. The well track/name
color is updated by pressing the colored button. The well name can be dis-
played above and below the track. The name size can also be increased or
decreased. Note that the name size is adjusted relative to the 3D zoom.
Well Marker Properties: The well marker properties tab includes settings for the
marker's name size, color, shape (3D), etc. The marker size is adjusted using the
spin box (up/down); the limits for the size are 1 to 100. All markers of a well track
can be given one single color. This is supported by the 'use single color' option: if
the same color is to be assigned to all available well markers, check this field and
select the color. Additionally, three different 3D shapes are supported (Cylinder,
Sphere, Square). The cylindrical shape is added for orthographic camera displays,
which is better for visualization purposes. The height of the cylinder can also be
set.
Edit Welltrack: Allows you to add or delete nodes on the well track. Deleting nodes
is done by holding CTRL and clicking a node. Adding nodes is done by making
node points on any of the active elements on your screen. Remember that a Z-scale
other than 1 causes a vertical stretch, distorting the appearance of real distance in
the 3D view.
2D Log viewer: This property allows you to display a well log in a 2D scene. The
display in log viewer is driven by the 3D scene. To interactively display different
logs, go to the well properties and make the desired changes.
Save: Stores a new well or saves the changes that were made to an existing one.
Provide a name for the new well and, if a depth-to-time model is available, select
the file. The file should have the same format as when importing a well track.
Optionally, you can examine the file using the corresponding button. Specify
whether the model uses TVDSS or MD, as well as the measurement units.
Lock / Unlock: Locks the selected object. This prevents accidentally removing or
moving the object, or displaying data on it. After clicking Unlock, all manipulations
are possible again.
Remove: Removes the well from the tree (not from disk).
3.13 Pre-Stack Events
This tree item allows you to display picked or imported prestack events in the 3D
scene. Note that this tree item is only displayed when there are prestack events in
the current project; otherwise it is hidden.
The displayed points are always linked with a thin line. Regardless of the display
mode the points are colour-coded with respect to the following color settings:
l Single: Default mode, one single color for all points of the prestack events.
l Quality: The color of the points is related to their quality. This attribute is either imported
with the prestack event or set when picked in the Velocity Model Building plugin.
l Velocity: The color of the points is related to the corresponding interval velocity. Note
that for this to work the input prestack datastore and corresponding migration velocity
must be specified in the velocity model building plugin.
l Velocity fit: The color of the points is related to the deviation between the picked
event and velocity of the best fitting normal/residual moveout curve. Note that for this
to work the input prestack datastore and corresponding migration velocity must be
specified in the velocity model building plugin.
The color of the points, except in single mode, can be adjusted like any attribute,
by adjusting the colorbar and the amplitude ranges.
3.14 Pre-Stack 3D Viewer
Prestack gather selection from the tree
Right-click menu of a vertical slice tree item when prestack datasets are available.
Poststack data on an inline (left) with a prestack gather displayed perpendicular to
it.
Right-click options for a prestack dataset
Properties
There are several display properties available that are described below. Please
note that the gathers are first displayed without any processing. This can be set
together with other properties in the PS gather display properties:
l Shape: The shape tab sets the size of the gather with respect to other 3D elements,
and its relative position.
l Appearance: Color bar, amplitude ranges and grid lines can be set in this tab.
l Scaling: This tab is used to scale (clip) the amplitude range of the displayed data.
l Preprocessing: Pre-processing may be applied to enhance the display. The avail-
able algorithms are presented in the prestack processing chapter.
Resolution: Interpolates the data to get a better display (consumes more memory).
Amplitude spectrum: Displays the average frequency spectrum of the traces of the
gather, for the displayed Z range.
3.15 Annotations
With this option, you can draw arrows, load images, and write text in the display
window by right-clicking on one of the items in this tree.
Arrow: You can add new arrow groups, change the properties, lock-unlock, and
remove them by right clicking on this element.
Once you have added a new arrow group, named it and saved it, you can now
click in the scene to add arrows (CTRL+left-click to remove an arrow). The arrow
properties can then be changed by right-clicking on the newly inserted arrow group
and selecting the properties from the fold-out menu. In the arrow properties, arrow
type (top, bottom or both heads), color, width and size are adjusted.
Image: Once you have added a new image group, you can click in the scene to
add an image (CTRL+left-click to remove an image). It is then possible to store, res-
ize, change image, lock-unlock, and remove it by right clicking on the relevant
image group in the tree.
Add a new annotation group and changing the position of the annotation
The text group pop-up menu can be launched by right clicking on the text group
name.
l Size: Resize the text group. Note that this resizes all the inserted text sub-elements
according to the new size.
l Save/Save as: Saves the text group to an existing name/new name.
l Change Text: used to replace/change the text of the selected annotation. Note that
in the tree pop-up menu it is inactive: a user can only change the text of a selected
annotation, by right-clicking on the annotation in a working scene.
l Background Color: Modifies the background color of the annotation.
l Lock: If lock is selected, it will prevent further modification of the group.
l Remove: It removes the group from the tree/scene.
Scale Bar: Use this option to add a scale bar to an inline, crossline or Z-slice.
Once added and saved, right-clicking on this element will also give you the option
to change the properties.
For inlines and crosslines, this includes the Horizontal or Vertical direction options.
For Z-slices, this includes the X or Y direction options.
4 Survey
The Survey module is used to select, create, modify, delete or copy surveys. A sur-
vey defines the geographical boundaries of an OpendTect project and relevant pos-
itioning information such as the relationship between inline/crossline and X/Y
coordinate systems. Each survey (project) stores its data in a separate directory
that needs to be specified along with the survey reference name.
4.1.1 Survey Selection Window
Select an existing survey from the list of surveys on the left or create a new one
with New ... (see below). The boundaries of the survey are depicted in the field
to the right and detailed in the information field. The Notes field is a free-
format text field to store relevant survey notes.
When you install OpendTect, you select an OpendTect data directory where all
your surveys are stored:
Any folder can be turned into an OpendTect folder, the only change being the addi-
tion of a parameter file (.omf).
Only surveys stored in the selected OpendTect folder are displayed and can be
accessed. Later, you can open another OpendTect folder by clicking on Survey
Data Root. The current data root is always displayed at the top of the window.
Use for editing the survey box ranges or updating coordinate information (see Edit
Survey Window).
Allows you to compress/pack your entire survey into a zip file. This is highly
recommended when transferring your survey from one computer to another,
especially if they do not use the same platform. All data from this survey will be con-
tained in the zip file, with the exception of the SEG-Y and/or CBVS files that were
used 'in-place' from another location (i.e. those SEG-Y or CBVS files that were
used but not actually put inside the survey folder).
Unpacks a previously packed survey (see above) into your data root folder.
Most zip files could potentially be unpacked, but we support only the unpacking of
surveys packed using the OpendTect packing tool. If you wish to share your survey
with the community, visit our Open Seismic Repository.
Takes the user to the Open Seismic Repository (OSR) page on the OpendTect
website. Here, one can find information on how to share surveys with the wider
community.
In the position conversion window, two modes are available for coordinate con-
version: Manual / File. In Manual mode, specify an inline/crossline pair or an X/Y
pair, and press the corresponding arrow key to obtain the position in the other
domain. In File mode, browse to the input file and create a new output file. By spe-
cifying the corresponding conversion type (XY to IC or IC to XY) and pressing the
GO button, the desired conversion is written to the output file. No specific file type
is necessary for this input; even files without an extension may be used. Simply
select the file and, if desired, examine it too.
is used to export the selected survey boundary to a *.kml file, which is access-
ible via Google Earth. The dialog box contains editable fields for the survey
box. The area of the survey box is filled with the selected color. The width is the
horizontal thickness of the survey outline. The border height is the altitude of the
line with respect to the ground. The Output file field is the output location of the
*.kml file. On OK, the *.kml file is written to the specified path and can be
opened directly in Google Earth.
Before exporting the *.kml file, specify the correspondence between X-Y coordin-
ates and latitude/longitude at any location in the survey box:
4.1.2 New Survey Window
To launch the survey setup window select New in survey selection window. The
following window will appear on your screen:
Survey name: In the text area specify the OpendTect survey name.
Data to use: Toggle on the data type(s) to be included in the survey (2D only, 3D
only, or both 2D and 3D).
Note: Select only 3D if the survey contains only 3D data. Select both 2D and 3D if
the survey contains both 2D and 3D data. If the survey contains only 2D data,
select only 2D. The type selected here affects the tree structure and which func-
tions are available to you in the survey.
Initial setup: Determines how you set up the survey ranges and coordinates:
Scan SEG-Y file(s): takes you to the SEG-Y tool to scan the file(s) for survey
setup.
Get from Petrel: allows you to copy the survey ranges/coordinates from Petrel.
(Available only with the Petrel Connector plugin.)
Set for 2D only: takes you to the following window where you can enter the work-
ing area values:
Copy from other survey: allows you to copy the survey setup from another survey
on your drive/network.
Manual selection: enter the values manually (see Edit Survey Window)
Domain: Can be in time or depth (for depth, define here the unit):
4.1.3 Edit Survey Window
To launch the survey setup window select Edit in the survey selection window.
The following window will appear on your screen:
Survey name: In the text area specify the OpendTect survey name.
Location on disk: Specify a directory on disk where the OpendTect survey will be
stored. The directory will be turned into the OpendTect survey location.
Survey type: For the survey type, there are three options:
Select only 3D if the survey contains only 3D data. Select both 2D and 3D if the
survey contains both 2D and 3D data. If the survey contains only 2D data, select
only 2D. The type selected here affects the tree structure and which functions are
available to you in the survey.
4.1.3.1 Survey Ranges
The survey ranges are the inline, crossline and Z-range values. The ranges define
a 3D survey area for 3D seismic surveys and 2D grid area for 2D seismic surveys.
These fields can be filled manually, by scanning a SEG-Y file (2D/3D), by using the
Set for 2D only option in Ranges/coordinate settings, or by copying the ranges from
another survey. If the Workstation Access plugin is available, one will also see the
Get from GeoFrame or Get from Seisworks option in the drop-down menu.
The Set for 2D only option is specifically used to create a 2D seismic survey. Set
the average trace distance and the X and Y coordinate ranges, and these will auto-
matically be translated into suitable survey settings.
Click on the Scan SEG-Y file(s) button to select a SEG-Y file. In the new window,
you set the SEG-Y settings; see also the SEG-Y scan section. Pressing OK will start
scanning the file(s). After scanning, you get a file report containing sampling info,
data statistics, and the survey setup. The survey ranges and coordinate settings
will be filled in automatically.
The Z range is specified in milliseconds, meters, or feet. The step is the incremental
Z-step of the survey, i.e. the seismic sampling rate.
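As a worked example, the number of Z samples follows directly from the start, stop, and step values. The numbers below are hypothetical and not tied to any particular survey:

```python
# Hypothetical Z range: 0-2000 ms, sampled every 4 ms (illustration only).
z_start, z_stop, z_step = 0.0, 2000.0, 4.0

# Number of samples along the Z axis: both end points are included.
nr_samples = int(round((z_stop - z_start) / z_step)) + 1
print(nr_samples)  # 501
```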
4.1.3.2 Coordinate Ranges
The relationship between inline/crossline and X/Y can be specified in two ways.
The easy way is to specify three points, two of which must be on the same inline.
Due to rounding errors, this method may not be 100% accurate.
With the Advanced option, the exact transformation from one coordinate system to
the other can be specified. The Apply button can be used to verify results graph-
ically and to check the coordinate transformation formula.
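The inline/crossline to X/Y relationship can be illustrated with a small sketch. This is not OpendTect's internal code; the coefficients below are hypothetical, and the sketch merely shows the affine mapping that three points (two on the same inline) determine:

```python
# Affine relation between (inline, crossline) and (X, Y):
#   X = x0 + a1 * inl + a2 * crl
#   Y = y0 + b1 * inl + b2 * crl
# Hypothetical coefficients; a real survey defines its own.
x0, a1, a2 = 600000.0, 0.0, 25.0
y0, b1, b2 = 6700000.0, 25.0, 0.0

def ic_to_xy(inl, crl):
    """Forward transform: inline/crossline to X/Y."""
    return x0 + a1 * inl + a2 * crl, y0 + b1 * inl + b2 * crl

def xy_to_ic(x, y):
    """Inverse transform, obtained by inverting the 2x2 matrix."""
    det = a1 * b2 - a2 * b1
    dx, dy = x - x0, y - y0
    return (b2 * dx - a2 * dy) / det, (a1 * dy - b1 * dx) / det

x, y = ic_to_xy(100, 300)
print(x, y)            # 607500.0 6702500.0
print(xy_to_ic(x, y))  # (100.0, 300.0)
```

Because the mapping is affine, it can be inverted exactly; specifying the transformation explicitly avoids the rounding errors of the three-point method.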
Starting in version 6.2 it’s possible to set an orthogonal coordinate system for each
OpendTect survey. At this point the following features are supported:
OpendTect supports only orthogonal coordinates. The coordinate system of an
OpendTect project can be defined in one of three ways:
Projection Based System can be selected when a coordinate system is known. The
list of projections was created using the Proj.4 filter function.
In the Convert Geographical Positions window, there are two modes available for
coordinate conversion: Manual/File.
l In Manual mode, the user specifies an X/Y pair (or a Lat/Long pair), then presses the
corresponding arrow key to obtain the position in the other domain.
l In File mode, the user browses to the input file and creates a new output file. By spe-
cifying the corresponding conversion type (XY to Lat/Long or Lat/Long to XY) and
pressing the Convert button, the desired conversion is written to the output file.
4.2 Session
The OpendTect session is generally used to save and retrieve the specific set-
tings of a scene. This helps to resume work from previous settings. The session
saves all settings of the displayed elements, and can be restored at any later
time. When clicking the Survey option in the toolbar and then clicking Session,
three options appear. It is possible to save the session or restore a previously
saved session. When clicking Auto, the session will restore itself automatically the
next time you start OpendTect.
When a session is saved, the system stores all element positions and relevant
information to recreate the images. The content of the elements is not saved but is
re-created when the session is restored.
The auto-load window (left) and the 'Select' option (right)
The user can enable or disable the auto-load session option. It is also possible to
choose whether one of the saved sessions will be used in this session. Finally, the
user can choose whether or not to load the selected session now.
As mentioned earlier, the contents of the elements are not saved but are recreated.
It is common practice for OpendTect users to save and restore sessions. A common
mistake is to save a session with the contents of elements (e.g. attributes) that take
a long time to compute. When such a session is restored, the restore takes far too
long, because the session only stores the settings (or relevant information), not the
on-the-fly attributes; it therefore recalculates the contents. This can be avoided by
creating attribute outputs for such attributes: if an attribute already resides on disk
when a session is saved, the session will be restored very quickly. The same can
happen in a session that contains surface data contents (attributes calculated
along a horizon). The attributes applied along a horizon can be saved as surface
data. It is recommended to save the surface data before saving a session.
4.3 Import
The Survey > Import drop-down menu is used to import data to OpendTect.
Direct data exchange with Schlumberger's Petrel is available via the PetrelDirect
plugin (part of OpendTect Pro).
4.3.1 Import Attributes
An OpendTect attribute set can be imported via Survey > Import > Attribute >
ASCII....
An OpendTect attribute set file contains a set of attribute definitions created in the
Attribute Set window. OpendTect attribute sets are stored in ../'Survey Data Root
folder'/Attribs/ and have '.attr' extensions.
In the Import Attribute Set window: locate an OpendTect attribute set, and provide
an Output Attribute Set name to be used in the current project.
The imported attribute set can then be opened within the Attribute Set window.
More options for importing attribute sets are available in the Attribute Set window
itself.
4.3.2 Import Color Table
An OpendTect color table can be imported via Survey > Import > Color Table... or
Manage Color Tables window > Import button.
l The Other user option is used when it is possible to browse to the other user's home
directory. Navigate to the folder and type in the DTECT_USER name (if any).
l Choose the File option if other users' home directories are not accessible. The color
tables created by OpendTect users are stored in a settings_coltabs.DTECT_USER
file (DTECT_USER = OpendTect username) that is located in the user's home dir-
ectory $HOME/.od/.
The default OpendTect color tables are stored in a ColTabs file that is located in
the OpendTect installation directory /root/OpendTect/6.2.0/data/.
4.3.3 Import Cross-Plot Data
Select the input Ascii file. You may display the input file by pressing the Examine
button. The input file should be column sorted with one point per record (line).
The main work is to specify the presence of a file header and the file format defin-
ition. The header, if present, can be of fixed length (number of lines), or delimited
on its last line by a keyword.
Predefined and saved file formats are available by pressing the icon. Otherwise
the format must be manually specified. The Define button gives access to the
format definition window.
You must specify in the format definition window the column numbers for the pos-
ition, in terms of X-Y-Z or inline-crossline-Z. The Z units can be seconds, mil-
liseconds or microseconds (meters or feet in depth surveys). All other columns will
be treated as amplitude data referenced with respect to the given position. The first
row may contain either the first vector with its position and the corresponding amp-
litudes ("Data"), or the name of the attribute in each column ("Column names").
Reading may be stopped at a specific line by providing the adequate keyword.
It is recommended to save the format definition for later use and QC, by clicking
on the icon. In the pop-up window, write the name of the format and store it. The
format can be stored at different levels (All surveys, Current survey, Current
OpendTect user level) depending on the usage. Press OK when done.
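For illustration, a minimal column-sorted cross-plot file with a one-line header of column names might be parsed as sketched below. The file contents and attribute names are made up for the example; this is not OpendTect's import code:

```python
# A hypothetical column-sorted cross-plot file: a one-line header with
# column names, then one point per record (X, Y, Z plus attribute values).
lines = """X Y Z Porosity AI
620150.0 6081432.0 1850.0 0.21 5432.1
620175.0 6081432.0 1852.0 0.19 5611.8
""".splitlines()

header = lines[0].split()          # "Column names" on the first row
records = [[float(v) for v in ln.split()] for ln in lines[1:]]

# Columns 1-3 are the position; remaining columns are attribute data.
for rec in records:
    pos, attribs = rec[:3], dict(zip(header[3:], rec[3:]))
    print(pos, attribs)
```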
4.3.4 Import Faults
3D faults (planes) can be imported into OpendTect via Survey > Import > Faults,
from ASCII files or from GeoFrame Workstation (plugin).
Faults are non-editable objects that may be used as display elements in the 3D
scene, displayed in full or as a section. Attributes can also be applied along faults.
If you are looking for an editable object that can be converted at a later stage into a
fault plane, please load your data as fault stick sets.
4.3.4.1 Import Fault Ascii 3D
Select the input Ascii file. You may display the input file by pressing the Examine
button. The input file should be column sorted with one point per record (line).
The main work is to specify the type of data, the presence of a file header, and the
file format definition.
The sticks composing the planes can be gathered automatically, from picked
slices (inlines or crosslines), and/or based on their slope. The sorting can
be done based on the geometry of the fault sticks, on an index written in the input
file, or in the order found in the file. The header, if present, can be of fixed length
(number of lines), or delimited on its last line by a keyword.
Note that OpendTect does not support crossing fault sticks (a fault plane cannot
cross itself). If faults were picked on inlines, crosslines and horizontal slices, only
the largest subset of the three will be used to import the faults.
Predefined and saved formats are available by pressing the icon. Otherwise the
format must be manually specified. The Define button gives access to the format
definition window.
You must specify in the format definition the column numbers for the position, in
terms of an X-Y pair, point column, and optionally stick index (0 = no stick index).
The Z units can be seconds, milliseconds or microseconds. Reading may be
stopped at a specific line by providing the adequate keyword.
If a Coordinate Reference System (CRS) is defined for the survey, CRS conversion
will be available in the import window.
It is recommended to save the format definition for later use and QC, by clicking
on the icon. In the pop-up window, write the name of the format and store it. The
format can be stored at different levels (All surveys, Current survey, Current
OpendTect user level) depending on the usage. Press OK after having provided
the name of the fault to be imported.
4.3.5 Import FaultStickSets
Fault stick sets are the editable version of fault planes. Fault stick sets are fully
editable objects, used either for fault interpretation or later as fault input to correct
for the fault throw. They can be imported into OpendTect via Survey > Import >
FaultStickSets, from ASCII files or from GeoFrame Workstation (plugin).
4.3.5.1 Import FaultStickSets Ascii 3D
Select the input ASCII file. You can display the input file by pressing the Examine
button. The input file should be column sorted with one point per record (line).
The important point is to specify the presence of a file header and the file format
definition. The header, if present, can be of fixed length (number of lines), or delim-
ited on its last line by a keyword.
Predefined and saved formats are available by pressing the icon. Otherwise the
format must be manually specified. The Define button gives access to the format
definition window.
You must specify in the format definition the column numbers for the position, in
terms of an X/Y pair, point column, and optionally stick index (0 = no stick index).
The Z units can be seconds, milliseconds or microseconds. Reading may be
stopped at a specific line by providing the adequate keyword. If a Coordinate
Reference System (CRS) is defined for the survey, CRS conversion will be avail-
able in the import window.
It is recommended to save the format definition for later use and QC, by clicking
on the icon. In the pop-up window, write the name of the format and store it. The
format can be stored at different levels (All surveys, Current survey, Current
OpendTect user level) depending on the usage. Press OK after having provided
the name of the faultstickset to be imported.
4.3.5.2 Import FaultStickSets Ascii 2D
Select the input Ascii file. You may display the input file by pressing the Examine
button. The input file should be column sorted with one point per record (line).
The main work is to specify the presence of a file header, and the file format defin-
ition. The header, if present, can be of fixed length (number of lines), or delimited
on its last line by a keyword.
Predefined and saved formats are available by pressing the icon. Otherwise the
format must be manually specified. The Define button gives access to the format
definition window.
You must specify in the format definition the column numbers for the position, in
terms of an X-Y pair, point column, and optionally stick index (0 = no stick index).
The Z units can be seconds, milliseconds or microseconds. The name of the 2D
line(s) must also be provided. Reading may be stopped at a specific line by
providing the adequate keyword. If a Coordinate Reference System (CRS) is
defined for the survey, CRS conversion will be available in the import window.
It is recommended to save the format definition for later use and QC, by clicking
on the icon. In the pop-up window, write the name of the format and store it. The
format can be stored at different levels (All surveys, Current survey, Current
OpendTect user level) depending on the usage. Press OK after having provided
the name of the faultstickset to be imported.
4.3.6 Import Horizons
Horizons interpreted on 3D and 2D seismic data, and (attribute) grids, can be
imported into an OpendTect survey via Survey > Import > Horizons. The grids are
called "Surface data" in OpendTect and are attached to 3D horizons. Horizon
import supports the following:
The standard input data is Ascii files. Three options are available (explained in the
following subsections):
4.3.6.1 Geometry 3D
Select the input ASCII file. You may display the input file by pressing the Examine
button. Available grids (attributes) present in the input file may also be imported
simultaneously. The input file should be column sorted with one point per record
(line).
The main work is to specify the presence of a file header and the file format defin-
ition. The header, if present, can be of fixed length (number of lines), or delimited
on its last line by a keyword.
Predefined and saved file formats are available by pressing the icon. Otherwise
the format must be manually specified. The Define button gives access to the
format definition window.
You must specify in the format definition window the column numbers for the pos-
ition, in terms of an X-Y pair or an inline-crossline pair, and the point column.
Points that should not be read must all have the same numerical value, which is to
be filled in as the "Undefined value". The Z units can be seconds, milliseconds or
microseconds. Optionally, if attributes were added in the Import Horizon window,
additional columns with given attribute(s) name(s) will also appear in this format
definition window. Reading may be stopped at a specific line by providing the
adequate keyword.
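As an illustration of what such a format definition describes, the sketch below is a minimal Python example (not part of OpendTect): the column positions and the undefined value of 1e30 are hypothetical choices, one point is read per record, rows carrying the undefined value are skipped, and milliseconds are converted to seconds.

```python
def read_horizon_points(lines, x_col=0, y_col=1, z_col=2,
                        undef=1e30, z_in_ms=True):
    """Read one (x, y, z) point per record, skipping undefined Z values
    and converting milliseconds to seconds."""
    points = []
    for line in lines:
        cols = line.split()
        if not cols:
            continue
        x = float(cols[x_col])
        y = float(cols[y_col])
        z = float(cols[z_col])
        if z == undef:          # rows flagged with the "Undefined value"
            continue
        if z_in_ms:             # Z units: milliseconds -> seconds
            z /= 1000.0
        points.append((x, y, z))
    return points
```
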
If a Coordinate Reference System (CRS) is defined for the survey, CRS conversion
will be available in the import window.
It is recommended to save the format definition for later use and QC by clicking
on the icon. In the pop-up window, write the name of the format and store it. The
format can be stored at different levels (All surveys, Current survey, Current
OpendTect user level) depending on the usage. Press OK when done.
It is highly recommended to scan the input file after providing its format, and prior to
the actual import. The scanned information will pop-up and error(s) or warning(s)
may suggest a change of the format definition.
The option Fill undefined parts will be toggled on if gaps were found during the scan. A
triangulation to the convex hull, with an interpolation whose maximum size is
the input grid step (in XY units, thus meters or feet), and Keep holes larger than
toggled off, should be the optimal settings in most cases.
Tied to level is an additional option specifically designed to tie horizons to well
markers, for correlation purposes. In order to define the stratigraphic information of
the survey, please read about Manage Stratigraphy.
4.3.6.2 Attributes 3D
This window is used to import grids from ASCII files and attach them to existing
3D horizons. Select the input ASCII file. You may display the input file by pressing
the Examine button. The input file should be column sorted with one point per
record (line).
Grid names must first be provided in front of Select Attribute(s) to import. This can
be done by pressing Add new to the right of it, and providing each time a new grid name.
This will populate the list of importable grids. Only the highlighted grids will be
imported, which is why each new grid is highlighted after providing its name.
Next, the presence of a file header must be specified and the file format definition
must be provided. The header, if present, can be of fixed length (number of lines),
or delimited on its last line by a keyword.
Predefined and saved file formats are available by pressing the icon. Otherwise
the format must be manually specified. The Define button gives access to the
format definition window.
You must specify in the format definition window the column numbers for the pos-
ition, in terms of an X-Y pair or an inline-crossline pair, and the grid(s) column(s).
Grid values that should not be read must all have the same numerical value, which
is to be filled in as the Undefined value. Reading may be stopped at a specific line
by providing the adequate keyword.
It is recommended to save the format definition for later use and QC by clicking
on the icon. In the pop-up window, write the name of the format and store it. The
format can be stored at different levels (All surveys, Current survey, Current
OpendTect user level) depending on the usage. Please note that the full grid
names will be saved as provided in the format definition. Press OK when done.
It is highly recommended to scan the input file after providing its format, and prior to
the actual import. The scanned information will pop-up and error(s) or warning(s)
may suggest a change of the format definition. The area subselection is essentially
present to optionally reduce the amount of data to be imported, by reducing the
inline/crossline range(s).
Finally, a horizon must be provided to attach the grid(s) to. Grids will be access-
ible only after having loaded this horizon in the tree. Press Go to launch the import.
4.3.6.3 Geometry 2D
This window is used to import 2D interpretations from ASCII files. Select the input
ASCII file. You may display the input file by pressing the Examine button. The input
file should be column sorted with one point per record (line).
Next, the presence of a file header must be specified and the file format definition
must be provided. The header, if present, can be of fixed length (number of lines),
or delimited on its last line by a keyword.
Predefined and saved file formats are available by pressing the icon. Otherwise
the format must be manually specified. The Define button gives access to the
format definition window.
You must specify in the format definition window the line name, the column numbers
for the position, in terms of an X-Y pair or a unique trace number, and the
horizon(s) column(s). Horizon Z values that should not be read must all have the same
numerical value, which is to be filled in as the Undefined value. Reading may be
stopped at a specific line by providing the adequate keyword.
If a Coordinate Reference System (CRS) is defined for the survey, CRS conversion
will be available in the import window.
It is recommended to save the format definition for later use and QC by clicking
the icon. In the pop-up window, write the name of the format and store it. The
format can be stored at different levels (All surveys, Current survey, Current
OpendTect user level) depending on the usage. Press OK when done.
It is highly recommended to scan the input file after providing its format, and prior to
the actual import. The scanned information will pop-up and error(s) or warning(s)
may suggest a change of the format definition. Press Go to launch the import.
4.3.6.4 Bulk 3D
The bulk import tool allows for the import of multiple 3D horizons from one single
file. The data is matched by name. This has the following implications:
The horizon name must appear on each line of the input file. The horizon name
should not contain spaces, otherwise the matching with a given column number
will not work as expected.
Apart from being a multiple-horizon import tool, it follows the rules of the
standard horizon import.
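The name-based matching can be sketched as follows (a minimal Python illustration, not part of OpendTect; the column layout is a hypothetical example with X, Y and Z in the first three columns and the horizon name in the fourth). It also shows why a space inside a horizon name would shift the columns when splitting on whitespace.

```python
from collections import defaultdict

def group_by_horizon(lines, name_col=3):
    """Group bulk-import rows by the horizon-name column. Names must not
    contain spaces, otherwise whitespace splitting shifts the columns."""
    horizons = defaultdict(list)
    for line in lines:
        cols = line.split()
        if len(cols) <= name_col:
            continue
        x, y, z = (float(cols[i]) for i in range(3))
        horizons[cols[name_col]].append((x, y, z))
    return dict(horizons)
```
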
Format definition
You must specify in the format definition window the column numbers for the pos-
ition, in terms of an X-Y pair or an inline-crossline pair, the point column and the
horizon name. If a Coordinate Reference System (CRS) is defined for the survey,
CRS conversion will be available in the import window.
4.3.7 Import Mute Functions
Mute definitions can be used for pre-processing prestack seismic data.
Mute definitions can be imported into OpendTect using ASCII files. The import win-
dow is launched from the OpendTect main menu (Survey > Import > Mute defin-
itions > Ascii). Select the input ASCII file. You can display the input file by pressing
the Examine button. The input file should be column sorted with one point per
record (line).
The main work is to specify the presence of a file header and the file format defin-
ition. The header, if present, can be of fixed length (number of lines), or delimited
on its last line by a keyword. The mute definition can be either variable throughout
the survey, in which case a position must be provided in the input file for all data
points, or fixed. In this latter case, toggle File contains position to No and provide
any location for the mute definition.
Predefined and saved file formats are available by pressing the icon. Otherwise
the format must be manually specified. The Define button gives access to the
format definition window.
You must specify in the format definition window the column numbers for the pos-
ition, in terms of an X-Y pair or an inline-crossline pair, and the point column, in
terms of an Offset-Z value pair. Points that should not be read must all have the
same numerical value, which is to be filled in as the "Undefined value". The Z units
can be seconds, milliseconds or microseconds (meters or feet in depth surveys).
Reading may be stopped at a specific line by providing the adequate keyword.
It is recommended to save the format definition for later use and QC by clicking
on the icon. In the pop-up window, write the name of the format and store it. The
format can be stored at different levels (All surveys, Current survey, Current
OpendTect user level) depending on the usage. Press OK when done.
4.3.8 Import Navigation Data / 2D Geometry
Import facility for 2D Navigation Data using ASCII files that contain position inform-
ation (e.g. X/Y or Lat/Long), Line name, Trace number and Shot Point number. You
may display the input file by pressing the Examine button.
Format definition
Predefined and saved file formats are available by pressing the icon. Otherwise
the format must be manually specified. The Define button gives access to the
Format definition window. If Coordinate Reference System (CRS) is defined for the
survey, CRS conversion will be available in the import window.
4.3.9 Import Pointsets & Polygons
Point/vector data can be loaded into OpendTect from Survey > Import > Point-
Set/Polygon.
Select the input ASCII file. You can display the input file by pressing the Examine
button. The input file should be column sorted with one position per record (line).
The main work is to specify the presence of a file header and the file format defin-
ition. The header, if present, can be of fixed length (number of lines), or delimited
on its last line by a keyword.
Predefined and saved file formats are available by pressing the icon. Otherwise
the format must be manually specified. The Define button gives access to the
format definition window.
You must specify in the format definition window the column numbers for the pos-
ition, in terms of an X-Y pair or an inline-crossline pair, and the point column. Points
that should not be read must all have the same numerical value, which is to be
filled in as the "Undefined value". The Z units can be seconds, milliseconds or
microseconds. Reading may be stopped at a specific line by providing the
adequate keyword.
If a Coordinate Reference System (CRS) is defined for the survey, CRS conversion
will be available in the import window.
It is recommended to save the format definition for later use and QC by clicking
on the icon. In the pop-up window, write the name of the format and store it. The
format can be stored at different levels (All surveys, Current survey, Current
OpendTect user level) depending on the usage.
The option Import as polygon will flag this specific datatype to the loaded data. It
also adds the constraint that the points must already be ordered in the expected
way, since the import tool will not apply any sorting.
4.3.10 Import Probability Density Functions
Probability density functions can be imported in order to run Bayesian clas-
sifications. The manage tool can later be used to edit the PDF before running the
Bayesian classification.
RokDoc-formatted data is required for importing PDFs into OpendTect. After having
selected the input file, the two contained variables will be shown in the import win-
dow, together with their amplitude ranges and bin sizes.
The variable names and parameters may be modified before pressing the Go but-
ton that will launch the import. The icon to the right can be used to quickly
extend both variable ranges by one bin size outwards.
4.3.11 Import Seismics
Volumes and 2D Lines can be imported in the Survey menu from files in different
modes:
4.3.10.1 SEG-Y
SEG-Y is the standard way to share volumes/lines of data. In OpendTect the files
are loaded in rigorous accordance with the SEG standards using one of the two wiz-
ards:
Most SEG-Y files will be imported in a few clicks, and a number of exceptions can
be set to load the most problematic data. Nevertheless there are a few guidelines
that must be honored:
l The traces must be sorted either by inlines and then crosslines or by crosslines and
then inlines.
l The gathers of prestack data must be consecutive and ordered by increasing offset
(i.e. no common offset sorting).
l Inlines/crosslines or coordinates (and offset) must be written in every single trace
header. Separate navigation data is only supported for 2D lines.
l The traces must have a fixed length.
l There is no support for extended textual headers.
Once the above criteria are respected, you will enjoy a great deal of freedom:
Please read the entire chapter before asking for support. If you need sup-
port, please send us screenshots of each step and (if possible) a scan
report together with a detailed description of your problem.
4.3.10.1.1 SEG-Y Wizard
Data specific wizards are available via Survey > Import > Seismics > SEG-Y. The
layouts are fixed for the data type selected via the menu (Data type option isn't
available in these cases).
Input file(s): Select a SEG-Y file to import. In case of importing multiple 2D or
3D SEG-Y files select any of them first and then use the wildcard *.
Import 3D pre- or poststack data from multiple SEG-Y files: files must contain
consecutive blocks of inlines and be indexed as filename_1.sgy, filename_2.sgy...
Import multiple 2D lines with pre- or poststack data: files must contain indi-
vidual 2D lines and be indexed with the respective line names as filename_lin-
ename1.sgy, filename_linename2.sgy...
Edit: (optionally) edit text, binary and trace headers of a SEG-Y file in Manipulate
SEG-Y File window.
Data Type: the choice is only available in the generic Import SEG-Y Data window.
l 3D seismic data
l 3D PreStack data
l 2D seismic data
l 2D PreStack data
Table: information required to import a SEG-Y file (therefore the table layout
depends on the data type).
SEG-Y Revision (default = byte 301 of binary header): please refer to SEG stand-
ards for details.
l SEG-Y Rev. 0:
o Data format, Number of samples, Z Range start/interval can be overruled.
o Data positioning (IL/XL, Trace/SP, X/Y and offset): byte locations can be selected
by a user.
l SEG-Y Rev. 1:
o Data format, Number of samples, Z Range start/interval can be overruled.
o Data positioning (IL/XL, Trace/SP, X/Y and offset): standard byte locations are
used.
l SEG-Y Rev. 2:
o File format options and standard trace header bytes are the same as Rev. 1.
Most header values and data samples are written using several bytes for
each word/sample; therefore knowing the correct byte order is a necessity.
All SEG-Y standards (Rev. 0, 1 and 2) require big-endian byte order.
Occasionally one can run into data written using the little-endian (reversed)
byte order. Using standard SEG-Y data formats for reading such data results
in unexpected scanned values of trace headers and unexpectedly large sample
values (check the histogram). In this case use the data formats with the
(byte swapped) option.
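The plausibility reasoning above can be sketched in Python (a hypothetical helper, not part of OpendTect): the 2-byte sample interval stored at bytes 17-18 of the binary header is unpacked in both byte orders, and normally only one interpretation gives a plausible sampling rate. The plausible range used here (1000-8000 microseconds, i.e. 1-8 ms) is an assumption for typical seismic data.

```python
import struct

def guess_byte_order(binary_header: bytes) -> str:
    """Heuristic byte-order check using the 2-byte sample interval
    (bytes 17-18 of the 400-byte binary header)."""
    raw = binary_header[16:18]
    big = struct.unpack(">H", raw)[0]      # big-endian reading
    little = struct.unpack("<H", raw)[0]   # byte-swapped reading
    plausible = lambda v: 1000 <= v <= 8000  # 1-8 ms sampling (assumed)
    if plausible(big) and not plausible(little):
        return "big-endian"
    if plausible(little) and not plausible(big):
        return "little-endian (byte swapped)"
    return "undetermined"
```
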
l start (default = bytes 105 (LagA) and 109 (DelRt) of the trace headers): (optionally)
overrule the start of the Z-range (a negative start is allowed).
l interval (default = byte 17 of binary header / byte 117 of trace header): (optionally)
overrule Z-sampling (sampling rate).
Data positioning:
Rev.1/Rev.2: trace header byte locations are standard, i.e. they cannot be selec-
ted by a user. If the file is wrongly tagged as Rev.1/Rev.2, overrule it to 0
in the Actually use column in order to be able to select non-standard trace
header bytes.
2D poststack data is loaded based on trace numbers, reference numbers (SP)
and X/Y coordinates.
Trace number must be unique for each trace along the line, therefore it can
be either sequential trace number (byte 5) or CDP trace number (byte 21).
A user is always allowed to select a non-standard byte even when the file
is Rev.1.
l Offset range
o In file: from a specified trace header byte.
o From Src/Rcv (X/Y): calculate from source and receiver X/Y coordinates (standard
byte locations are used: 73 and 77 for source, 81 and 85 for receiver).
o Generate: generate offsets by providing the offset of the first trace in a gather and
a regular offset step between consecutive traces.
Store this setup: save a SEG-Y import setup at the survey data root level.
Scan the entire input: updates Quick scan result with Full scan result of the
entire input file.
Examine input file: opens SEG-Y Examiner window for a specified Number of
traces to examine (default=1000 traces).
Note that scalco is the scale factor for all coordinate bytes, with value plus
or minus 10 to the power 0, 1, 2, 3 or 4 (if positive, multiply; if negative,
divide).
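The scalco rule, and the From Src/Rcv (X/Y) offset derivation mentioned above, can be sketched as follows (hypothetical Python helpers, not part of OpendTect):

```python
import math

def apply_scalco(raw: int, scalco: int) -> float:
    """SEG-Y coordinate scalar: positive -> multiply, negative -> divide
    by its magnitude; 0 is treated here as no scaling."""
    if scalco > 0:
        return float(raw * scalco)
    if scalco < 0:
        return raw / abs(scalco)
    return float(raw)

def src_rcv_offset(sx, sy, gx, gy, scalco=1):
    """Offset derived from source (bytes 73/77) and receiver (bytes 81/85)
    coordinates, after applying the coordinate scalar to each value."""
    sx, sy, gx, gy = (apply_scalco(v, scalco) for v in (sx, sy, gx, gy))
    return math.hypot(gx - sx, gy - sy)
```
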
l Percentage clip for display (default=0.1%): amount of data in the histogram tails
excluded from the plot.
l Zeros (default=unchecked): allows including/excluding the value 0 in the histogram display.
2D (pre- and poststack) data Z-range can vary per SEG-Y file if multiple lines of
different vintages are imported at the same time:
l File Z's (available only when multiple 2D lines are imported): if checked, Z
Range start/interval and Number of samples are used as they appear in each SEG-Y
file.
4.3.10.1.1.1 Import Window
Copy data:
l Yes (import):
o data is imported to CBVS format (internal OpendTect format for seismic data);
l No (scan and link):
o link to a SEG-Y file, i.e. no data duplication;
o if a SEG-Y file is moved or renamed outside OpendTect, the link can be restored
Null traces:
Output Cube: type in a cube name to be used in the OpendTect project and
choose its format:
Line name:
l Single 2D line: either type in a new line name or select one of the existing ones.
l Multiple 2D lines: the field is greyed out, and line names come from parts of file
names replaced by a wildcard *.
Coordinates:
l The trace headers: use trace headers as specified in the Import SEG-Y Data wizard.
l A 'Nr X Y' file (bend-points needed): (optionally) the coordinates can be specified
using an auxiliary navigation file. The format should be an ASCII file with one position
per line in a fixed column format without header: trace number, X and Y coordinates.
Units to be used are the same as specified in the survey definition.
o Single 2D line: select a file containing a navigation survey.
o Multiple 2D lines: navigation files must have the same names as the corresponding
SEG-Y files.
2D datasets are used in OpendTect to group data of the same kind which
can be processed and interpreted together (i.e. a particular 2D vintage or
type of data). An OpendTect survey can have many 2D datasets, which
can share common 2D geometries.
4.3.10.1.2 Classic SEG-Y Import Tool
The Classic SEG-Y Import Tool used to be the main utility for import of SEG-Y data
prior to OpendTect 6.0. This wizard can still be launched via:
l Survey > Import > Seismics > SEG-Y > Classic tool;
l from the SEG-Y Wizard introduced in OpendTect 6.0 by clicking on 'Classic'.
It allows importing 2D and 3D pre- and poststack seismic data from one or multiple
SEG-Y files. The import consists of two mandatory steps: SEG-Y Import Pre-
paration and SEG-Y Import itself.
4.3.10.1.2.1 SEG-Y Import Preparation
Import 3D pre- or poststack data from multiple SEG-Y files: files must contain
consecutive blocks of inlines and be indexed as filename_1.sgy, filename_2.sgy...
Import multiple 2D lines with pre- or poststack data: files must contain indi-
vidual 2D lines and be indexed with the respective line names as filename_lin-
ename1.sgy, filename_linename2.sgy...
In the 2D SEG-Y multi-import window, replace the line name in the file name with the
wildcard #L (see the SEG-Y Import page for more details).
Manipulate...: opens Manipulate SEG-Y file window where text, binary and trace
headers of a SEG-Y file can be edited.
Multiple files (default = toggled off): toggle on if a single 3D dataset (pre- or post-
stack) is imported from multiple SEG-Y files (see above how to Import 3D pre- or
poststack data from multiple SEG-Y files):
l Numbers: specify indexes of the first and the last files as well as index Step.
l 3D Volume
l Pre-Stack Volume
l 2D Line
l Line 2D Pre-Stack
l Save as default: (optionally) check to save the Number of traces to examine in your
user settings.
In most cases you can press Next after you have selected the input file(s). Con-
sider overruling the options listed below (information coming from the file(s)
themselves) only if you have a priori knowledge about problems with the file(s).
Overrule SEG-Y number of samples (default = toggled off, i.e. standard locations
are used: byte 21 of binary header / byte 115 of trace header): (optionally) overrule
the number of samples per trace.
Bytes swapped (default = toggled off, i.e. big-endian byte order used): (optionally)
toggle on to use little-endian byte order for reading data.
Most header values and data samples are written using several bytes for
each word/sample. Therefore knowing a correct byte order is a necessity.
All SEG-Y standards (Rev. 0, 1 and 2) require using big-endian byte order.
Occasionally one can run into data written using little-endian (reverse) one.
Using standard SEG-Y data formats for reading such data results in unex-
pected scanned values of trace headers and unexpectedly large sample
values (check the histogram in SEG-Y Examiner).
Store this setup: save a SEG-Y import setup at the survey data root level.
4.3.10.1.2.2 SEG-Y Revision
After SEG-Y import preparation a pop-up question asks to Determine SEG-Y Revi-
sion of the input file(s). The default choice is auto-selected based on byte 301 of
the file's binary header, but can be overruled by a user. The choice determines the
layout of import SEG-Y window.
l No: the file is NOT SEG-Y Rev.1 - treat as legacy (i.e. Rev.0) - For unlucky users
the revision 1 flag might wrongly be set to "Yes" in the line header while obviously the
file does not comply with the SEG-Y revision 1 norm. This may happen when soft-
ware blindly copies entire headers without refreshing all the necessary characters.
"Rev.0" is an older way of loading SEG-Y files: the inline and crossline byte locations
must be present in the trace headers, and these locations must be provided to the soft-
ware since they may vary from file to file.
l Mostly: It's Rev.1 but I may need to overrule some things - The file is 100% SEG-
Y Rev.1 but you would like to overrule some particular information e.g. Coordinates,
Sampling rate and Start time etc.
l Yes: I know the file is 100% correct SEG-Y Rev.1 - For lucky people, this is by far
the easiest and quickest way to import your SEG-Y file. No additional settings are
required except the final output volume/line name in the OpendTect database.
After selecting appropriate option, press Next to reach the import window.
You can save the answer to this question in your survey settings by activating the
option "Don't ask again for this survey". If you set this flag by mistake and wish to
go back you need to edit the ".defs" file (in the OpendTect survey directory) with a
text editor and remove the line "SEG-Y Rev. 1 policy:".
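A minimal sketch of that manual edit (Python; it assumes the ".defs" file is a plain key: value text file, and the function name is hypothetical, not part of OpendTect):

```python
def drop_revision_policy(defs_text: str) -> str:
    """Return the .defs contents with the stored 'SEG-Y Rev. 1 policy'
    line removed, so the revision question is asked again."""
    return "".join(
        line for line in defs_text.splitlines(keepends=True)
        if not line.lstrip().startswith("SEG-Y Rev. 1 policy")
    )
```
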
4.3.10.1.2.3 SEG-Y Import
Rev.1
If the file is Rev 1 standard then the import is almost complete: You must provide
an output name and can optionally sub-select a range of the volume to be loaded
and/or change the output format and/or re-scale the amplitudes. Pressing Go will
launch the import.
Depth volumes can be imported in time surveys and vice versa by using
the depth/time toggle. They can be visualized using transformed scenes,
providing that velocities are available.
Mostly Rev.1
If the file is mostly Rev 1 but with changed parameters you will receive three addi-
tional fields that can overrule the values to be found in the headers:
l Overrule SEG-Y coordinate scaling: all trace coordinates are multiplied by this scalar.
l Overrule SEG-Y start time/depth (units: ms, m or ft): time/depth of the first sample of
the traces (can be negative).
l Overrule SEG-Y sample rate (units: ms, m or ft): provide the data sampling rate.
These parameters constitute a SEG-Y setup that can be saved and retrieved using
the yellow folder icon on the right. This setup contains not only the parameters
but also the path of the input files and the settings of the preparation step. The
setups are data dependent; therefore they are stored in your survey.
Not Rev.1
If the file is not Rev 1 you will get the overrule fields described above in a tab and
two additional tabs to provide the byte locations of either the pair inline/crossline or
the pair of X and Y coordinates. In the case of prestack data an additional tab will
be present to provide the offsets/azimuth byte locations.
You need to look at the trace headers in the examine window and assign the cor-
rect settings in this import window. Once again the entire import setup may be
saved or retrieved. Once this is done you must provide an output name and can
optionally sub-select a range of the volume to be loaded and/or change the output
format and/or re-scale the amplitudes. Pressing OK will launch the import.
The import of 2D lines is somewhat different: inlines and crosslines are replaced
by trace numbers, which must be unique for each trace (therefore it can be the CDP
but not the Shot Point).
Coordinates can be imported from an auxiliary file, or specified manually and gen-
erated during import if they are missing or wrong in the trace headers (see
below on the right-hand side). This can be done by toggling off the X-coord byte
field.
Generate XYs
The coordinates are generated for each trace position by providing the X and Y
coordinates of the first trace, and a regular step in both directions. Units to be used
are the same as specified in the survey definition.
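The coordinate generation described above can be sketched as (a hypothetical Python helper, not part of OpendTect):

```python
def generate_xys(x0, y0, dx, dy, ntraces):
    """Coordinates for consecutive traces, from the first-trace position
    (x0, y0) and a regular step (dx, dy) in both directions,
    in survey XY units."""
    return [(x0 + i * dx, y0 + i * dy) for i in range(ntraces)]
```
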
Optionally the coordinates can be specified using an auxiliary file. The format
should be an input ASCII file with one position per line in a fixed column format
without header: the first column should contain the trace number, the second column
the X coordinate and the third column the Y coordinate. Units to be used are the
same as specified in the survey definition.
The line name is most often part of the input file name. It will be used only if a
single line is loaded. Otherwise the line name is extracted as a part of the filename
(see further below).
The Output data set name represents a 2D survey that comprises one or more
lines. An OpendTect survey can have many 2D surveys (data sets), which are
groups of 2D lines that can be selected together for processing and interpretation.
There is no format/scaling for 2D lines; the SEG-Y data format defines the
OpendTect format. However, there is a trace sub-selection option to select either a
trace "Range" or "All".
A default attribute name "seis" will be given to each line of the loaded data set.
This can be changed by pressing "Select" and filling the empty "Attribute" field, like
in the example below:
Multiple 2D lines loading must be enabled using the button "Import more, similar
file" on the last line before pressing "OK". Any line can be used to go through the
wizard, and the settings must be the same for all lines. If that is not the case then it
is best to run the wizard several times per group of lines of similar SEG-Y settings.
This additional window is used to specify the generic line name out of the SEG-Y
filenames. The line name must be replaced by "#L", while everything else (includ-
ing the path and the extension) is typed as plain text, as in the example below:
$DATAPATH/Line_#L.sgy
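A minimal Python sketch of how such a "#L" wildcard resolves to a line name (the function is a hypothetical illustration, not part of OpendTect):

```python
import re

def line_name_from_filename(pattern: str, filename: str):
    """Extract the 2D line name by matching a '#L' wildcard pattern,
    e.g. pattern 'Line_#L.sgy' against 'Line_A12.sgy' -> 'A12'."""
    # escape the literal parts, then turn the '#L' wildcard into a group
    regex = re.escape(pattern).replace(re.escape("#L"), "(.+)")
    m = re.fullmatch(regex, filename)
    return m.group(1) if m else None
```
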
There will be one progress bar per input file during loading.
4.3.10.1.2.4 SEG-Y Scan
SEG-Y Scan is a useful tool to get an idea about the content of a SEG-Y file, and to
check the loading settings. It is best performed on a limited number of traces
(default is 100) when checking the loading parameters, and on the entire file when
extracting geometry and ranges.
At any moment a partial or full scan can be launched using the icon on the right-hand
side in the Import SEG-Y window.
SEG-Y scanning is used to derive the survey geometry from a SEG-Y file. A suc-
cessful scan will generally result in a successful loading, provided the numerical
values returned are in line with the expected parameters (inline range, coordinate
scaling ...).
4.3.10.1.3 SEG-Y Examiner
l Text (EBCDIC) header (3200 bytes: 40 lines, 80 symbols each): free-format text com-
monly containing processing history, data ranges, and non-standard trace header
byte locations.
l Binary header (400 bytes): scroll down to see a summary of non-zero values in the
binary header which commonly includes info about data format, SEG-Y revision and
Z-range (sampling rate and number of samples).
l Table:
o trace header byte numbers and shortened field names are shown in the first
column;
o trace header values of examined traces are shown in the following columns.
l Graph: plotting any individual trace header against the examined trace numbers is a
very good way to visualize its range and inspect it for discrepancies:
o highlight any trace header (a row) in the table to see its graph versus consecutive
trace number;
o an extended explanation of a highlighted trace header is given above the graph.
Use the lateral scroll bar at the bottom to locate the headers with changing
values. Those might be trace numbers, crossline numbers, offsets, coordin-
ates etc. Inline numbers are often written to the bytes preceding the crosslines.
Most 3D data is inline sorted; therefore choose a sufficient number of traces
to examine (several times more than the number of crosslines per inline) in
order to see the increasing inline numbering.
Toolbar icons: Rubberband zoom, Flip left/right.
4.3.10.1.4 Manipulate SEG-Y File
This powerful utility (available via the SEG-Y wizard or the classic SEG-Y import tool)
allows you to edit the text, binary and trace headers of a SEG-Y file. All changes made
to SEG-Y headers can only be saved to a new Output file.
l Trace headers: follow the steps below to edit trace headers:
l Select a trace header to edit in the left list and press the icon;
l In the Header Calculation window type in a Formula and press OK.
l For information about the supported operators, functions and constants please
refer to the documentation.
l Trace header definitions appear in the right list and may be edited or
deleted. The set of definitions can also be saved for later use and
restored when needed. Note that the table on the right already reflects
these changes for QC (see below).
QC Trace Headers
l Table: displays trace headers of an individual trace number Trc.
l Plot:
l Highlight trace header(s) in the table to plot against trace number.
4.3.10.2 SEG-Y Scanned
Data duplication is a large problem when working with large datasets. All other
import tools generate new OpendTect files from ASCII or binary files. This type
of SEG-Y import works differently: it will not create any file, but will link an
existing SEG-Y file to an OpendTect entry, selectable like any other OpendTect
data.
This special import tool is available for both stacked and prestack data.
However, it should be noted that for prestack data the reading and processing
performance may be lower than with OpendTect prestack datastores, which are
optimized by importing the usual way. The import procedure itself is identical to the
normal SEG-Y import. The only difference is that there will not be any loading after
completing the wizard.
Please note that since this tool links to an existing file, moving or renaming the file
outside OpendTect will break the link and make the dataset unavailable.
4.3.10.3 Simple File
The user can import a simple ASCII or binary file by using the plain-file Seismic I/O
plugin. This can be reached via Survey > Import > Seismics > Simple File > 3D or 2D
(Pre/Poststack) etc.
The input file must first be selected and its data format type specified, either
ASCII or binary (4-byte floats). All data must be in the 'local' format, because a
blunt binary read/write is performed.
(Part of) the input file can be visualized by pressing the examine button. The data
must consists of one trace per record (line). The samples are thus in columns,
from shallowest to deepest, with a regular step. The trace position and time/depth
index can be read from the input file, left of the trace, or can be provided. If
provided, start, step and number of samples are requested in the corresponding dir-
ections, assuming the input file if regular and does not contain holes. Poststack
volume must be sorted by inlines, crossline, (offset), Z (time or depth).
Optionally, the user can scale the cube before loading as well by mentioning the
amount of shift and the corresponding factor. Either pass or discard the null traces
before loading.
The easiest way to see what the format looks like is by producing a little export file
from a bit of seismics. In the example below we exported inlines 500-501 and
crosslines 600-603, Z range 1000-1020 step 4 (which is 6 samples):
1000 4 6
500 600 1456 -688 -1502 4955 8935 1209
500 601 1429 -640 -967 5248 8362 527
500 602 1353 -424 -1040 5071 8059 -64
500 603 1428 -587 -1244 5139 8447 13
501 600 1450 -411 -1414 4792 8449 1117
501 601 1619 -456 -1243 4695 8271 702
501 602 1617 -213 -1272 4675 7903 393
501 603 1552 -248 -1088 4875 8004 204
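The layout above can be checked with a small script. The following is a minimal sketch of a reader for this simple poststack format (an optional sampling line "z_start z_step n_samples", then one trace per line as "inline crossline amp1 ... ampN"); the helper name and parsing details are illustrative, not part of OpendTect:

```python
# Minimal sketch of a reader for the simple poststack export format shown
# above. File layout is taken from the example; this is not OpendTect code.

def read_simple_file(lines):
    """Parse simple-format text lines into (sampling, traces)."""
    z_start, z_step, n_samples = (float(v) for v in lines[0].split())
    traces = {}
    for line in lines[1:]:
        cols = line.split()
        inl, crl = int(cols[0]), int(cols[1])
        amps = [float(v) for v in cols[2:]]
        assert len(amps) == int(n_samples), "irregular trace length"
        traces[(inl, crl)] = amps
    return (z_start, z_step, int(n_samples)), traces

example = [
    "1000 4 6",
    "500 600 1456 -688 -1502 4955 8935 1209",
    "500 601 1429 -640 -967 5248 8362 527",
]
sampling, traces = read_simple_file(example)
print(sampling)            # (1000.0, 4.0, 6)
print(traces[(500, 600)])  # the first trace's six amplitudes
```

The assertion on trace length mirrors the requirement that the file be regular, without holes.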
Simple poststack 3D and 2D Seismic File Import Window
Simple Prestack 3D and 2D Seismic File Import Window
4.3.10.4 Import CBVS Cube
Seismic data in CBVS format can be imported either directly from a CBVS file or
from other survey(s), as shown below.
From file
This module enables the exchange of data between OpendTect projects. The ori-
ginal CBVS (Common Binary Volume Storage) file can be located with a standard
file browser. Some CBVS volumes are stored in several sub-files. These can be
recognized by the ^01 or ^02 (etc.) in the filename. To import the complete volume,
select the base file without any ^xx marks.
The Cube type needs to be specified in order to give it the correct label for the soft-
ware.
The Import mode indicates whether the file should be left at its original place and
just be linked to the current survey (Use in-place), or whether the volume should be
copied entirely into the current survey directory (Copy the data). Moreover, while
importing, the volume can be sub-selected (selected inlines/crosslines/time ranges)
by pressing the Select button in front of the Volume subselection field. If the data
contains null traces, either discard or pass the traces by selecting the respective radio
button. Before importing the CBVS volume, scaling (16-bit, 32-bit, etc.) can be
applied to the volume. The Output Cube field corresponds to the output volume
name (that will be available in the Manage Seismics window) for the input file.
Seismic cubes in CBVS format can also be imported from other survey(s).
A few settings can be changed prior to importing the file: the volume subselection,
the type of interpolation (sinc interpolation or nearest trace), and the stepout in
inline and crossline.
4.3.10.5 GPR - DZT
Ground penetrating radar offers an accurate solution to mapping the subsurface of
the earth. It locates features of interest and subsurface layers in real time. The GPR
data visualization and interpretation can be made in OpendTect, which enables
the user to import the files made by GSSI Ground Penetrating Radar (GPR) sys-
tems in the 'DZT' format. The result is a 2D line in OpendTect.
Prior to loading GPR data, the 2D survey should be set up according to the GPR
acquisition setup. The data files are then imported as 2D geometries. The fol-
lowing Import GPR Seismics window allows the user to select one line and import
the line according to the given setup. The sampling rate in
OpendTect is defined in milliseconds. However, the DZT files are often sampled
at a nanosecond sampling rate. To adjust for this, there is an input field available,
'Z factor', that allows rescaling of the Z-axis (time). In order to visualize the data in
OpendTect, this factor should be large enough. The remaining parameters, i.e. Start
X,Y position and X/Y steps, can be filled in according to the profile location.
4.3.10.6 Tagged Seismic Data
The imported volumes may contain any data. However, several types can be spe-
cified during import and/or afterwards:
l Depth poststack volumes/lines loaded in time surveys. A check box must be toggled
on during SEG-Y or simple file import.
l Time poststack volumes/lines loaded in depth surveys. A check box must be toggled
on during SEG-Y or simple file import.
l Velocity/anisotropy volumes must be tagged with respect to their type when imported
from an external file. The available types are:
l Vint: Interval velocity
l Vrms: RMS velocities (time domain only). A surface may be provided for elevation
statics in meters. For time surveys a statics velocity must be provided in m/s, either
from a velocity grid or using a constant velocity.
l Vavg: Defined as the ratio between the depth and the travel time: Vavg(TWT)
= 2*Z/TWT = Z/OWT.
l Delta: Thomsen anisotropy parameter of the same name.
l Epsilon: Thomsen anisotropy parameter of the same name.
l Eta: Effective anisotropy parameter, combined from delta and epsilon. This tag can
also be used to grid another quantity (the software does not actually check that eta val-
ues are input).
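For reference, eta is commonly combined from delta and epsilon using the definition from anisotropic velocity analysis, eta = (epsilon - delta) / (1 + 2*delta). Whether OpendTect applies exactly this expression is an assumption; the sketch below only illustrates the textbook formula:

```python
# Effective anisotropy parameter eta, as commonly defined in the literature
# (Alkhalifah-Tsvankin): eta = (epsilon - delta) / (1 + 2*delta).
# That OpendTect uses exactly this expression is an assumption here.

def eta(epsilon, delta):
    return (epsilon - delta) / (1.0 + 2.0 * delta)

# Typical weak-anisotropy shale values (illustrative only):
print(round(eta(epsilon=0.20, delta=0.10), 4))  # 0.0833
```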
The assignment of velocity types (and properties) to a volume is called velocity edit-
ing. The velocity editing window can be opened from most windows wherever velocity
volumes are used (an exception is the attribute set window):
l Volume gridding
l Time-to-depth scenes
l Time-to-depth conversions
l Velocity conversions
The velocity volumes may be scanned to get the Vavg range at their first and last
sample. This allows the software to deduce and propose appropriate time/depth
ranges during conversions (on-the-fly and batch).
4.3.11 Import Velocity Functions
Velocity functions can be imported to OpendTect using ASCII files that contain pos-
ition information (e.g. X/Y or Inl/Crl), Z and velocity values. You may display the
input file by pressing the Examine button.
After importing velocity functions (irregularly sampled data), use the Velocity Gridder
to create a velocity field which can be displayed and used for domain conversions.
Velocity type
l Vint: Interval Velocity; the amplitude of a point accounts for the layer above it.
l Vrms: RMS Velocity can only be used in time surveys. It will be treated with a
simple Dix inversion to extract the time-depth relationship. If this type is selected, you
need to specify whether it has statics or not.
l Vavg: Average Velocity is the ratio at a given depth between depth and travel-time:
Vavg = 2*Z/TWT = Z/OWT. As a result a Vavg quantity entirely holds the data from
time-depth pairs.
l Delta, epsilon and eta functions will be vertically interpolated with linear inter-
polation.
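The "simple Dix" step mentioned for Vrms can be sketched as follows. This is the standard textbook Dix equation; that OpendTect's conversion reduces to exactly this form is an assumption:

```python
import math

# Dix equation: convert an RMS velocity function (time domain) into interval
# velocities. t is two-way time in seconds, vrms in m/s. Standard textbook
# form, assumed here; not taken from the OpendTect source.

def dix_vint(t, vrms):
    """Interval velocity of each layer between consecutive time samples."""
    vint = [vrms[0]]  # first layer: Vint equals Vrms down to t[0]
    for i in range(1, len(t)):
        num = vrms[i] ** 2 * t[i] - vrms[i - 1] ** 2 * t[i - 1]
        vint.append(math.sqrt(num / (t[i] - t[i - 1])))
    return vint

t = [0.5, 1.0, 1.5]              # TWT (s)
vrms = [1500.0, 1800.0, 2000.0]  # RMS velocities (m/s)
print([round(v) for v in dix_vint(t, vrms)])
```

Note that Dix inversion can produce unstable (even imaginary) interval velocities when the Vrms function is noisy, which is one reason gridded, edited velocity fields are preferred for domain conversion.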
Format definition
Predefined and saved file formats are available by pressing the icon. Otherwise
the format must be manually specified. The Define button gives access to the
Format definition window. If a Coordinate Reference System (CRS) is defined for the
survey, CRS conversion will be available in the import window.
It is recommended to save the format definition for later use by clicking on the
icon. In the pop-up window, write the name of the format and store it at an appro-
priate level (All surveys, This survey only, or My user ID only) depending on the
usage.
4.3.12 Import Wavelets
Wavelets can be imported into OpendTect as ASCII files. The input file should be
column sorted with one time/depth sample position per record. The input file can
be displayed by pressing the Examine button.
Wavelets can be used for synthetic-to-seismic tie, or convolution via the convolve
attribute. Wavelets are also used to store all kinds of frequency-derived operators,
like seismic spectral blueing and seismic coloured inversion operators.
Format definition
Pre-defined and saved file formats are available by pressing the icon. Other-
wise, the format must be manually specified. The Define button gives access to the
Format definition window:
l Center sample: the number of lines between the first sample and the center sample (if
provided incorrectly, a warning message suggests an auto-predicted number).
l Data samples: the column containing amplitude values.
l Stop reading at: optional keyword to stop reading the file.
When a file header is present, both the sampling interval and the center sample can
be dynamically extracted from it by providing keywords for each, or indexed using
column positions.
It is recommended to save the format definition for later use by clicking on the
icon. In the pop-up window, write the name of the format and store it. The format can
be stored at different levels (All surveys, Current survey, Current OpendTect user
level) depending on the usage. Press OK when done.
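A column-sorted wavelet file of the kind described above can be produced programmatically. The sketch below writes a zero-phase Ricker wavelet with one amplitude per record; the exact file layout (a plain single amplitude column) is an assumption for illustration, to be matched with the format definition on import:

```python
import math

# Writes a zero-phase Ricker wavelet as a column-sorted ASCII file with one
# amplitude sample per record, matching the import description above. The
# single-column layout is an assumption; adapt the format definition on import.

def ricker(peak_freq_hz, sample_rate_s, half_length):
    """Amplitudes for samples -half_length..+half_length around the center."""
    amps = []
    for i in range(-half_length, half_length + 1):
        t = i * sample_rate_s
        a = (math.pi * peak_freq_hz * t) ** 2
        amps.append((1.0 - 2.0 * a) * math.exp(-a))
    return amps

amps = ricker(peak_freq_hz=25.0, sample_rate_s=0.004, half_length=16)
with open("ricker_25hz.txt", "w") as f:
    f.write("\n".join("%.6f" % a for a in amps))
# The center sample is at line half_length + 1 when counting from 1.
```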
4.3.13 Import Wells
Well data in OpendTect is organized into four sub-categories: well tracks, well
logs, markers (well tops) and time-depth models. Each category can be imported
via Survey > Import > Wells menu:
Alternatively, import (and editing) of well data is available from the Manage Wells
window (Survey > Manage > Manage Wells):
Reference datums used in OpendTect are schematically shown in the figure
below:
Well depths in OpendTect are always referenced using their Measured Depth
(MD). The alignment with seismic data is done using well track data (deviation sur-
vey) and time-depth (and/or checkshot) data. The well track data provides the rela-
tion between lateral coordinates, True Vertical Depth Sub Sea (TVDSS) and MD
values. The time-depth data provides the relation between MD and Two Way
Times (TWT).
4.3.13.1 From ASCII Files
Single well import is available via Survey > Import > Wells > ASCII menu:
l Track: import of deviation survey and time-depth model (and/or checkshot data).
l Logs: import of well logs from LAS files.
l Markers: import of markers (well tops).
4.3.13.1.1 Track
The well track (deviation survey) of a single well can be imported to OpendTect using
a column sorted ASCII file, or defined as vertical, via Survey > Import > Wells > ASCII
> Track... In time surveys, the time-depth model must either be imported using a
column sorted ASCII file or temporarily defined as a constant velocity at this step.
The well track is the core part of a well; it is required for the visualization of
well data in depth and further loading of markers and logs. The well track
determines the size of the usable and displayed well data. Any log or
marker outside of the track Z range will neither be usable nor displayed.
On the other hand, the well track is not limited to the survey Z range and
can be loaded outside the survey box.
Well Track File
When the box at the top of the Import Well Track window is checked, an ASCII file
containing a well track, vertical or deviated, can be selected.
Reference datums used in OpendTect are schematically shown in the fig-
ure below (note that Measured Depth [MD] is always referenced from Kelly
Bushing [KB]):
For a deviated well, the file must contain 4 columns: position information
(X/Y or Inl/Crl), true vertical depth sub sea (TVDSS) and MD.
For a vertical well, the file must contain at least 3 columns: position inform-
ation (X/Y or Inl/Crl) and at least one depth column, TVDSS or MD (the Refer-
ence Datum Elevation [KB] value must be specified in this case).
The best way to ensure that the reference datum elevation is properly set is
to have the deviation survey file starting at MD = 0 and TVDSS of KB. In
the example below: KB elevation of a well is 34.1 m above MSL, i.e. in
OpendTect TVDSS at KB is -34.1 m, which corresponds to MD=0.0 m:
X Y Z (TVDSS) MD
623255.98 6082586.87 -34.10 0.00
623255.98 6082586.87 0.00 34.10
623255.98 6082586.87 65.90 100
623255.84 6082591.69 440.86 475
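The conventions stated above (MD = 0 at TVDSS = -KB, MD increasing downhole) can be verified before import with a quick script. This is an illustrative helper, not OpendTect code; the extra check that an MD increment can never be smaller than the TVDSS increment follows from MD being measured along the borehole:

```python
# Quick sanity checks for a deviation-survey file like the example above
# (columns: X, Y, TVDSS, MD). Illustrative helper, not OpendTect code.

def check_track(rows, kb_elevation):
    x0, y0, tvdss0, md0 = rows[0]
    assert md0 == 0.0 and abs(tvdss0 + kb_elevation) < 1e-6, \
        "first record should be MD=0 at TVDSS=-KB"
    for (_, _, z_a, md_a), (_, _, z_b, md_b) in zip(rows, rows[1:]):
        assert md_b > md_a, "MD must increase downhole"
        assert (md_b - md_a) + 1e-6 >= (z_b - z_a), \
            "MD step cannot be smaller than the TVDSS step"

rows = [
    (623255.98, 6082586.87, -34.10, 0.00),
    (623255.98, 6082586.87, 0.00, 34.10),
    (623255.98, 6082586.87, 65.90, 100.0),
    (623255.84, 6082591.69, 440.86, 475.0),
]
check_track(rows, kb_elevation=34.1)  # passes silently for the example data
```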
Format definition
Predefined and saved file formats are available by clicking on the icon. Otherwise
the format must be manually specified by clicking on the Define button and selecting
the column numbers corresponding to position information (X/Y or Inl/Crl), Z and MD.
If a Coordinate Reference System (CRS) is defined for the survey, CRS conversion
will be available in the import window.
l X and Y are absolute coordinates (not relative to the surface coordinates) and must
have the same units as the OpendTect survey coordinates.
l Z is TVDSS, increasing downwards and equal to zero at sea level.
l For a vertical well, either Z or MD can be left unspecified (col:0). In this case the
Reference Datum Elevation [KB] value must be provided in the main Import Well
Track window.
It is recommended to save the format definition for later use and QC, by clicking
on the icon. The format can be stored at different levels (All surveys, This survey
only, or My user ID only) depending on the usage.
Vertical well
When the box at the top of the Import Well Track window is unchecked, a vertical
well can be created by entering its surface coordinates, Reference Datum Elevation
[KB] and Total Depth [TD].
Time-Depth Model
If Depth to Time model file is checked, a file containing the time-depth model
can be provided as an ASCII file containing depth as TVDSS, TVD-SRD or MD. If
a time-depth model is unavailable, the check box at the left of this field can be
deselected and a temporary model velocity value (m/s) should be provided.
Predefined and saved formats are again available by pressing the icon. Other-
wise the format must be manually specified. The Define button gives access to the
format definition window.
You must specify in the format definition window the columns where depths and
times are located, and the type of data to be expected. Three types of depths are
supported for loading a check-shot/time-depth curve from a file: MD, TVDSS and
TVD rel SRD. Time values can be either one-way or two-
way traveltimes. Times (lines) that should not be read must all have the same
numerical value, which is to be filled in as the "Undefined value".
Time-depth models are always stored using measured depths and two-way travel
times in seconds. Therefore any other input format will cause a conversion of the
input data. Data loading can be stopped at a specific line by providing the
adequate keyword.
It is mandatory that the time-depth model obeys the following requirement:
TWT = 0.0 ms corresponds to TVDSS at SRD. The best way to ensure this
is to have such a line in the imported file. For example, if SRD is 1000.0 m
above MSL, i.e. in OpendTect TVDSS at SRD is -1000.0 m, then the file
should contain a line with the following TVDSS (m) - TWT (ms) pair:
-1000.0 m, 0.0 ms.
It is highly recommended that the 2nd sample of the time-depth model cor-
responds to the start depth of your sonic log, unless the input is a meas-
ured checkshot survey.
The Time-Depth model used during import can be either a checkshot model or a
"normal" time depth curve. More information can be found in the well management
chapter.
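The conversion to the stored convention (measured depth in meters, two-way time in seconds) can be sketched for the simplest case. The vertical-well shortcut MD = TVDSS + KB is an assumption made here for illustration; for deviated wells OpendTect resolves MD through the well track instead:

```python
# Sketch of the conversion the text describes: time-depth input given as
# TVDSS (m) and one-way time (ms) is converted to the stored convention,
# measured depth (m) and two-way time (s). A vertical well is assumed so
# that MD = TVDSS + KB. Illustrative only, not the OpendTect implementation.

def to_stored_model(pairs_tvdss_owt_ms, kb_elevation_m):
    stored = []
    for tvdss, owt_ms in pairs_tvdss_owt_ms:
        md = tvdss + kb_elevation_m    # vertical-well assumption
        twt_s = 2.0 * owt_ms / 1000.0  # OWT in ms -> TWT in s
        stored.append((md, twt_s))
    return stored

# SRD at MSL, KB 34.1 m above MSL: TWT = 0 must correspond to TVDSS = 0 here.
pairs = [(0.0, 0.0), (500.0, 250.0), (1500.0, 650.0)]
print(to_stored_model(pairs, kb_elevation_m=34.1))
```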
Advanced/Optional
l Surface Coordinate: if provided, the coordinates written in the first line of the track
file will be overruled.
l Replacement Velocity: interval velocity from KB to SRD.
l Ground Level Elevation: elevation of GL above MSL.
l Unique Well ID: unique well identifier which can be used during import of well logs
and markers.
l Operator, State and County: text details about a well.
4.3.13.1.2 Logs
Logs of a single well can be imported to OpendTect as LAS or pseudo-LAS files via
Survey > Import > Wells > ASCII > Logs... The import of well logs requires the well
track to be imported first.
Import Logs: The LAS file should contain depth values as MD or TVDSS. Altern-
atively, the log files can be pseudo-LAS, meaning LAS (with one line of data per
depth value) with the header replaced by a one-line definition: "Depth Gamma
Sonic" etc. (without quotes). Log names should be separated by blank characters
(space or tab). For both LAS and pseudo-LAS, the following units can be recog-
nized. The recognition process is case insensitive.
Once the file has been selected, all recognized logs will be listed in the Select logs
section. Only the highlighted logs will be imported. Be careful that two logs do not
have the same name. The depth interval can be limited to a sub-range. The start
depth, stop depth and step written in the LAS files are not used; instead the depths
found on the same line as the amplitudes will be used.
In pseudo-LAS, units should follow directly behind the log name in parentheses,
e.g. Depth(ft) Density(g/cc). Below are examples of text strings that will match units:
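Splitting such a pseudo-LAS header line into names and optional units can be sketched as below. The exact unit strings OpendTect recognizes are not reproduced here; this only illustrates parsing the "Name(unit)" convention:

```python
import re

# Parses the one-line pseudo-LAS header described above, e.g.
# "Depth(ft) Density(g/cc) Gamma", into (name, unit) pairs; the unit is
# optional. Illustrative only; not OpendTect's recognition logic.

HEADER_ITEM = re.compile(r"([^\s(]+)(?:\(([^)]*)\))?")

def parse_pseudo_las_header(line):
    return [(m.group(1), m.group(2))
            for m in HEADER_ITEM.finditer(line) if m.group(1)]

print(parse_pseudo_las_header("Depth(ft) Density(g/cc) Gamma"))
# [('Depth', 'ft'), ('Density', 'g/cc'), ('Gamma', None)]
```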
4.3.13.1.3 Markers
Markers of a single well can be imported to OpendTect as ASCII files via Survey >
Import > Wells > ASCII > Markers... The import of markers requires the well track to
be imported first.
In the Edit Well Markers window, click on the Import button to display the Import
Markers window.
The input ASCII file should contain the names of the markers and depth values as
MD or TVDSS, and can be displayed by pressing the Examine button.
Format definition
Predefined and saved file formats are available by pressing the Open icon .
Otherwise the format must be manually specified. The Define button gives access
to the format definition window.
The column numbers of the marker name and depth should be specified. Please
mind the spaces in the marker names, as they can break the fixed column format.
It is recommended to save the format definition for later use and QC, by clicking
on the Save icon. In the pop-up window, write the name of the format and store it.
The format can be stored at different levels (All surveys, Current survey, Current
OpendTect user level) depending on the usage. Press OK when done.
4.3.13.2 Import Zero-Offset VSP
Zero-offset VSP data can be imported for a selected well via Survey > Import >
Wells > VSP (SEG-Y)... First browse and locate the input file. Then, in the Import
SEG-Y Data window, check the quick scan results and press Next when done.
In the pop-up Import Zero-offset VSP window, select the type of input Z values
(TWT, TVDSS or MD), select the well to which the VSP log should be added and
press OK to import the log.
4.3.13.3 Simple Multi-Well
This utility window allows the quick creation of multiple vertical wells with a con-
stant velocity as depth-time model provider. The table below can either be
filled manually or by reading a file.
l Well name
l Position of the (vertical) well along the X axis, in the same unit as the survey geometry.
l Position of the (vertical) well along the Y axis, in the same unit as the survey geometry.
l Reference datum elevation (KB or other): altitude measured from sea level of the
point MD = 0, positive upwards. Can be left at 0 if unknown.
l Total depth (TD): largest measured depth in the well. This parameter is optional:
if not provided, the well track is created such that it reaches the survey base.
l Seismic reference datum (SRD): altitude measured from sea level of the point TWT =
0 ms, positive upwards.
l UWI (Unique well identifier): you can input any number, string or combination.
To read a file containing that information, press Read file and select the input
ASCII file. One line in this file should correspond to one line in the output table.
The main work is to specify the presence of a file header and the file format defin-
ition. The header, if present, can be of fixed length (number of lines), or delimited
on its last line by a keyword.
Predefined and saved file formats are available by pressing the icon. Otherwise
the format must be manually specified. The Define button gives access to the
format definition window.
You must specify in the format definition window the column numbers of the X and
Y coordinates (absolute values, not relative to the surface coordinates), in the
same unit as used when defining the OpendTect survey. The reference datum elev-
ation and TD should also be provided, while the SRD and UWI are less frequently
used. Please note that KB and SRD both increase upwards and are positive above
sea level, whereas MD is a depth and increases downwards (MD is never neg-
ative).
It is recommended to save the format definition for later use and QC, by clicking
on the icon. In the pop-up window, write the name of the format and store it. The
format can be stored at different levels (All surveys, Current survey, Current
OpendTect user level) depending on the usage.
4.3.13.4 Bulk
The bulk import tool is available via the Survey > Import > Wells > Bulk menu. It
allows importing well tracks, time-depth models, logs and markers for different wells
from one or several files. The data is matched primarily against the well name and, if
available, against the Unique Well Identifier (UWI). This has the following implic-
ations:
l The well name must appear on each line of the input file. If the well already exists,
then the UWI must match the database. The same applies for the UWI if it is used in
combination with the well name.
l The well name should not contain spaces, otherwise the matching with a given
column number will not work as expected.
4.3.13.4.1 Bulk Well Track Import
Well tracks can be imported for several wells in bulk from a single ASCII file via
Survey > Import > Wells > Bulk > Track... The specification for the input data is sim-
ilar to the single well import.
You have the option to select either the well Name or UWI (Unique Well Iden-
tifier), and to set depth as either MD or TVDSS. You may also toggle on the
'Stop reading at' option and set a value here.
4.3.13.4.2 Bulk Well Log Import
Several LAS files can be imported for different wells in bulk via Survey > Import >
Wells > Bulk > Logs...
If the well name in the file does not match the current well database, it may be used
to create a track and a dummy time-depth model if necessary. Well tracks and time-
depth models can later be imported from the well manager.
4.3.13.4.3 Bulk Well Marker Import
Markers can be imported for several wells in bulk from a single ASCII file via Sur-
vey > Import > Wells > Bulk > Markers.... The specification for the input data is sim-
ilar to the single well import.
You have the option to select either the well Name or UWI (Unique Well Iden-
tifier), and to set depth as either MD or TVDSS. You may also toggle on the
'Stop reading at' option and set a value here.
4.3.13.4.4 Bulk Well Time-Depth Model Import
Time-depth models can be imported for several wells in bulk from a single
ASCII file via Survey > Import > Wells > Bulk > Depth/Time model... The spe-
cification for the input data is similar to the single well import.
You have the option to select either the well Name or UWI (Unique Well Iden-
tifier), and to set depth as either MD or TVDSS. You may also toggle on the
'Stop reading at' option and set a value here.
4.4 Export
Most of the data types can be exported from OpendTect via the Survey > Export
drop-down menu of the main window.
Export of well data is available from the Manage Wells window (Survey > Manage
> Manage Wells):
Direct data exchange with Schlumberger's Petrel is available via the PetrelDirect
plugin (part of OpendTect Pro).
4.4.1 Export Faults
Faults can be exported as ASCII files via Survey > Export > Fault > ASCII...
In the Export Fault window: select a fault; specify the output format; type an output
file name (to save to Survey Data Root folder) or provide a full path by clicking
Select; select a coordinate system of the output file (the option is available only if
the current survey has a defined projection based coordinate system), and press
Export.
4.4.2 Export FaultStickSets
Fault stick sets can be exported as ASCII files via Survey > Export > FaultStick-
Sets > ASCII...
In the Export FaultStickSet window: select a fault stick set; specify the output
format; type an output file name (to save to Survey Data Root folder) or provide a
full path by clicking Select; select a coordinate system of the output file (the option
is available only if the current survey has a defined projection based coordinate
system), and press Export.
4.4.3 Export Geometry 2D
2D line geometries can be exported as ASCII files via Survey > Export > Geometry
2D > ASCII...
In the Export 2D Geometry window: choose one or several 2D lines; type an output
file name (to save to Survey Data Root folder) or provide a full path by clicking
Select; and press Export. The output file contains 4 columns: 2D line name, trace
number, X and Y coordinates.
4.4.4 Export Horizons
2D and 3D horizons can be exported as ASCII files via Survey > Export >
Horizons.
4.4.4.1 Export Ascii 3D Horizons
3D horizons (surfaces) can be exported as ASCII files via Survey > Export > Hori-
zons > ASCII 3D...
In the Export Horizon window: select a 3D horizon; optionally include available Cal-
culated attributes; specify the output format; type an output file name (to save to Sur-
vey Data Root folder) or provide a full path by clicking Select; and press Export.
A successful export is confirmed with a message, and the export window stays
open to export more horizons.
Calculated attributes: one or more attributes from the stored horizon data can be
exported along with a horizon.
Horizon data can be created via Processing > Create Horizon Output, or an attrib-
ute calculated on-the-fly on a horizon in the 3D scene can be Saved as Horizon
Data. More information about calculating attributes in OpendTect can be found
here.
Output Type:
l default OpendTect horizon export format: based on either X/Y or Inl/Crl coordinates;
l pre-defined GeoFrame IESX (3d_ci7m) format: a Settings button appears and allows
changing the Horizon name in file and adding a Comment.
l Linear velocity: a linear velocity function based on a starting interval velocity Vint
and a velocity gradient Gradient.
Coordinate System: select a coordinate system of the output file (the option is
available only if the current survey has a defined projection based coordinate sys-
tem).
4.4.4.2 Export Ascii 2D Horizons
2D horizons can be exported as ASCII files via Survey > Export > Horizons >
ASCII 2D...
In the Export 2D Horizon window: select a 2D horizon; from the list of 2D lines
choose which ones to export; specify the output format; type an output file name (to
save to Survey Data Root folder) or provide a full path by clicking Select; and press
Export.
Output Format options
Select Lines: selection from the list of 2D lines on which the selected horizon is
present.
Header:
Coordinate System: select a coordinate system of the output file (the option is
available only if the current survey has a defined projection based coordinate sys-
tem).
4.4.5 Export Mute Functions
Mute functions can be exported as ASCII files via Survey > Export > Mute Func-
tions > ASCII...
In the Export Mute Function window: select a mute definition; choose coordinates
format; type an output file name (to save to Survey Data Root folder) or provide a
full path by clicking Select; select a coordinate system of the output file (the option
is available only if the current survey has a defined projection based coordinate
system), and press Export . The output file contains 4 columns: X/Y or Inl/Crl
coordinates, offset and Z (time or depth) values.
A successful export is confirmed with a message, and the export window stays
open to export more mute functions.
4.4.6 Export Pointsets & Polygons
Pointsets and polygons can be exported as ASCII files via Survey > Export >
PointSets/Polygons > ASCII...
4.4.7 Export Probability Density Functions
Probability Density Functions (PDFs) can be exported as ASCII files in Icon
Science's RokDoc format via Survey > Export > Probability Density Functions >
ASCII (RokDoc)...
In the Export Probability Density Function window: select a PDF; type an output
file name (to save to Survey Data Root folder) or provide a full path by clicking
Select; and press Export.
A successful export is confirmed with a message, and the export window stays
open to export more PDFs.
4.4.8 Export Seismics
2D/3D poststack and 3D prestack seismic data can be exported as SEG-Y or
simple files via Survey > Export > Seismics.
4.4.8.1 Export SEG-Y
2D/3D Poststack and 3D Prestack data can be exported from OpendTect in SEG-Y
format:
The SEG-Y revision 1 default bytes locations will be used during export, but addi-
tional positions can be used with the personal setting keywords listed on the right-
hand side.
The point is a trace attribute stored in the OpendTect seismic files. It is most often
not used.
The reference number is most often the Shot Point for 2D data, but could be used
for anything else. Please note that the SEG-Y scalar at bytes 201-202 applies to
values stored in bytes 197-200. The SP scalar will always be -10, thus the value
written on bytes 197-200 is 10 times the SP value.
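The -10 scalar follows the general SEG-Y convention that a negative scalar means "divide by its magnitude" on read, and a positive one means "multiply"; with a fixed -10 scalar, shot points keep one decimal of precision. A small sketch (byte-level I/O omitted; the function names are illustrative):

```python
# SEG-Y stores the shot point (bytes 197-200) together with a scalar
# (bytes 201-202). Per the SEG-Y rev 1 convention, a negative scalar means
# the stored integer is divided by its magnitude when reading; a positive
# scalar means multiply. With the fixed -10 scalar described above, SP
# values keep one decimal of precision.

def encode_sp(sp_value, scalar=-10):
    if scalar < 0:
        return int(round(sp_value * -scalar))
    return int(round(sp_value / scalar))

def decode_sp(stored, scalar=-10):
    if scalar < 0:
        return stored / -scalar
    return stored * scalar

print(encode_sp(1234.5))  # 12345
print(decode_sp(12345))   # 1234.5
```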
All values listed above are encoded on 4 bytes by default. The byte length can be
overridden using the following personal keywords, except for the coordinates:
The layout of the SEG-Y export window changes slightly based on the data type.
4.4.8.1.1 Export SEG-Y 3D
Stored 3D volumes can be exported from OpendTect in SEG-Y format.
All fields are optional, except the output filename that must be provided. The export
will be launched when pressing Ok.
Multi-Component Export
If the input cube contains multiple components, an additional window will pop up
on pressing 'Run' and ask which component to output (see below), since SEG-
Y files can only contain one component per file.
Tip: It is good practice to display the 3D seismic data on a z-slice to check for
any gaps in inlines/crosslines or the presence of null traces before exporting it.
4.4.8.1.2 Export SEG-Y 2D
Stored 2D data can be exported from OpendTect in SEG-Y format.
All fields are optional, except the output filename, which must be provided. For 2D
you also need to select a specific line. More lines can be exported if the option
'Export more from same dataset' on the last line is selected. The export will be
launched when pressing Ok.
l Text header: The SEG-Y textual header is automatically created, but may be
provided by the user, either from a text file or directly from a SEG-Y file.
4.4.8.1.3 Export SEG-Y Prestack 3D
Prestack 3D data may be exported from OpendTect in SEG-Y format.
All fields are optional, except the output filename that must be provided.
l SEG-Y format: Please note that this option may clip your data if the output format has
fewer bytes than the input OpendTect format.
l Text header: The SEG-Y textual header is automatically created, but may be
provided by the user, either from a text file or directly from a SEG-Y file.
4.4.8.2 Export Simple File
4.4.8.2.1 Export 3D Simple File Format
The stored volumes in the OpendTect survey can also be exported as a simple
Ascii or binary file.
l Put sampling info in file start: Select Yes to allow the sampling information to
appear at the beginning of the file.
l File type: Select the appropriate output file type: Ascii or Binary.
l Output file: Select/write the output file location.
Multi-Component Export
If the input cube is a stored multi-component data, an additional selection box will
appear in the window. Select the desired component to be exported as an Ascii or
Binary file.
4.4.8.2.2 Export 2D Simple File Format
A stored data set in the OpendTect survey can also be exported as a simple Ascii
or binary file.
Simple 2D Seismic File Export Window
4.4.8.2.3 Export PreStack Simple File
In OpendTect, you can also export a simple seismic Ascii or Binary file from Pre-
Stack 3D seismic data via Survey > Export > Seismic > Simple File > Prestack 3D.
4.4.8.3 Cube Positions
Number of inlines
4.4.9 Export Wavelets
Wavelet Export Window
After selecting the wavelet to export, you can optionally also output the time (in a
time survey) or depth (in a depth survey). Give an appropriate name and storage
location; the wavelet will be exported when clicking OK.
4.5 Manage
OpendTect keeps track of the different files imported into or created by the system.
Deleting, renaming, setting as default and (in some cases) merging of files are
controlled from the Manage window. Most objects (seismic volumes, horizons,
wells, etc.) have a dedicated 'manager' that can be called from this menu.
The most frequently used managers can be reached directly from the main user
interface:
4.5.1 Manage Attribute Sets
The attribute set files can be managed from this window (See below). It is
launched from Survey > Manage > AttributeSets...
It lists all the 2D and 3D attribute sets you have created or imported in the
Attribute Set window. You can modify the attribute set name, set it as default,
remove it, etc. The icons are similar to those in the general selection window.
Use the top filter to find the wanted element(s) by typing the name or a part of the
name (complete the name with *): for example, to find ‘Demo attributes’, you can
type *Demo*.
To save or restore a selection, right-click on a listed item and select the wanted
action in the pop up menu.
4.5.2 Manage Bodies
Bodies created or imported to OpendTect are listed in this manager (See below).
The manager can be accessed either following Survey > Manage > Bodies… or
from the quick launch icon .
Here you can rename, lock for editing, delete and set as default any body. The
icons are similar to those in the general selection window.
Additionally, four specific tools are available (for more information click on the links
below this page):
Body Operator
Use the top filter to find the wanted element(s) by typing the name or a part of the
name (complete the name with *): for example, to find ‘Amplitude anomaly A’, you
can type *anomaly*.
To save or restore a selection, right-click on a listed item and select the wanted
action in the pop up menus.
4.5.2.1 Body Operator
The body operator, accessed from the icon in the Bodies manager, enables you to
perform various operations on a geological body (or combinations thereof), as
listed below.
For all the operators, click on the Input to select either a saved body or create a
body using an operator. Each operation needs two inputs. You can combine more
than one input by using an operator as input. Give a name to the new body; it will
be created after clicking Run.
Union
You can merge two or more bodies using the union operator as in the figure below:
Intersection
If you want to create a body from the shared portion of two bodies, you can use the
intersection operator.
Difference
If the operator is Difference, the output will be a result of the subtraction Input1-
Input2.
The result of the 'Difference' operation (the final body is the volume resulting from
Body 2 minus Body 1).
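The three operators above behave like set operations on the region each body encloses. The sketch below is illustrative only (OpendTect's actual implementation works on body surfaces, not voxel sets); bodies are modeled here as sets of hypothetical (inline, crossline, z-sample) indices:

```python
# Illustrative sketch: Union, Intersection and Difference as set
# operations on the voxels enclosed by each body. The (inline,
# crossline, z) index representation is an assumption for this example.

def union(body1, body2):
    """Voxels inside either body."""
    return body1 | body2

def intersection(body1, body2):
    """Voxels shared by both bodies."""
    return body1 & body2

def difference(body1, body2):
    """Voxels of body1 that are not in body2 (Input1 - Input2)."""
    return body1 - body2

# two small overlapping bodies
a = {(100, 200, 5), (100, 201, 5), (100, 202, 5)}
b = {(100, 202, 5), (100, 203, 5)}
print(len(union(a, b)))         # 4
print(len(intersection(a, b)))  # 1
print(len(difference(a, b)))    # 2
```

Note that Difference is not symmetric, which is why the manager distinguishes Input 1 from Input 2.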
4.5.2.2 Body Region Constructor
This tool allows you to create a regional body from your dataset using boundaries
defined by horizons and faults (or just a single wrapping horizon).
4.5.2.3 Estimate Body Volume
The tool allows you to estimate the volume of any body. It can be accessed
either from the manager or from the tree (right-click menu on a body listed in the
tree or in the 3D scene).
The input velocity is an (estimated) velocity appropriate to the position of the body.
A default velocity is provided but needs to be updated according to your survey.
Click on Estimate to get the estimated volume (in m3).
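As a back-of-the-envelope check of such an estimate in a time survey, the body's two-way-time thickness at each trace position can be converted to meters with the supplied velocity and multiplied by the bin area. Function and parameter names below are illustrative, not OpendTect's API:

```python
# Hypothetical sketch of a time-domain body volume estimate:
# thickness in meters = velocity * TWT / 2, summed over trace positions.

def estimate_volume_m3(twt_thickness_s, bin_area_m2, velocity_m_s):
    """twt_thickness_s: two-way-time thickness (s) of the body per trace."""
    total = 0.0
    for dt in twt_thickness_s:
        thickness_m = velocity_m_s * dt / 2.0  # TWT -> one-way thickness
        total += thickness_m * bin_area_m2
    return total

# 3 traces, 25 m x 25 m bins, 2000 m/s velocity
vol = estimate_volume_m3([0.020, 0.024, 0.016], 25 * 25, 2000.0)
print(vol)  # ~37500.0 m3
```

This also shows why the default velocity must be updated: the result scales linearly with the velocity you enter.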
4.5.2.4 Switch Body Values
The icon activates a window with dual functionality.
l If you have built a regional body using two horizons and a fault, creating a kind of
'compartment', this tool allows you to 'flip' this body to create its 'negative'.
l Previous versions of OpendTect had bodies in different formats. This tool can also
be used to convert these various formats into the standard format for 4.6.
4.5.3 Manage Color Tables
The Color Table Manager gives access to the various settings used for data
visualization, allowing you to edit an existing color table or create a new one.
The manager is accessible from Survey > Manage > Color Tables and from the
right-click menu on the color table in the 3D scene (See below).
For a full description of the various options and possibilities, please see the fol-
lowing section: Color Tables
4.5.4 Manage Cross-Plot Data
The crossplot file manager is used to rename, remove, merge, etc., the stored
crossplot data files.
The set-as-default icon makes the selected crossplot the default data for
crossplotting. If pressed, the selected item will be bounded by the signs (> Name <).
To merge two crossplots with different or similar attributes, first select a
crossplot and then, after clicking on the icon, choose the second crossplot in the
following window.
Pressing OK opens the window where you have to provide further information. If
some columns hold the same quantity in the two crossplots, it is possible to
specify, for each column, which column from crossplot 2 matches a column from
crossplot 1, even if they do not have the same name. If there is no match, just select
None; you can then decide either to add the unmatched columns or to ignore them.
Then select the matching method.
For all of them, the replacement policy for matching positions has to be specified:
take the value from crossplot 1, take the value from crossplot 2, or take the
average of the two. Undefined values can be kept or replaced where possible.
Once the merging parameters have been defined, give an appropriate name and
save by clicking OK.
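The replacement policy described above can be sketched as a small value-merging rule. This is an illustrative model of the behavior, not OpendTect code; `None` stands in for an undefined value:

```python
# Hypothetical sketch of the replacement policy for matching positions
# when merging two crossplots: keep the value from crossplot 1, from
# crossplot 2, or average the two; an undefined value (None) is
# replaced by the defined one where possible.

def merge_value(v1, v2, policy="average"):
    if v1 is None:
        return v2
    if v2 is None:
        return v1
    if policy == "crossplot1":
        return v1
    if policy == "crossplot2":
        return v2
    return (v1 + v2) / 2.0  # "average"

print(merge_value(10.0, 20.0))                # 15.0
print(merge_value(None, 20.0, "crossplot1"))  # 20.0
```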
4.5.5 Manage Faults
The faults interpreted/imported in the OpendTect project are listed in the Fault
Manager with the possibility to perform some basic actions: rename, remove,
copy, etc. The icons are similar to those in the general selection window.
The manager can be accessed either following Survey > Manage > Faults… or
from the quick launch icon > Faults.
Use the top filter to find the wanted element(s) by typing the name or a part of the
name (complete the name with *): for example, to find ‘FaultA’, you can type *A*.
To save or restore a selection, right-click on a listed item and select the wanted
action in the pop up menu.
4.5.6 Manage FaultStickSets
The faultsticks interpreted/imported in the OpendTect project are listed in the
FaultStickSets Manager (see below) with the possibility to perform some basic
actions: change disk location, rename, remove, copy, etc. The icons are similar to
those in the general selection window.
The manager can be accessed either following Survey > Manage > FaultStick-
Sets… or from the quick launch icon > FaultStickSets.
Use the top filter to find the wanted element(s) by typing the name or a part of the
name (complete the name with *): for example, to find ‘SSIS-Grid-Faultsticks’, you
can type *Grid*.
To save or restore a selection, right-click on a listed item and select the wanted
action in the pop up menu.
4.5.7 Manage Geometry 2D
The 2D Geometry manager is launched from Survey > Manage > 2D Geometry...
This window is used to manipulate the geometry of 2D seismic lines. The geo-
metry consists of X-Y coordinate pairs for each trace of the 2D seismic, identified
with a unique trace number (CDP most often). They are generally extracted from
the SEG-Y trace headers or from an auxiliary file during import.
From this manager, the coordinates of already imported 2D data can be altered.
The geometry is separated from the actual 2D seismic data and 2D horizon that
are solely referenced with respect to the trace number (CDP number). As a result
the coordinates of the geometry can safely be edited without having to re-import
the 2D seismic data and corresponding horizons.
The icons are similar to those in the general selection window.
The name of the selected line is specified on top of the window. To edit a field,
click on it and type the new value. Changes will be saved on disk only after press-
ing OK. Optionally, the entire geometry of the selected 2D line can be updated by
reading a text file by clicking on Import New Geometry (see section below).
In the selection window (see below), select the input Ascii file. The input file should
be column sorted with one point per record (line). Optionally, you can display the
input file by clicking on Examine: you will be able to check the values, and it will
help you fill in the remaining information.
You need to specify the presence/absence of a header and its size if present. The
header, if present, can be of fixed length (number of lines), or delimited on its last
line by a keyword.
The file definition needs to be filled in so the software knows which data
corresponds to which column.
The Define button gives access to the format definition window (see above).
You must specify in the format definition window:
l the column numbers for the position: as X/Y coordinates or Inline/Crossline. The
coordinate units must be in the same units as the coordinates of the survey corner
points. Inline/crossline can be used but it is not recommended because of the grid spa-
cing.
l Optionally the trace number column. It is not recommended to alter (re-specify) the
trace numbers since it may corrupt the already loaded data.
l Optionally, the reading can be stopped at a specific line by providing the adequate
keyword: the reading will stop at the first occurrence of that word.
It is recommended to save the format definition for later use and QC by clicking
on the icon. Predefined and saved file formats can be restored by clicking on
the icon.
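The import options above (header lines, user-chosen column numbers, optional stop keyword) can be sketched as a small reader. This is a hypothetical illustration of the parsing logic, not OpendTect code; column numbers are 1-based, as in the format-definition window:

```python
# Hypothetical reader for a column-sorted geometry file: skip a
# fixed-length header, read trace number and X/Y from user-chosen
# columns, and stop at the first occurrence of an optional keyword.

def read_geometry(lines, header_lines=0, trc_col=1, x_col=2, y_col=3,
                  stop_keyword=None):
    geometry = []
    for line in lines[header_lines:]:
        if stop_keyword and stop_keyword in line:
            break  # reading stops at the first occurrence of the keyword
        cols = line.split()
        geometry.append((int(cols[trc_col - 1]),
                         float(cols[x_col - 1]),
                         float(cols[y_col - 1])))
    return geometry

data = ["TRACE X Y",
        "101 612345.0 6078000.0",
        "102 612370.0 6078010.0",
        "EOD"]
print(read_geometry(data, header_lines=1, stop_keyword="EOD"))
```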
4.5.7.1 Manage 2D Geometry
This window allows for selection and removal of the geometry of data sets. The
icon opens a further window to Manage Line Geometry.
4.5.7.2 Manage Line Geometry
In this window, line geometry can be edited on a per-trace, coordinate level.
The editing can be done manually, or you may make use of the option 'Import New
Geometry':
4.5.8 Manage Horizons
Manage either 2D or 3D horizons either via Survey > Manage > Horizons... or via
the icon.
4.5.8.1 Horizon Manager 2D
To open the Manage 2D Horizons window, navigate to Survey > Manage >
Horizons > 2D... or use the icon from the Manage toolbar. In the left panel of the
window, the available horizons are displayed. In the bottom panel, information on
the selected horizon is displayed (e.g. location on disk, date last modified). At the
base of the window the available disk space is noted.
Horizons can be renamed , locked , removed , copied , set as default
or viewed as a dataset group .
The top filter is used to filter out the objects with selected names. For instance, to
display all horizons that start with the letter D use "D*".
4.5.8.2 Horizon Manager 3D
The 3D Horizons manager can be accessed by the menu Survey > Manage > Hori-
zons > 3D or by the quick access icon > 3D Horizons.
Use the top filter to find the wanted element(s) by typing the name or a part of the
name (complete the name with *): for example, to find ‘Demo 2 --> FS6’, you can
type *FS6*.
The basic icons, similar to those in the general selection window, are available
for horizon management, with some additional actions (see below).
Copy 3D Horizon
The copy window for 3D horizons differs slightly from the usual copy window. It is
indeed used to copy surface data and grids.
Merge 3D Horizons
To merge horizons, select the horizons to be merged. In case of duplicate position,
the action needs to be specified: take average, use top or use base. The duplicate
positions will then be handled in the following manner (dashed line portion rep-
resents removed data after merge):
Stratigraphy
The horizons can optionally be tied to a level, i.e. a regional marker (see below) by
clicking on the Stratigraphy button.
A stratigraphic marker can be assigned to one or more horizons. The horizons will
get the marker color, which will facilitate, for example, the well-to-seismic tie.
For more details on how to define stratigraphic markers and the subsequent units
go to Manage Stratigraphy.
Relations
The Horizon relation window is used to resolve conflicts between horizons cross-
ing each other. Read Horizons .... is used to select all horizons that need checking.
The horizons are then sorted automatically from top to bottom. The Check cross-
ings... button is used to automatically check the crossings between the listed hori-
zons and resolve them.
Solving crossing conflicts
To solve crossing conflicts, select the horizon that will be modified. The software
will check the number of positions where a conflict exists and modify the horizon
by removing the conflict points or by changing the values to be equal to the
overlying/underlying horizon. In the example below, the checked horizons have
been found to cross in 9 positions.
To honor the requirement that horizons cannot coincide, the horizons' actual
positions are not exactly equal, but they are within one sample position accuracy.
If the lower horizon (red) is selected to be modified, the figure below sketches what
will happen to this horizon if you select shift or remove.
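The shift/remove choice can be sketched as follows. This is an illustrative model of the behavior described above, assuming z increases downwards and a one-sample offset keeps the horizons from coinciding; names and boundary handling are not OpendTect's:

```python
# Hypothetical sketch of resolving crossings when the lower horizon is
# the one selected for modification: at positions where it rises above
# the upper horizon, its value is either shifted just below the upper
# horizon (one sample step apart) or removed (None).

def resolve_crossings(upper, lower, z_step, mode="shift"):
    out = []
    for zu, zl in zip(upper, lower):
        if zl < zu:  # conflict: "lower" horizon is above the upper one
            out.append(zu + z_step if mode == "shift" else None)
        else:
            out.append(zl)
    return out

upper = [1.000, 1.004, 1.008]
lower = [1.020, 1.002, 1.030]  # crosses at the middle position
print(resolve_crossings(upper, lower, 0.004, "shift"))
print(resolve_crossings(upper, lower, 0.004, "remove"))
```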
4.5.9 Manage Layer Properties
Layer Properties - Definition window is accessible:
l by clicking the icon in the Layer Properties - Selection window of the Layer Model-
ing module;
l via Survey > Manage > Layer Properties... menu.
l Available property types: a hard-coded list of available property types.
Please contact support if you would like to extend the list of Available
property types.
l Defined usable properties: a list that contains layer properties available in the cur-
rent OpendTect project. Some of the most commonly used properties are pre-defined
for a user (for example, Acoustic impedance property of the Impedance type).
The type of an existing property cannot be changed. A new property of the
desired type has to be created instead.
o As default for all users: properties are saved in the data root
directory (where all OpendTect surveys are located), applicable for all users and
all surveys.
o As default for my user ID only: properties are saved to a home/.od file, which has
effect only for your own user account.
The only way to restore the default OpendTect list of properties is to delete
the Properties files at all levels.
l Add usable property: select a property type in Available property types list and
click on to pop up the Property definition window.
l Edit usable property: select a property in the list and click on icon (or double-
click on a property name) to pop up the Property definition window.
l Remove usable property: select a property in the list and click on icon.
The minimum possible list of properties must include at least one property for
each of the following types: Density, Velocity and Impedance.
l Name: a unique layer property name.
l Aliases (optional): specify possible aliases (useful to associate the correct log to a
property: logs with different names can thus be related to the same property).
l Default Display Color: a default color for a log display.
l Typical value range: a typical value range for a property with associated units.
l Default Value (optional, but recommended): type in a numeric value or click on For-
mula ... to set a mathematical formula in the Math property window (use RockPhysics
library to retrieve standard ones). Default Value is used to auto-fill property values
in layer definition windows (see Layer Modeling chapter), and the auto-filled value
can be changed for individual layers.
l Fixed definition (optional, but recommended for some properties: see the tip below):
type in a numeric value or click on Formula ... to set a mathematical formula in the
Math property window (use RockPhysics library to retrieve standard ones). A prop-
erty with Fixed definition doesn't appear in layer definition windows (see Layer Model-
ing chapter) as it is always auto-computed in the background.
Default Values should preferably be set for all properties and should be chosen
such that they roughly represent most of the modeled media. For example, specify
the default density corresponding to encasing shales and later in the modeling
workflow modify the auto-filled values only for target sand layers.
Fixed definition is recommended for the properties which are defined by
specific formulas (i.e. never modeled directly, irrespective of a geological
setting): Acoustic and Shear Impedances, Vp/Vs and Poisson’s Ratios,
Lambda-Rho, Mu-Rho, etc.
The example below shows Density with a Fixed definition using Gardner's empir-
ical relation from Sonic values.
Math formulas can be optionally saved for later use and restored via
and icons respectively.
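The Gardner example above can be sketched numerically. The classic coefficients (a = 0.31 with Vp in m/s, density in g/cc) and the assumption that the sonic log is in µs/m are ours for illustration; the actual formula entered in the Math property window may differ:

```python
# A sketch of Gardner's empirical relation as a Fixed definition for
# Density: rho = a * Vp^0.25. Sonic logs record slowness, so Vp is
# first recovered from the sonic value (assumed here to be in us/m).

def gardner_density(sonic_us_per_m, a=0.31, exponent=0.25):
    vp_m_s = 1.0e6 / sonic_us_per_m  # slowness -> velocity
    return a * vp_m_s ** exponent

# A 400 us/m sonic reading corresponds to Vp = 2500 m/s
print(round(gardner_density(400.0), 3))  # 2.192 (g/cc)
```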
4.5.10 Manage Pointsets & Polygons
The Manage PointSet/Polygon window is accessible via the Survey > Manage menu.
All pointsets and polygons available in the current OpendTect project are listed
here. The object Type, PointSet or Polygon, is given in the middle information area.
l Rename.
l Lock / Unlock (toggle read-only status on/off).
l Delete.
l Set as the default object of its type.
l Merge pointsets: several pointsets can be merged into one:
The top filter is used to filter out the objects with selected names. For
instance, to display all pointsets that start with the letter S use "S*".
4.5.11 Manage Probability Density Functions
Manage Probability Density Function window is accessible via Survey > Manage >
Probability Density Functions ... menu.
The manager lists all PDFs available in the current project, allows you to view/edit
them, and generates new synthetic PDFs with user-defined specifications.
Available actions on PDFs include:
l Rename.
l Lock / Unlock (toggle read-only status on/off).
l Delete.
l Set as the default object of its type.
l Browse/Edit this PDF.
l Generate PDF.
OpendTect supports both discrete and continuous PDFs.
Discrete PDF:
Continuous PDF:
The first icon right of the table ( ) launches a 2D viewer that displays the values
seen in the table in a coloured density display. If the PDF has 3 dimensions, the
left and right arrows may be used to navigate through the bins of the third variable
with increasing and decreasing values respectively.
The second icon ( ) (in Edit Probability Density Function) performs smoothing of
the PDF data. A weighted average of a central sample with weight 1/2 and its N
neighbouring samples (excluding diagonal neighbours), each with weight 1/2N, is
calculated at every bin, where N = 2, 4 and 6 for 1D, 2D and 3D PDFs
respectively. This smoothing is rather gentle, and can be repeated multiple times
for a more pronounced effect.
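For the 1D case (N = 2) the weights above can be sketched directly. Edge handling is an assumption here (the missing neighbour reuses the bin's own value); the actual boundary treatment may differ:

```python
# Sketch of the smoothing step for a 1D discrete PDF: each bin becomes
# a weighted average of itself (weight 1/2) and its N = 2 non-diagonal
# neighbours (weight 1/(2N) = 1/4 each). Edge bins reuse their own
# value for the missing neighbour (an assumed boundary rule).

def smooth_1d(pdf):
    n = len(pdf)
    out = []
    for i in range(n):
        left = pdf[i - 1] if i > 0 else pdf[i]
        right = pdf[i + 1] if i < n - 1 else pdf[i]
        out.append(0.5 * pdf[i] + 0.25 * left + 0.25 * right)
    return out

print(smooth_1d([0.0, 1.0, 0.0]))  # [0.25, 0.5, 0.25]
```

Note that the weights sum to 1, so the total probability mass is preserved; repeating the call spreads the peak further, matching the "gentle, repeatable" behavior described above.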
4.5.11.1 Generate Probability Density Functions
User-defined
A user-defined PDF can be generated by clicking on the bottom icon ( ) in the
Manage Probability Density Functions window.
Discrete Gaussian and discrete empty PDF can have up to 3 dimensions, while
continuous Gaussian can virtually contain any number of dimensions. Values of
discrete PDFs can be browsed, edited and smoothed after creation since they are
stored in tables. Continuous Gaussian PDF exists only in the description form, the
corresponding probabilities are computed on-the-fly.
The example below shows the generation of a discrete Gaussian PDF with 3
dimensions. Required parameters include dimension Names, Value ranges,
Number of bins per dimension, Expectations, Standard deviations, as well as
Correlation coefficients between all dimensions (except for 1D). The PDF is saved
by specifying its name and clicking OK. It can be browsed, edited and smoothed
through the Manage PDF window.
The next example shows generation of a continuous Gaussian PDF with 5 dimen-
sions. Dimension Names, Expectations and Standard Deviations are specified in
the Distributions tab:
The Correlations tab allows you to define correlations by selecting dimensions,
setting their correlation coefficient and clicking the Add button. An existing
correlation can be selected from the list and edited by updating its correlation
coefficient and clicking Set (the Set button will appear instead of Add), or deleted
by clicking the Remove selected correlation icon ( ). The PDF is saved by
specifying its name and clicking OK. A continuous Gaussian PDF is stored only in
description form, which can be edited through the Manage PDF window.
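Because only the description (expectations, standard deviations, correlations) is stored, probabilities can be evaluated on the fly. The sketch below shows the standard bivariate Gaussian density with one correlation coefficient; this is the textbook formula, not OpendTect's internal code:

```python
# On-the-fly evaluation of a continuous 2D Gaussian PDF described by
# expectations (mx, my), standard deviations (sx, sy) and a single
# correlation coefficient rho (|rho| < 1).

import math

def gaussian2d(x, y, mx, my, sx, sy, rho):
    zx, zy = (x - mx) / sx, (y - my) / sy
    q = (zx * zx - 2.0 * rho * zx * zy + zy * zy) / (1.0 - rho * rho)
    norm = 2.0 * math.pi * sx * sy * math.sqrt(1.0 - rho * rho)
    return math.exp(-0.5 * q) / norm

# density at the expectation: 1 / (2*pi*sx*sy*sqrt(1 - rho^2))
print(gaussian2d(0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 0.5))
```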
Windows for the generation of 1D and 2D continuous Gaussian PDFs, shown
below, do not have the Correlations tab.
From Crossplots
Alternatively, a PDF can be created using the Cross-plot tool by clicking on the
icon in the Cross-plot window. This icon launches a pop-up dialog that can be
used for selecting attributes in order to create PDFs.
The number of PDF dimensions can be set to 1, 2 or 3 by clicking the More and
Less buttons. Note that all attributes from the Cross-plot table can be selected.
Attribute ranges are generated automatically to fit the extracted data distribution.
These can be edited before creating the PDF.
4.5.12 Manage Seismics
Poststack seismic data should be managed from these windows. There are
separate managers for poststack 3D and poststack 2D. Access these via Survey >
Manage > Seismics... or via the icon.
They all use common management icons on their right hand side:
Rename
Delete
The top filter is used to filter out the objects with selected names. For instance, to
display all volumes that start with the letter S use "S*".
4.5.12.1 Manage 3D Seismics
The 3D Seismic file management window lists poststack volumes loaded in the sur-
vey. Information related to the selected volume is displayed in the central field and
personal/survey-related notes can be added and saved in the bottom field.
Alongside the standard actions (change disk location, rename, remove, etc.), the
user may also Copy the volume to another volume (different size, format,
sampling rate, ...), Merge several overlapping or consecutive volumes and/or
Browse in the file.
4.5.12.1.1 Copy Cube
Any volume can be copied into a new volume. The Volume subselection
defines the selection of the input cube to be copied, and the Format/Scaling
submenu allows you to specify how to store the new cube. Rectangular volumes
are not required by OpendTect; therefore null traces are discarded by default.
They can be added back with Null traces > Add, within the inline/crossline range
of the input volume. Larger volumes can be obtained by using the Null traces >
Add option together with the volume subselection menu.
If the input cube is multi-component (e.g spectral decomposition cube with different
components, right), an option will be available allowing the user to choose
between all available components. All components is the default setting.
4.5.12.1.1.1 Copy - Volume Sub-Selection
In all those processes, the output might be limited with respect to the available
input data. The limitation may be:
l A rectangular part of the survey, possibly with a larger horizontal and vertical stepout
l An area limited by an OpendTect polygon. The area within the polygon can also be
decimated horizontally by using larger stepouts
l A table of positions from an OpendTect pointset or from a text file. The text file should
contain inline and crossline values without header
l All: This last option will output the maximum number of traces with respect to the
available data and possible stepouts
l The use of larger vertical stepouts will cause the data to be decimated in the given
direction. Please note that an anti-alias filter (using the frequency filter attribute)
should be applied before decimating data; the copy-cube option does not do it.
l The use of smaller vertical stepouts will cause the data to be interpolated with a
polynomial interpolation. This is mostly appropriate for seismic data.
l Volumes tagged as Vint, Vrms or Vavg do not use a polynomial interpolation of the
input amplitudes when Z start, Z stop and/or Z step are changed. Instead they are
converted to the corresponding time-depth relation, which is linearly interpolated
(vertically), before the interpolated TD function is converted back to the input type.
l The copy-cube option does not do lateral interpolation of the data (but it can decim-
ate). Use the Velocity gridder step of the volume builder to laterally grid a coarse
volume.
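The Vint handling in the list above can be sketched numerically: integrate the interval-velocity trace to a time-depth (TD) curve, interpolate the TD curve linearly at the new sample times, and convert back. This is an illustrative reconstruction of the described behavior, not OpendTect code; times are two-way times in seconds:

```python
# Hypothetical sketch of resampling an interval-velocity (Vint) trace
# via its time-depth relation rather than interpolating amplitudes.

def vint_resample(times, vint, new_times):
    # integrate to depth: dz = v * dt / 2 (dt is two-way time)
    depths = [0.0]
    for i in range(1, len(times)):
        depths.append(depths[-1] + vint[i - 1] * (times[i] - times[i - 1]) / 2.0)

    def depth_at(t):  # linear interpolation of the TD curve
        for i in range(1, len(times)):
            if t <= times[i]:
                f = (t - times[i - 1]) / (times[i] - times[i - 1])
                return depths[i - 1] + f * (depths[i] - depths[i - 1])
        return depths[-1]

    new_depths = [depth_at(t) for t in new_times]
    # back-convert the interpolated TD curve to interval velocities
    return [2.0 * (new_depths[i + 1] - new_depths[i]) /
            (new_times[i + 1] - new_times[i]) for i in range(len(new_times) - 1)]

# halving the sample rate of a two-interval trace
print(vint_resample([0.0, 1.0, 2.0], [2000.0, 3000.0, 3000.0],
                    [0.0, 0.5, 1.0, 1.5, 2.0]))  # [2000.0, 2000.0, 3000.0, 3000.0]
```

Note that the interval velocities survive the refinement exactly, which is the point of going through the TD relation instead of interpolating the velocity samples themselves.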
4.5.12.1.1.2 Copy - Format & Scaling
l Change between storage type. Please note that this might clip your data.
l Scale the values given a linear equation.
l Adjust the Z range to the survey range by repeating the bounding samples up and
down.
l Optimize the horizontal slice access. This will change the sorting mode in the volume
on disk, and will cause inline/crossline accesses to be significantly slower based on
the volume size.
4.5.12.1.2 Merge Files
The icon is used to merge sub-volumes into one single volume. OpendTect
processing time can be reduced by automatically or manually distributing batch
jobs over multiple computers.
When merging two cubes, the duplicate traces can be stacked, e.g. when merging
two seismic cubes (stacking will reduce noise), or the traces of the first cube can
be used. Priorities are set in alphanumerical order, as the volumes appear in the
manager from top to bottom.
Select the input files from the multiple entry list and specify the Output file name.
The user can remove the original files at a later stage (use the Remove button
in the seismic file manager).
4.5.12.1.3 Browse & Edit Cube Locations
CBVS files can be browsed/edited (edit the cube locations, positions, trace
samples, etc.) by pressing the icon. In the window that pops up (see below),
sample values can be changed by editing any cell (similarly to an MS Excel
sheet). Editing is disabled if the cube is write-protected.
Several options are available:
4.5.12.2 Manage 2D Seismics
2D surveys in OpendTect are grouped in datasets. These datasets have their own
manager (shown below), separate from the Manage 2D Seismic Lines window.
In addition to standard rename/delete options, the following actions can be applied
on datasets: Copy all or part of the dataset to a new dataset, Access the 2D
Lines Manager (alternatively, double-click on the dataset name), Dump the geo-
metry (positions) to a text file.
4.5.12.2.1 Manage 2D Seismic Lines
Accessed via the 'Manage 2D Seismics' window, either by double-clicking on a
dataset name or via the icon.
Lines can be renamed ( ) and deleted ( ). The following actions can also be
made on the lines: Merge of several lines to a new line, Extraction (pro-
jection) from 3D volumes along the 2D lines, Export of the geometry to
GoogleEarth.
4.5.12.2.1.1 Merge 2D Lines
Two 2D lines can be merged together to create a single 2D line. The icon
opens the merging window. The merge can be either of 2 lines with about the
same geometry, or to append two (consecutive) lines to each other.
l Match trace numbers: Assumes the lines are at the same location, containing different
attributes to be stacked and referenced using the same trace number array.
l Match coordinates: Same as above, but with different trace number arrays. Then the
match will be based on coordinates, with a search radius to match the traces. Please
note that traces will be renumbered in this mode.
l Bluntly append. Append the line specified at the "Add" field line after the "First line".
Please note that traces will be renumbered in this mode.
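The 'Match coordinates' mode above can be sketched as a distance-based pairing. This is an illustrative model (single value per trace, greedy nearest-match, averaging as the stack), not OpendTect's implementation:

```python
# Hypothetical sketch of 'Match coordinates': traces of the second
# line are matched to the first line within a search radius; matched
# pairs are stacked (averaged) and the merged line is renumbered.

import math

def merge_by_coordinates(line1, line2, radius):
    """Each line: list of ((x, y), value). Returns renumbered traces."""
    merged = []
    used = set()
    for (xy1, v1) in line1:
        match = None
        for j, (xy2, _) in enumerate(line2):
            if j not in used and math.dist(xy1, xy2) <= radius:
                match = j
                break
        if match is None:
            merged.append((xy1, v1))
        else:
            used.add(match)
            _, v2 = line2[match]
            merged.append((xy1, (v1 + v2) / 2.0))  # stack the pair
    # traces get fresh, consecutive trace numbers
    return [(trcnr, xy, v) for trcnr, (xy, v) in enumerate(merged, start=1)]

a = [((0.0, 0.0), 1.0), ((10.0, 0.0), 3.0)]
b = [((0.5, 0.0), 3.0)]
print(merge_by_coordinates(a, b, radius=1.0))
```

The renumbering in the last step is why the manual warns that traces are renumbered in this mode.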
4.5.12.2.1.2 Extract 2D Attributes from 3D Volumes
This extraction tool, started from the icon, can be used to project a 3D volume
onto 2D lines. This then allows you to display the 3D volume along the lines, and
to use the data from the 3D volume with 2D lines in the 2D attribute set.
All lines may be processed, or a selection of lines can be made in the lines
manager before going to this window. The settings are trivial: the 3D volume must
be selected and an attribute name must be provided.
Please note that the polynomial interpolation is not suited to applying this tool
to 3D seismic data.
4.5.12.2.1.3 Export 2D Geometry to Google Earth
This window (see below), launched with the icon, is used to export the 2D line
geometry into a *.kml file. Different methods are supported (Start/End or both, etc.)
for labeling the line names in the Google Earth file. The line color field is also
editable. The width represents the thickness of the lines. The Output file field
specifies the output location and name of the exported file (format: kml).
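The resulting file is ordinary KML: one placemark per line with a LineString of coordinates. The sketch below is a minimal illustration of that layout, assuming coordinates have already been transformed to WGS84 lon/lat (a real export would do that transformation first); the tag structure shown is generic KML, not the exact output OpendTect writes:

```python
# Minimal sketch of 2D line geometry as KML: one Placemark per line,
# each holding a LineString of lon,lat,altitude coordinates.

def lines_to_kml(lines):
    """lines: dict of line name -> list of (lon, lat) pairs."""
    parts = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<kml xmlns="http://www.opengis.net/kml/2.2"><Document>']
    for name, coords in lines.items():
        coord_str = " ".join(f"{lon},{lat},0" for lon, lat in coords)
        parts.append(f"<Placemark><name>{name}</name>"
                     f"<LineString><coordinates>{coord_str}"
                     "</coordinates></LineString></Placemark>")
    parts.append("</Document></kml>")
    return "\n".join(parts)

kml = lines_to_kml({"Line-1": [(4.89, 52.37), (4.91, 52.38)]})
print("<Placemark><name>Line-1</name>" in kml)  # True
```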
4.5.12.2.2 Copy Data Set
This utility window can be used to copy (backup) a data set.
Pressing on the select button of the first line will allow the selection of which line(s)
should be copied to the new data set:
4.5.12.2.3 Dump 2D Geometry
With this module, accessed via , an ASCII file with the geometry of one or sev-
eral 2D lines can be generated. By default, the output file contains trace numbers
and X/Y locations.
This export facility may be very practical if you want to generate a base map from
2D lines for a different software package.
4.5.13 Manage Seismics Prestack
Prestack seismic data should be managed from these windows. There are
separate managers for prestack 3D and prestack 2D. Access these via Survey >
Manage > Seismics Prestack... or via the icon.
They all use common management icons on their right hand side:
Rename
Delete
The top filter is used to filter out the objects with selected names. For instance, to
display all volumes that start with the letter S use "S*".
4.5.13.1 Manage 3D Prestack
This window is opened via Survey > Manage > Seismics Prestack > 3D...
Most options are common to the other managers: change file location, rename,
lock, delete. 'Notes' may be anything of interest to the survey and may be added to,
edited and saved multiple times.
The options copy cube and merge blocks of lines work similarly to the 3D post-
stack seismic manager.
Prestack data stores are present on disk in a folder of the same name within
the survey ("Seismics" sub-folder). This folder contains one file per inline for
quicker access, plus auxiliary files. The manager will display information about the
entire prestack data store: folder name, number of files, etc.
This option may also be employed to create a multi-component cube for attributes
with more than one component. For example, using this option, a user may
create a multi-component Spectral Decomposition cube with each of the included
frequencies given a pseudo-offset value. In the example (below), a multi-
component Spectral Decomposition cube has been created, and for simplicity, the
pseudo-offset used is a multiple of the frequency component. (The actual value
used in these pseudo-offsets is irrelevant in this case, affecting only the width of
the prestack display, which can be altered by right-clicking on the prestack display
in the scene and choosing 'Properties...'.)
No new file is written to disk. Therefore deleting a poststack volume used in the
prestack data store will cause problems. Please use the same option to remove or
modify the previously set multiple-volume selection.
4.5.13.2 Manage 2D Prestack
This window is opened via Survey > Manage > Seismics Prestack > 2D...
Options found here are common to the other managers: change file location,
rename, lock, delete and set as default. 'Notes' may be anything of interest to the
survey and may be added to, edited and saved multiple times.
4.5.14 Manage Sessions
Sessions in OpendTect are generally used to save and to retrieve the specific set-
tings of a scene. This can help the user to resume work from previous settings.
These sessions can be managed via: Survey > Manage > Sessions...
Sessions will save all settings of the displayed elements, and they can be
saved/restored at any time from Survey > Session.
- 488 -
The following options are available:
Rename
Delete
The 'Notes' box is a free-text field where you may add notes related to the session,
if desired, and save them.
- 489 -
4.5.15 Manage Stratigraphy
The Manage Stratigraphy window can be launched with the icon from
OpendTect Manage toolbar or via Survey > Manage > Stratigraphy... This window
is designed to arrange the stratigraphic markers and the geological sub-units. It is
used as base for the Layer Modeling.
The first time you open the manager, a pop-up window gives the option to either:
1) build a new stratigraphy from scratch or 2) open an existing one (North Sea or
Simple Reservoir). These two predefined stratigraphy descriptions are stored by
default in a separate format; if edited, the edited version is saved as a classical
stratigraphy description. Once the selection has been made, it is set as default. To
re-access the selection window, click on the icon to create/open a new descrip-
tion.
The user can enter specific information about the project and the different
regional markers of his/her interpretation. This window is organized as units/sub-
units bounded by different stratigraphic markers. Markers are assigned to the
rightmost category of the stratigraphic column. Depending on the user's
description, markers can have the same names as seismic horizons or well markers,
and the units the names of epochs/eras.
To start, the user has two ways to display the stratigraphy tree: the time view and
the tree view. The time view displays the absolute geological time, while the tree
view shows an overview of units/sub-units as leaves.
- 490 -
Stratigraphy window: The time view
1. Regional Markers:
- 491 -
2. Stratigraphic Units:
On the left-hand side of this window, the units are classified so that the top
and base of each unit belong to a certain marker. For the initial unit, right-click on
<Click to Add>; the stratigraphic unit editor will pop up:
In this window, give the unit a name, description, color, age and lithology.
The minimum requirement for creating a new unit is simply to define the name.
- 492 -
To add a lithology: click on "Edit", then give the name and optionally specify the
porosity, then Add as new and click on Ok.
To add a sub-unit, right-click the unit name and select Create sub-unit, and define
it in the same manner as a unit. Description and lithology of the unit can be added
now or edited later.
- 493 -
Stratigraphic unit properties: Properties such as unit/sub-unit description and
lithology can be defined or edited by right-clicking on the unit/sub-unit name and
selecting Properties. A unit/sub-unit specific lithologic name can be entered dir-
ectly into the Lithology field. For lithologies that may occur in multiple units/sub-
units, a lithology can be defined and made universally available by clicking the
Select button next to the Lithology field. In this Select Lithology window, the litho-
logy type can be named, and added to a list that will be made available for all unit-
s/sub-units in this session. (Depending on your Save settings, these lithologies
can be available outside of this session.) These options can also be defined when
the unit/sub-unit is first added.
- 494 -
Save as: The defined stratigraphy can also be saved at different levels, e.g. Sur-
vey levels, OpendTect data level, User level, or Global level. For instance, if it is
saved at Survey level, the stratigraphy will only be available for this survey. Altern-
ately, if it is saved at a higher level, it will not be limited to only the survey in which
it was defined.
This option links the regional markers with stratigraphic units. Right-click on a bound-
ary or unit/sub-unit, then click on Assign marker boundary and select the regional
markers (top and bottom) that are the appropriate boundaries for the unit/sub-unit.
- 495 -
4.5.15.1 Manage Lithologies
The Manage Lithologies window can be launched by clicking on the icon in the
main Manage Stratigraphy window. It allows the user to define the list of lithologies
possibly present in the stratigraphic column. This list is then available when defining
the different units of the stratigraphy. For Layer Modeling, the lithologies listed for
each unit are used in the layer description.
- 496 -
4.5.15.2 Manage Contents
Manage Contents can be accessed by clicking on the icon in the Manage Strati-
graphy window.
This option is used to define a set of fluid contents. Afterwards, fluid(s) from the list
can be assigned to lithologies for each layer when defining Layer properties for
Layer modeling.
- 497 -
4.5.15.3 Layers & Synthetic Modeling
The icon starts the Layer/Synthetics modeling feature.
- 498 -
4.5.16 Manage Wavelets
This window is available from the Survey > Manage > Wavelets... menu and from
the icon. It provides management tools for wavelets. The left panel shows the
available wavelets. The selected wavelet is visualized on the right panel. The stor-
age information of the active wavelet is shown in the lower panel. The following
actions can be performed:
Options:
- 499 -
Alongside the standard 'Manage' options (Rename, Lock, Remove and Set as
Default), you may also, via this window:
Change polarity
Taper a wavelet
The Filter is used to filter the list of objects by name. For instance, to display all
wavelets that start with the letter W, use "W*".
- 500 -
4.5.16.1 Import Wavelet
When clicking on the Import button, the import wavelet dialog box pops up. Please
follow the instructions in Import Wavelet section.
- 501 -
4.5.16.2 Generate Synthetic Wavelets
Generate a wavelet
- 502 -
4.5.16.3 Statistical Wavelet Extraction
Statistical wavelets can be extracted from the seismic data.
The user first needs to choose the input seismic, i.e. a 3D volume or a 2D line.
- 503 -
It is recommended to use a sub-selection of the seismic data, e.g. every 10th
inline/crossline, and to use horizons to guide the extraction. The extract length of
the seismic data should be at least 1 second TWT.
The wavelet length should neither be too small (min. 50 ms) nor too large (max.
200 ms). A rule of thumb is that the first side lobe should be fully contained in the
wavelet.
- 504 -
The extraction is performed using the following workflow:
The output phase rotation cannot be set in the current version. It is being imple-
mented.
- 505 -
4.5.16.4 Rotate Phase
The phase of wavelets can be altered and saved using the following slider:
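In OpendTect the rotation is applied interactively via this slider. As an illustration only, a constant phase rotation of a sampled wavelet can be sketched with the analytic (Hilbert) signal; this is a minimal numpy sketch, not OpendTect's implementation, and the 25 Hz Ricker wavelet below is hypothetical:

```python
import numpy as np

def rotate_phase(wavelet, degrees):
    """Constant phase rotation via the analytic (Hilbert) signal.
    0 deg returns the wavelet unchanged; 90 deg gives its quadrature;
    180 deg flips the polarity."""
    n = len(wavelet)
    spec = np.fft.fft(wavelet)
    h = np.zeros(n)                # build the analytic-signal filter
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(spec * h)
    return np.real(analytic * np.exp(1j * np.deg2rad(degrees)))

# Hypothetical 25 Hz Ricker wavelet, 2 ms sampling
t = np.arange(-0.1, 0.102, 0.002)
ricker = (1 - 2 * (np.pi * 25 * t) ** 2) * np.exp(-(np.pi * 25 * t) ** 2)
rotated = rotate_phase(ricker, 90.0)
```

A 90-degree rotation turns a zero-phase wavelet into its quadrature trace; 180 degrees simply flips the polarity.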
- 506 -
4.5.16.5 Taper a Wavelet in Time or Frequency Domain
A wavelet tapering window is launched by pressing the icon in the wavelet
management window. A wavelet is tapered in the time or frequency domain, depend-
ing on what is selected at the top of the panel (see below).
- 507 -
In the time domain, the selected wavelet is tapered by selecting a tapering percentage
(%), which is set by moving the slider at the bottom of the window left or right.
Additionally, the amplitudes at zero frequency can be muted by checking the mute
zero frequency check box.
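The effect of such a time-domain taper can be sketched as follows. This is a minimal numpy sketch assuming a simple cosine ramp on both ends and DC removal for the zero-frequency mute; the exact taper OpendTect applies may differ:

```python
import numpy as np

def taper_time(wavelet, percent, mute_zero_freq=False):
    """Cosine-taper both ends of a wavelet by the given percentage."""
    w = np.asarray(wavelet, dtype=float).copy()
    n = len(w)
    m = int(round(n * percent / 100.0 / 2.0))  # samples tapered on each end
    if m > 0:
        ramp = 0.5 * (1.0 - np.cos(np.pi * np.arange(m) / m))  # rises 0 -> ~1
        w[:m] *= ramp
        w[-m:] *= ramp[::-1]
    if mute_zero_freq:
        w -= w.mean()  # remove DC so the amplitude at 0 Hz becomes zero
    return w
```

Tapering forces the wavelet smoothly to zero at its edges, which reduces ringing in the frequency domain.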
- 508 -
4.5.16.6 Merge Synthetic Wavelets
Two or more wavelets can be stacked using this option. The wavelets can be
'Normalized' and/or 'Centered' at maximum amplitude/energy.
- 509 -
4.5.17 Manage Wells
The Well Management window can be opened from Survey > Manage > Wells... or
from the icon.
- 510 -
The available wells imported in the project are listed on the left. Wells can be
renamed, locked, removed or set as default with the buttons in the centre of the win-
dow:
Sets as a default
The well track, checkshot and time/depth model can be loaded and/or edited using
the lower left row of icons:
Checkshot Editor
The well log tools can be used to remove spikes, to smooth and to clip the logs for
the loaded wells.
Markers editor
Wells can also be exported for viewing in Google Earth using the icon. It is also
possible to either import or create multiple simple wells from the icon.
When selecting a well on the left list, the loaded logs for this given well are listed
on the right hand side. They can be renamed, removed and exported with the but-
tons on the right of the log list. The unit of measure of the logs can be checked or
changed if needed (this does not affect the log values). The logs themselves can
be moved up/down within the list. For removal and export, multiple logs can be
selected.
- 511 -
Renames the selected log(s)
Logs can be imported from ASCII files (click on Import...) or created using math-
ematical expressions from the well management window (click on Create...).
It is recommended to give logs the same name in all wells. For example, the
master density log should be called RHOB in every well. This enables the selec-
tion of one set of logs in all wells, e.g. for use in the cross-plot tool. Please note that
log names and marker names are case sensitive during multiple selections.
The top Filter is used to filter the list of objects by name. For instance, to display all
wells that start with the letter W, use "W*". This works only with text, not numbers
or symbols.
- 512 -
4.5.17.1 Simple Multi-Well Creation
Multiple wells can be imported or edited through the Import > Well menu. This window
contains editable fields. New wells can be created either by importing them or by
entering the values and names directly. The Read file button can be used to import
an ASCII file containing all well information.
Select the input file (as shown below) and provide the appropriate format definition
settings. To provide the format definition, the selected input file can be examined
by pressing the Examine button. If the file contains the header lines, those lines
can be eliminated by providing the file header information.
- 513 -
The file format definition is provided by pressing the 'Define' button. In the format
definition window, the default 'col:0' values can be modified according to the input
file. When the correct file format is defined, the wells can be imported by pressing
the 'OK' button in the multi-well creation window. By default the wells are loaded with
a constant velocity. The velocity data or the time-depth model can be provided
when importing the time-depth model.
- 514 -
The simple multi-well file can now be imported and displayed after creation:
- 515 -
- 516 -
4.5.17.2 Well Track Editor
This table shows the imported relation between the X and Y coordinates (first two
columns), the TVDSS depths (Z, third column) and the measured depths in the fourth
column. The table is fully editable: double-click on a cell to edit it, type a new num-
ber and press Enter or select another cell. Please note that the other values will be
recomputed to reflect the changes.
The "Update display" button allows the user to update the displayed well track in the
scene based on the modified table content. Optionally, a whole new track can be read
from a file to replace the existing data, as during track import.
During editing, the depths can be displayed in feet. Please note that this flag will be
kept in the survey defaults and will apply in other windows. However, it is only a dis-
play setting and the data on disk will not be affected.
- 517 -
The following window appears after clicking on "Read new". The import settings
are identical to those of the import step.
- 518 -
- 519 -
4.5.17.3 Checkshot and Time-Depth Models
Wells in OpendTect can have two different types of Time/Depth models:
1. An optional checkshot model, often the first available time/depth model or a measured
checkshot survey.
2. The Time/Depth model, that is always the active time/depth model for the well, used
for data extraction and visualization.
Both types have a similar editing window. It shows the mapping between meas-
ured depths and two-way travel time, in the first and second columns respectively.
Depths are displayed either in meters or in feet (toggle at the bottom of the win-
dow), and times are always displayed in milliseconds. These tables are fully edit-
able: double-click on a cell to edit it, type a new number and press Enter or
select another cell. The "Update display" button updates the displayed well data
(track, markers and logs) in the scene based on the actual table content.
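Between the table's control points, values are obtained by interpolation. The MD-to-TWT mapping of such a table can be sketched as follows (a minimal numpy sketch with hypothetical depth/time pairs; OpendTect's internal interpolation may differ in detail):

```python
import numpy as np

# Hypothetical time-depth table: measured depth (m) in the first column,
# two-way time (ms) in the second, as in the edit window.
md_m = np.array([0.0, 500.0, 1000.0, 2000.0])
twt_ms = np.array([0.0, 400.0, 750.0, 1350.0])

def md_to_twt(md):
    """Linearly interpolate two-way time for a given measured depth."""
    return np.interp(md, md_m, twt_ms)
```

A depth halfway between two table rows maps to the time halfway along that linear segment, which is why removing points that lie on the same velocity segment (see the compaction note below) loses no information.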
- 520 -
The following window appears after clicking on "Import" in the edit Check-
shot or Time-Depth model window. The import settings are identical to those of the
import step.
Please note that a user may end up with fewer imported time-depth pairs:
in order to maintain a decent search speed in the imported table, the
software protects against duplicated time and/or depth values in the input
TD table by removing duplicated velocities. There should be no concern
about losing time-depth pairs, as the underlying velocity function is kept,
although potentially converted to a more compact form. This compaction
greatly improves the performance of most search operations done on the
time-depth model. It applies to both OpendTect objects, checkshot and
time-depth model, but in practice checkshot surveys hardly ever have
redundant data points.
The Export button allows the user to export the table in the same format to an output
ASCII file.
- 521 -
- 522 -
4.5.17.4 Manage Markers
Well markers can be manually provided or imported. They can be exported.
Add markers: Markers can be loaded from a file by clicking the Import button. The
following window will then be displayed:
- 523 -
Select the input Ascii file. The main work is to specify the presence of a file header
and the file format definition. The header, if present, can be of fixed length (number
of lines), or delimited on its last line by a keyword.
Predefined and saved file formats are available by pressing the Open icon .
Otherwise the format must be manually specified. The Define button gives access
to the format definition window.
- 524 -
In the format definition window, you must specify the column numbers of the marker
name and depth. Please mind spaces in the marker names, which can break the
fixed column format. For that reason it is recommended to have the depth in the
first column and to specify column 2 as the position of the marker names; all
strings found in column 2 and up will then be used to form the marker names.
Depths can be either measured depths or TVDSS depths. Data loading can be
stopped at a specific line by providing the adequate keyword.
It is recommended to save the format definition for later use and QC by clicking
on the Save icon. In the pop-up window, write the name of the format and store it.
The format can be stored at different levels (All surveys, Current survey, Current
OpendTect user level) depending on the usage. Press Ok when done.
Update display: After adding and/or editing the markers, they can be refreshed in
the main display by clicking on the Update display button.
Export markers: Finally, the edited markers may be saved to a new file/location
by clicking on the Export button.
- 525 -
4.5.17.5 Logs Import
Logs can be imported from the Well Manager window.
Import Logs: The file should be in LAS format, with either MD or TVDSS. Altern-
atively, the log files can be pseudo-LAS, meaning LAS (with one line of data per
depth value) with the header replaced by a one-line definition: "Depth Gamma
Sonic" etc. (without quotes). Log names should be separated by blank characters
(space or tab). For both LAS and pseudo-LAS, the following units can be recog-
nized; the recognition process is case insensitive.
Once the file has been selected all recognized logs will be listed in the Select logs
section. Only the highlighted logs will be imported. Be careful that two logs do not
have the same name. The depth interval can be limited to a subrange. The start
depth, stop depth and step written in the LAS files are not used; instead the depths
found on the same line as the amplitudes will be used.
- 526 -
In pseudo-LAS, units should follow directly behind the log name in parentheses,
e.g. Depth(ft) Density(g/cc). Below are examples of text strings that will match units:
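As a sketch of how such a one-line pseudo-LAS definition splits into log names and units (illustration only, not OpendTect's actual parser):

```python
import re

def parse_pseudo_las_header(line):
    """Split a one-line pseudo-LAS definition such as
    'Depth(ft) Density(g/cc) Gamma' into (name, unit) pairs.
    The unit in parentheses is optional; None means no unit given."""
    pairs = []
    for token in line.split():
        m = re.match(r"([^(]+)(?:\(([^)]*)\))?$", token)
        pairs.append((m.group(1), m.group(2)))
    return pairs
```

Log names must be separated by blanks, so a name itself cannot contain spaces; the unit string is whatever sits between the parentheses.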
- 527 -
4.5.17.6 Logs Creation
Logs can be created from log-log computations. Select one or more wells and click
on 'Create' in the well management window to open the log creation window as
shown below:
- 528 -
There is an inbuilt list of functions which can be used for creating new logs. This is
supplemented by a Rock Physics library containing more advanced resources.
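As an illustration of the kind of log-log computation performed here, a density and a velocity log can be combined into an acoustic impedance log. The sample values below are hypothetical; in the window itself you would type an expression such as Density * Velocity using the actual log names:

```python
import numpy as np

# Hypothetical input logs, sampled on the same depth array.
den_gcc = np.array([2.2, 2.3, 2.4])           # density, g/cc
vel_ms = np.array([2500.0, 2700.0, 3000.0])   # P-velocity, m/s

# A typical created log: acoustic impedance = density * velocity.
ai = den_gcc * vel_ms
```

The computation is applied sample by sample, so all input logs must be defined on the same depth samples.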
- 529 -
The same syntax as for the mathematics attributes should be used, with the following
changes:
- 530 -
4.5.17.6.1 Rock Physics Library
A number of in-built advanced rock physics formulas can be selected to create vari-
ous types of logs from available well log data.
The type of output log quantity is chosen from the Property Type list (e.g. Density,
Velocity and Pressure). Afterwards, a specific formula out of a number of possible
alternatives can be chosen to compute the required log quantity. The choice of
equation depends on several factors, such as the type of available log quantity and
the region (e.g. a rock physics equation might work well in the Gulf of Mexico but
not in the North Sea). After this selection has been made, the formula will be
displayed along with a short description below it.
- 532 -
The standard values for the variables of the selected formula are also displayed.
It is extremely important to keep in mind that the input log quantities (e.g. Sonic)
MUST be converted into particular units for the formula to work. These units are
already selected by default and should NOT be changed. The same applies to the
output unit of measure.
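As an example of why the preset units matter, Gardner's well-known relation estimates density from P-velocity, and its 0.31 coefficient is only valid with Vp in m/s. This is a sketch of the published relation, not necessarily the exact formula in the Rock Physics library:

```python
def gardner_density(vp_ms):
    """Gardner's relation: density (g/cc) from P-velocity.
    The 0.31 coefficient assumes Vp in m/s; feeding it ft/s or raw
    us/ft sonic values without conversion gives nonsense, which is
    why the default input units should not be changed."""
    return 0.31 * vp_ms ** 0.25
```

With Vp in ft/s the published coefficient changes (0.23 instead of 0.31), illustrating that formula and units form a single package.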
- 533 -
4.5.17.7 Logs Export
All or a selection of logs can be exported to an output text file. Select the input well,
then select the logs to be exported, and click on the export icon.
The logs can be exported with respect to MD and TVDSS depths, optionally also
with X/Y or inline/crossline positions. The depth range and step specify the
regular array onto which the input logs are interpolated prior to export. The output
file is a column-sorted ASCII file.
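The interpolation onto the regular depth array can be sketched as follows (a minimal numpy sketch with hypothetical log values):

```python
import numpy as np

# Irregularly sampled input log (hypothetical measured depths and values).
md_in = np.array([1000.0, 1000.3, 1001.0, 1002.1])
log_in = np.array([80.0, 82.0, 90.0, 95.0])

# Regular output array defined by the chosen depth range and step.
md_out = np.arange(1000.0, 1002.01, 0.5)
log_out = np.interp(md_out, md_in, log_in)
```

Each exported row then holds one regular depth and the interpolated value of every selected log at that depth.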
- 534 -
4.5.17.8 Export Well Surface Positions to Google Earth
The well locations (surface coordinates) can be exported to Google Earth using the
icon. In the pop-up window, make a selection of wells (CTRL + left-click to select
several wells) to be exported, and specify a filename for the kml file to be created.
Press Ok, and open this file in Google Earth.
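A .kml file of this kind is plain XML. The sketch below shows a minimal document of the sort such an export produces; the well name and coordinates are hypothetical, and the actual file written by OpendTect contains more styling:

```python
def wells_to_kml(wells):
    """Build a minimal KML document from (name, lon, lat) surface
    positions. Survey X/Y coordinates must be converted to lon/lat
    before they can be written to KML."""
    placemarks = "".join(
        "<Placemark><name>{0}</name><Point>"
        "<coordinates>{1},{2},0</coordinates></Point></Placemark>".format(n, lon, lat)
        for n, lon, lat in wells
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2">'
            '<Document>' + placemarks + '</Document></kml>')
```

Opening such a file in Google Earth places one pin per well at its surface location.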
- 535 -
4.6 Pre-load
In OpendTect, the user can pre-load seismics or horizons. The advantage is faster
display times in the scene. Your system must have sufficient memory to store the
pre-loaded data.
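A quick way to check whether enough memory is available is to multiply the volume dimensions by the sample size. This is a rough sketch with hypothetical dimensions; actual storage overhead may add to it:

```python
def preload_size_gb(n_inl, n_crl, n_samp, bytes_per_sample=4):
    """Rough memory footprint of pre-loading a 3D volume.
    Assumes one value per sample and 4 bytes for 32-bit float storage."""
    return n_inl * n_crl * n_samp * bytes_per_sample / 1024.0 ** 3
```

Pre-loading the same volume in a smaller format (e.g. 16-bit integers, see Load as below) halves this figure.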
- 536 -
4.6.1 Preload Seismics
The pre-load seismic data functionality was introduced in OpendTect to improve
working efficiency with large volumes. A common practice is to add an
inline/crossline/2D line in the tree and display it in the scene. Each time the line
is displayed, the stored volume is read; therefore, seismic volumes that are
routinely used can be pre-loaded. Pre-loading other seismic data, such as an
attribute cube, likewise improves the efficiency of displaying the data in the scene.
Seismic data can be pre-loaded in OpendTect by going to Survey > Pre-load >
Seismics...
Clicking on Seismics..., the pre-load manager pops up to allow the user to Add, i.e.
select the data to pre-load.
- 537 -
Additionally, after selecting data to pre-load, a user can optionally save the set-
tings for later use. These settings can be opened when needed.
- 538 -
Note: The functionality is available for all stored seismic (2D/3D-Pre/Post) data in
the relevant seismic data manager interface.
Pre-load 2D 3D data
Load as: Data can optionally be pre-loaded in another format (e.g. to reduce RAM
usage).
Scale values: Choosing a different format may require scaling the values to
preserve the dynamic range.
- 539 -
4.6.2 Preload Horizons
For fast visualization, multiple horizons (2D/3D) can also be loaded into memory.
Whenever a pre-loaded horizon is displayed in the scene, no time is spent reading
the file from disk, which improves the visualization speed. Press the 'Add...' button
to select the desired horizons to be loaded into memory. To unload some horizons,
select them first and press the 'Unload selected' button.
The Save and Open buttons are used to store/open the pre-loaded horizon
setup for later use.
- 540 -
4.6.3 Preload HorizonCube
For fast visualization, a HorizonCube (2D/3D) can also be loaded into memory.
Whenever the pre-loaded HorizonCube is displayed in the scene, no time is spent
reading the file from disk.
- 541 -
5 Analysis
5.1 Attributes
In OpendTect, seismic attributes are calculated/evaluated using the Attribute Set
window. In this window, many single/multi-trace, pre/poststack, dip-steered/non-
dip-steered attributes are available. It also contains special filters (e.g.
Gap decon, Frequency filters, dGB special filters etc.). The attributes are explained
individually in Appendix A.
- 542 -
5.1.1 Attribute Set Window
The attribute set window contains a set of seismic attribute definitions to be eval-
uated/calculated. While defining the attributes it is possible to work in the active
scene. Attributes can also be calculated after saving the attribute set. In a broad
sense, the following workflows are applicable in the OpendTect attribute calculation
process (on sections and horizons):
l Evaluate attribute
l On-the-fly attribute calculation
l Creating an attribute output (2D/3D)
- 543 -
OpendTect works with the concept of an "active" attribute set. At start-up, there is
no active attribute set. To create a new one (New set...) or to select an existing set
(Open set...), select the corresponding option from the File menu (see below).
OpendTect is also delivered with a collection of default attribute sets for general
testing (Fault, Chimney and Salt default attribute sets). Such a set can be selected
from the Default set option under the File menu. To use a default set, the input seis-
mic data and a SteeringCube (if steered attributes are present in the default set)
have to be selected.
Clicking any attribute in the list will show its parameter settings. Notice that
OpendTect uses SI units. For details on each of the attributes see Appendix A.
Note that some of the parameter options depend on whether you are using 2D or
3D data as input. For example, for 2D data the inline and crossline stepout fields
are replaced by a single trace stepout field. Generally, an attribute set can contain
only 2D attributes or only 3D attributes; mixed attribute sets are not possible.
When parameters of an attribute are updated, the modified attribute can be added
to the attribute set with a new Attribute name by clicking Add as new. Clicking on
any other attribute in the list means that the updated parameters are accepted,
while keeping the original attribute name. The Revert changes button only reverts
changes to the original state before clicking on another attribute in the set. When
Ok is pressed, the (updated) attribute set becomes the "active" attribute set. The
attribute set is saved to disk when Save on Ok is ticked. To save an attribute set
under a different name, use the corresponding option under the File menu.
- 544 -
Allows for the creation of a new attribute set
Open one of the default attribute sets (provided within the OpendTect package)
Save as...
File - Change input... can be used to change the input data of all attributes in the
"active" set simultaneously, which is useful in case, for example, a new seismic
volume has become available.
It is possible to have an attribute set already open at start-up using the "Auto
load Attribute Set" option in the File menu. This enables the user to choose the
attribute set which will be active the next time the survey is opened.
- 545 -
5.1.2 Attribute Set Toolbar
The attribute set toolbar comprises the following icons.
Also accessible via the File menu, these are, from left to right:
l New set clears the window to create a new attribute set. The attribute set name can
be specified when saving it (press OK, or select File - Save set menu option).
l Open set opens a previously saved set in the current project (from the directory
Attribs/).
l Open default attribute set. Filenames for input data must be re-specified.
l Import attribute set from another survey. Filenames for input data must be re-specified.
l Reconstruct set from job
l Save set saves the "active" attribute set in the Attribs/ directory of the current project.
l Save as set saves the "active" attribute set in the chosen folder
l Redisplay element with current attribute is used for direct display of the selected attrib-
ute on the active display element (shown in reverse video in the tree). The main
graphic interaction buttons and options remain active while the attribute set window is
open, so the active element can be changed. However no new element can be added
to the tree
l Evaluate Attribute allows the automatic variation and evaluation of attributes and
attribute parameters. If you have an "active" attribute and the current display element
is a slice (inline, crossline or Z slice) or a horizon, a new window will pop up where it
can be specified how to vary the parameters of the displayed attribute. For example,
Spectral Decomposition:
- 546 -
Here six slices are created, with time gates of [-4,4], [-8,8] etc. Use the slider to
move through all the slices. When an attribute has been evaluated on a surface,
the parameter can be updated by clicking on Accept. Enable this by checking Store
slices on Accept.
As shown above, the "Evaluate Attributes" set contains a general selection of vari-
ous attributes. It is intended as a guide for a scan through the wide range of
different attributes and may be a starting point for an effective attribute ana-
lysis.
For more information on Evaluate Attribute, watch the tutorial on the OpendTect
YouTube Channel.
- 547 -
l Crossplot attributes allows the user to crossplot attributes from the current attribute
set and saved volumes. Multiple attributes can be selected. The attribute values are
extracted at picked locations (see how to create a new pointset). Once the attribute
values are calculated, a crossplotting table is generated and crossplot(s) can be created.
- 548 -
5.1.3 Auto-Load Attribute Set
By default, no attribute set is loaded at start-up. This can be overruled by
selecting a specific attribute set to be auto-loaded each time the OpendTect
window is started. This can be set from the attribute set window under the
File > Auto Load Attribute Set sub-menu. If selected, it will launch the auto-load
attribute set window. Selecting Yes will show the list of attribute sets that can be
auto-loaded. Select one attribute set and press the Ok button. This saves the set-
tings, and the next time OpendTect is started the selected attribute set will be
auto-loaded. This is useful when the same attribute set needs to be evaluated
and updated at different stages of a project.
- 549 -
Load Now will directly load the selected attribute set. If not selected, the attribute
set will be loaded the next time the survey is opened.
- 550 -
5.1.4 Default Attribute Sets
OpendTect is provided with "Default attribute sets" to get you started. When selecting
a default attribute set, a window appears in which to select the correct input volume(s)
and the correct SteeringCube (see images below). These attributes (except "Evaluate
Attributes") require the following dGB plugins:
This OpendTect version comes with new default attribute sets in addition to
the already existing attribute sets like NN ChimneyCube, NN SaltCube, Unsu-
pervised Waveform Segmentation, dGB Evaluate Attributes, etc.
l Evaluate Attributes: This default attribute set contains the default definitions of sev-
eral basic attributes grouped together to give an idea of attributes evaluation in
OpendTect. This default attribute set can be selected to start with OpendTect. After
selection, only input seismic data is required.
l dGB Evaluate Attributes: This default attribute set is similar to the above attribute
set but with additional dGB attributes (using dGB plugins). For this set, both seismic
and steering data are required as input.
l AVO attributes from Intercept-Gradient: This attribute set requires two inputs: Inter-
cept and Gradient (from AVO analysis) as first and second inputs respectively. It com-
putes attributes like Envelope, Fluid Factor and Rp-Rs.
l AVO attributes from Near-Far: The input for this attribute set are the Near and Far
stacked data sets in the same order. This includes attributes like Envelopes and
Enhanced pseudo gradients.
l Dip-steered median filter: This default attribute set contains the definition of dip-
steered median filter. It cleans up the seismic data by removing random noise. Both
seismic and steering data are required as input.
l Dip-steered Diffusion Filter: This filter is mainly used to sharpen faults. Both seismic
and steering data are required as input.
l Fault Enhancement Filter: This type of filter is used in fault/fracture analysis; it
dramatically sharpens the faults by suppressing random noise. It is a combination of
the diffusion filter and the dip-steered filter. Both seismic and steering data are
required as inputs.
l Fault Enhancement Filter (Expert): This is a more sophisticated version of the basic
Fault Enhancement Filter and uses similarity and dip-steered filtering. It also requires
both seismic and steering data as input.
- 551 -
l Ridge Enhancement Filter: This filter detects lateral lineaments using different
steered similarities (in inline, crossline and diagonal directions)
l Ridge Enhancement Filter (Expert): This filter is an advanced version of the above
described filter and uses steered similarities in addition with their second derivatives
(in inline, crossline and diagonal directions).
l NN Fault Cube: dGB standard default attribute set containing the definitions of all
attributes that are used in neural network (NN) training to create meta-attribute i.e. NN
Fault Cube.
l NN Chimney Cube: dGB standard default attribute set containing the definitions of
all attributes that are used in neural network (NN) training to create ChimneyCube
(meta-attribute).
l NN Salt Cube: dGB standard SaltCube meta-attribute.
l NN Slump Cube: dGB standard SlumpCube meta-attribute.
l Unsupervised Waveform Segmentation: attribute set containing the definition of
attributes that are used in unsupervised waveform segmentation (a.k.a UVQs).
l Seismic Filters Median-Diffusion-Fault-Enhancement: This is an advanced ver-
sion of the "Fault Enhancement Filter". It enables the user to have much control on
the input parameters by modifying the parameters of the dip-steered median filter, dip-
steered diffusion filter and fault enhancement filter.
l Fault Enhancement Attributes: expandable attribute set containing the list of the
attributes that are useful for fault visualization and fault interpretation.
l NN Fault Cube Advanced: the most advanced FaultCube (meta-attribute) attribute
set, used as input for neural network training to create a fault probability cube.
- 552 -
Default attribute sets window containing the list of all available default attributes
- 553 -
When one of these default attribute sets has been selected, a window pops up
(see image below) to select the input seismic and, optionally, a SteeringCube (the
attribute sets based on AVO analysis, the Fault Enhancement Filter and the Ridge
Enhancement Filter require inputs as outlined in their respective descriptions).
5.1.5 Input Selection
Every attribute requires input data. Both stored data and already defined attributes
can be used as input to a new attribute. In other words, attributes can be embed-
ded. However, circular references are not possible.
5.1.5.1 Input Selection for 3D Attribute Sets
Select from the stored data or from the list of defined attributes in the "active" attrib-
ute set.
In case the input data is multi-component, it is possible to choose from the avail-
able components as shown below (or include ALL).
The Filter field allows the user to quickly find the right input, e.g. typing *S will
look for all attributes/cubes starting with S, like Similarity.
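As a sketch of this kind of wildcard filtering, Python's fnmatch module implements glob-style matching (the item names below are hypothetical, and note that in standard glob syntax a "starts with S" pattern is written S*):

```python
from fnmatch import fnmatch

# Hypothetical stored cubes / attributes in a survey
items = ["Similarity", "Seismic", "Dip", "Energy", "SteeringCube"]

# Keep every entry whose name starts with S, as in the Filter field
matches = [name for name in items if fnmatch(name, "S*")]
print(matches)  # ['Similarity', 'Seismic', 'SteeringCube']
```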
5.1.5.2 Input Selection for 2D Attribute Sets
Select from the stored data or from the list of already defined attributes in the cur-
rent attribute set.
If the selected stored data set is multi-component, the user will get an option to
choose which component to select as input data:
5.1.6 Import an Attribute Set from File
Attributes are primarily stored in attribute set files with the extension .attr. Attribute
definitions can also be found in the parameter files of a processing job when an attribute
was used to process a volume or data set.
It is possible to import the attribute set from an attribute file via the menu: File >
Import set from file. Existing attributes are stored in the Attribs folder of each survey.
Optionally, attributes from another survey may also be imported: File > Import set.
It is also possible to re-create the attribute set from an existing processing file via the
menu: File > Reconstruct set from job file...
Existing jobs are stored in the Proc folder of each survey, with the extension .par.
There are two options available to reconstruct the attribute definitions: from an
existing par-file or from a created cube file. In the first case (from par-file), select the
input parameter (*.par) file. In the second case (find from created cube), another
window pops up in which the input volume and the corresponding parameter file are
selected; the file name is found automatically.
Pressing 'Find from created cube...' opens a search/select window that finds the
attribute set from an existing (created) cube, which was calculated inside OpendTect.
When importing, new input volumes must be selected to replace the references
stored in the input files.
5.1.7 Calculate Attributes
The attribute evaluation process offers several key options. For instance, Evaluate
Attribute is an intermediate (but not mandatory) step to quickly analyze the different
parameters of any attribute within the working environment (View tutorial: Evaluate
attributes). Similarly, the user can create a list of seismic attribute definitions as a
working set that can later be updated. The attribute set is then used to calculate
the seismic attributes along lines/surfaces.
There are two possible ways of calculating seismic attributes in OpendTect: first,
the results of any attribute can be calculated on-the-fly in the foreground; second,
attributes can be calculated after evaluation by running a secondary process in the
background. In OpendTect, seismic attributes can be applied to several elements
(inlines, crosslines, Z-slices, random lines, 2D lines, volumes, horizons, etc.).
1. Define (or use an existing) attribute set and save it. For details, see the earlier sections
of this chapter.
2. Calculate on-the-fly, or Create Seismic Output, or Create Horizon Attribute output.
3. If the attribute is not calculated on-the-fly, retrieve the results by displaying the attribute
in the tree.
An example of the first step is given in the following figure. It highlights the sequential
process (note the green arrows from left to right) of on-the-fly attribute calculation.
First, several attributes are defined. Second, by default, when the user presses the OK
button in the Attribute Set window, the Save Attribute definition window appears, to
save the attribute definitions as an attribute set. An attribute can then be applied to an
inline (for instance) by adding a blank attribute (right-click on the inline number). Right-
click on the blank attribute and select the attribute (Select Attribute > Attributes >
"User Attribute"). The listed attributes are those defined in the Attribute Set window.
Selecting one starts on-the-fly attribute calculation. By following the same workflow
(as elaborated in the figure), the same attribute can be calculated along other elements
(e.g. crosslines, Z-slices, volumes, etc.).
Schematic flow of on-the-fly seismic attributes evaluation on an inline.
1. Some attributes can take considerable time during on-the-fly calculation.
How much time depends on the number of calculation steps the attribute requires:
multi-trace attributes (e.g. Similarity) normally take more calculation time than
single-trace (instantaneous) attributes. Similarly, attributes with steering normally
take more time to calculate. Each time the attribute is displayed in the scene (as
shown above), it is recalculated in the foreground. Once the user is satisfied with
the attribute results, this can be avoided by creating seismic outputs in the back-
ground (see the Create Seismic Output and Create Horizon Output sections). This
will also help to restore saved sessions quickly.
An example of the second step is shown in the following figures. An attribute can
be calculated along a horizon by following the same steps described above for
inserting and displaying the attribute (as shown below). In this example, Similarity
is calculated on-the-fly along a horizon. This attribute normally takes time
(depending on the number of traces involved), so the user can benefit from saving
the on-the-fly results for later retrieval.
In order to save the calculated attribute as horizon data, right-click on the attribute
and select Save as Horizon Data.... In the pop-up window, edit the name accordingly
and press OK. This saves the attribute as horizon data, which can be managed later
in the horizon management window (see Manage Horizons).
Result of the calculated similarity attribute; saving the horizon attribute as horizon data.
Attributes stored along a horizon can be retrieved as horizon data. Right-click on the
horizon and add a blank attribute. Right-click on the newly inserted blank attribute
and locate the Horizon Data item in the attribute sub-list (as shown below). In the
horizon data selection window, select the desired attribute. This will display the
selected attribute in the scene.
Retrieving the stored horizon data (attribute) of a horizon.
5.2 Volume Builder Setup
The volume builder setup is used to apply volume-based operations, unlike
attributes, which work trace-by-trace. The setup is launched via the Analysis >
Volume Builder menu. It is a very useful tool for gridding velocities or
other rock properties.
The volume builder setup window contains several available steps that are applied
sequentially to generate a volume. Any particular step can be selected from the
Available steps by double-clicking on it. The later steps may replace the earlier
ones, therefore care must be taken when ordering and setting up the workflow.
Once your workflow is defined, you will need to save the setup. This is done by
entering a name for it.
The computation and storage of a volume processing setup can be found under
OpendTect's Processing menu: Processing > Create Seismic Output > Volume
Builder...
5.2.1 Body Shape Painter
The body shape painter is used to define the inside and outside values for an
OpendTect body. The options for 'Inside Fill Type' are shown in the image below.
The same options are available for 'Outside Fill Type':
Fill Types:
Previous step: takes the values of the previous step in the Volume Builder setup
(e.g. a stored volume):
One example of use: if one wants to create a salt velocity cube, the values inside
can be filled with a salt velocity and the outside can be set to the previous step.
If no other step exists, the undefined value is written.
5.2.2 Lateral Smoother
Lateral smoothing applies a rectangular two-dimensional smoothing filter to the
volume.
Average filtering is done in the frequency domain by applying a 2D FFT to the Z
slices. This requires a rectangular dataset, while the input can be irregular.
The filtering type can be either "Median" or "Average", where the average can
optionally be "Weighted". Positions without data or with undefined values are first
replaced using the options "Mirror edges" and "Undefined substitution". The
"Undefined substitution" can be done by taking the Average of the defined points,
a Fixed value (one value needs to be specified which will be used everywhere),
or by using Interpolate between the defined points.
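The undefined substitution and median filtering steps can be sketched in one dimension (a simplified analogue of the per-Z-slice 2D filtering; the function names are illustrative, not OpendTect code):

```python
from statistics import median, mean

def substitute_undefined(values, mode="average"):
    """Replace None entries before filtering, analogous to the
    'Undefined substitution' option (sketch only)."""
    defined = [v for v in values if v is not None]
    fill = mean(defined) if mode == "average" else 0.0  # 0.0 stands in for a 'Fixed value'
    return [fill if v is None else v for v in values]

def moving_median(values, half_width):
    """Median filtering over a sliding window, the 1D analogue of the
    2D median filter applied per Z slice."""
    out = []
    for i in range(len(values)):
        lo, hi = max(0, i - half_width), min(len(values), i + half_width + 1)
        out.append(median(values[lo:hi]))
    return out

slice_values = [2.0, None, 2.0, 9.0, 2.0, 2.0]   # 9.0 is a spike, None is undefined
filled = substitute_undefined(slice_values)
print(moving_median(filled, 1))  # the spike at index 3 is removed by the median
```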
5.2.3 Smoother
The smoother step applies a three-dimensional smoothing operator by specifying
In-line, Cross-line and Vertical (ms) stepouts for 3D data, or a two-dimensional
smoother with Trace and Vertical (ms) stepouts in the case of 2D.
Various operator shapes can be chosen (e.g. Hamming) and can be visualized by
pressing on the View button. The CosTaper also requires specification of a "Taper
Length (%)".
5.2.4 Horizon-Based Painter
The horizon-based painter is used to create a model between two surfaces. The
initial top and bottom values must be filled in as input. The intermediate values are
interpolated relative to the survey or a horizon. In this window, horizons have to
be selected as top/base values. The slope type defines the slope used in the
interpolation.
The painted velocities are referenced to a specific time. This time can either be
constant (user-defined) or retrieved from a horizon, not necessarily one of the
horizons defining the limits of the body. For example, an intermediate horizon could
be used.
Velocities are then painted from that reference time. The velocity must be provided
as a velocity/gradient pair. The values are, once again, either user-defined or
extracted from surface data (a grid) attached to a horizon.
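Assuming a linear velocity model, painting from the reference time can be sketched as follows (the linear form and the numbers are assumptions for illustration; the exact model OpendTect uses is not specified here):

```python
def painted_velocity(t, t_ref, v0, gradient):
    """Velocity painted from a reference time using a velocity/gradient
    pair (linear model assumed; a sketch, not OpendTect code)."""
    return v0 + gradient * (t - t_ref)

# Hypothetical pair: 2000 m/s at t_ref = 1.0 s, gradient 500 (m/s)/s
print(painted_velocity(1.5, 1.0, 2000.0, 500.0))  # 2250.0
```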
5.2.5 Velocity Gridder
Gridding creates a volume out of a sparsely sampled dataset. The input
source MUST be tagged with a velocity type: the gridding is applied to
the time-depth relation held by the velocity source, not to the amplitudes of the
velocity source. This preserves the time-depth relation and the blockiness of interval
velocity models. The gridded time-depth relation is converted back to the input
velocity type in the output volume.
However, any other data type can be gridded by this module (Thomsen para-
meters, temperatures, ...). These other types must be tagged as delta, epsilon or
eta before being used for gridding. The functions will be vertically interpolated
using a linear 1D interpolation before the lateral gridding.
Two interpolation methods are available for gridding: inverse distance interpolation
or triangulation. The first method is designed for the interpolation of sparse datasets,
while the second should be preferred if the input exists on a regular (but
coarse) grid. In general, gridding is always followed by some filtering.
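A minimal sketch of inverse distance interpolation at one grid node (the weighting exponent and sample values are assumptions for illustration; OpendTect's exact weighting scheme is not documented here):

```python
def inverse_distance(points, target, power=2.0):
    """Inverse distance weighted estimate at 'target' from sparse
    (x, y, value) samples. Sketch of the first gridding method."""
    num = den = 0.0
    for (x, y, value) in points:
        d2 = (x - target[0]) ** 2 + (y - target[1]) ** 2
        if d2 == 0.0:
            return value          # exact hit on a sample location
        w = 1.0 / d2 ** (power / 2.0)
        num += w * value
        den += w
    return num / den

samples = [(0.0, 0.0, 1000.0), (10.0, 0.0, 2000.0)]
print(inverse_distance(samples, (5.0, 0.0)))  # midway between samples, ~1500.0
```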
The input data can be either a (coarse, SEG-Y imported) volume, stored (ASCII)
functions, or velocity points (requires the Velocity Model Building plugin).
Clicking on the 'Properties' button will allow you to change the selected input for
this step:
Velocity volumes have to be tagged to recognize their type.
Tip: If you have no velocity volume available, press 'Add' in the 'Edit Step' window
and then, in the window that pops up (shown below), press 'Create':
Here you may tag a volume with a velocity type, so that it can be used as input for
the gridding step. Or change the tag that the volume has. This can be useful for
interpolation of velocity cubes (for example, we strongly advise against trying to
interpolate Vint or Vrms, but Eta-tagged volumes can be interpolated):
5.2.6 Input Volume
This step is in general used to provide a background volume or 2D data attribute
before using spatially constrained steps, for instance before the body shape
painter or horizon-based painter.
5.2.7 Voxel Connectivity Filter
The Voxel Connectivity Filter is a special tool to create continuous bodies based on
the amplitudes in a stored volume. A 'voxel' is defined as the volume around one
sample. It is thus linked to the survey bin size and sampling rate.
This volume builder step must be preceded by a step providing the necessary
input data, like "Stored volume".
This volume builder step implies a volumetric calculation. The result of the applic-
ation on a single inline will differ from the result of the application to the whole
volume.
The voxel connectivity filter has a number of parameters to set:
Keep: Specifies the part of the input dataset used to compute the bodies, based on
their amplitudes.
l Values more than: The envelope of the amplitudes higher than the given value defines
the bodies to be computed. Example: 0 will select all positive amplitudes.
l Values less than: The envelope of the amplitudes lower than the given value defines
the bodies to be computed. Example: 0 will select all negative amplitudes.
l Values between: The envelope of the amplitudes inside the given range defines
the bodies to be computed. Example: 9000, 14000 will select all values in
between, like 12000.
l Values outside: The envelope of the amplitudes outside the given range defines
the bodies to be computed. Example: -10000, 10000 will select all values
lower than -10000 or larger than +10000 (the extremes).
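A minimal sketch of the four 'Keep' options (plain Python, not OpendTect code; the amplitude values are illustrative):

```python
def keep_mask(values, mode, v1, v2=None):
    """Sketch of the 'Keep' options of the Voxel Connectivity Filter:
    True marks a sample that belongs to a body."""
    if mode == "more":
        return [v > v1 for v in values]
    if mode == "less":
        return [v < v1 for v in values]
    if mode == "between":
        return [v1 <= v <= v2 for v in values]
    if mode == "outside":
        return [v < v1 or v > v2 for v in values]
    raise ValueError(mode)

amps = [-12000, -500, 0, 12000, 9500]
print(keep_mask(amps, "between", 9000, 14000))   # only 12000 and 9500 pass
print(keep_mask(amps, "outside", -10000, 10000)) # only the extremes pass
```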
Connectivity: Selects the method used to connect different voxels when com-
puting the bodies. Each sample in the input volume acts like a seed.
l Common Faces (6 neighbours): The propagation is done by strictly using the 6 faces
adjacent to the current seed.
l Common Edges (18 neighbours): The propagation is done by using the 6 faces and
the 12 edges adjacent to the current seed.
l Full (26 neighbours): The propagation is done in all directions, using the 6 faces, 12
edges and 8 corners adjacent to the current seed. This is the default mode.
The easiest way to visualize the connectivity is to imagine the reference voxel as
the central voxel in a 3x3x3 cube, such as the red one in the first image. The
second image then shows the Face, Edge and Full (corner) connections:
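The three connectivity modes can be enumerated from the 3x3x3 neighbourhood; this sketch simply counts the non-zero components of each offset (faces touch in one axis, edges in two, corners in three):

```python
from itertools import product

def neighbour_offsets(mode):
    """Generate the 3D voxel neighbour offsets for the three
    connectivity modes, relative to a central voxel (a sketch)."""
    offsets = []
    for d in product((-1, 0, 1), repeat=3):
        n_nonzero = sum(1 for c in d if c != 0)
        if n_nonzero == 0:
            continue  # the central voxel itself
        if mode == "faces" and n_nonzero == 1:
            offsets.append(d)
        elif mode == "edges" and n_nonzero <= 2:
            offsets.append(d)
        elif mode == "full":
            offsets.append(d)
    return offsets

print(len(neighbour_offsets("faces")))  # 6
print(len(neighbour_offsets("edges")))  # 18
print(len(neighbour_offsets("full")))   # 26
```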
Keep bodies larger than [voxels]: Defines the minimum number of voxels
required to output a body. All bodies are computed in a first pass; the
smallest bodies are then dismissed. The minimum allowed is one.
Keep output: The following value(s) will be output on the samples inside the com-
puted bodies:
Body-size rank: The output value is an integer with a constant, different value for
each body. The values are sorted by decreasing body size, starting at zero: 0 is the
largest body, 1 the second largest...
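The ranking can be sketched with hypothetical body sizes (the names and counts are invented for illustration):

```python
# Hypothetical body sizes in voxels
sizes = {"A": 120, "B": 560, "C": 45}

# Rank by decreasing size: 0 = largest body, 1 = second largest, ...
rank = {name: i for i, (name, _) in
        enumerate(sorted(sizes.items(), key=lambda kv: -kv[1]))}
print(rank)  # {'B': 0, 'A': 1, 'C': 2}
```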
The example below is created using a similarity attribute to locate faults and frac-
tures in a volume. It is set up to create bodies connecting low similarity values
(threshold of 0.5). All values above this threshold are ignored. Furthermore, very
small bodies (size < 10 voxels) are also ignored.
It shows a number of connected bodies (purple being the largest) in a volume.
Such a result can directly show which faults are connected and which are not.
Visualizing such a VCF result can be a valuable method for direct interpretation.
Body-size: The output value is the size, in number of voxels, of each body. This
gives an approximation of the real-world volume when multiplied by the voxel size.
For example, a body of 2500 voxels (10 inlines, 50 crosslines, 5 samples), with a
bin size of 25m x 12.5m, at 4ms sampling with a constant velocity of 2000 m/s: Vol =
2500 * 25 * 12.5 * (2000*0.004/2) = 3,125,000 m3...
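The volume estimate can be reproduced with a short computation (a sketch; body_volume_m3 is an illustrative helper, not an OpendTect function):

```python
def body_volume_m3(n_voxels, inl_spacing, crl_spacing, dt_s, velocity):
    """Approximate real-world body volume: voxel footprint times the
    depth interval spanned by one two-way-time sample."""
    dz = velocity * dt_s / 2.0          # one TWT sample -> depth thickness (m)
    return n_voxels * inl_spacing * crl_spacing * dz

# 2500 voxels, 25 m x 12.5 m bins, 4 ms sampling, 2000 m/s
print(body_volume_m3(2500, 25.0, 12.5, 0.004, 2000.0))  # 3125000.0
```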
In this second example, the same volume is processed for Body-size. It shows
the same patterns, suggesting that the prediction is identical to the earlier result.
However, the voxels are filled differently: here the same bodies are labelled by
their volume in cubic meters (m3).
Generally speaking, areas of higher fault/fracture density allow greater con-
nectivity between bodies. The example below shows this case.
l Value: The output value is a user-defined value specified in the "Kept value" field
underneath.
l Transparent: The output value is taken from the amplitude in the input volume.
Rejected output: The value outside the computed bodies can be either the
undefined value or a user-defined value specified by the field "Rejected value"
underneath.
Name for this step: Provide a user-defined name for this volume builder step that
will appear in the Used-steps list of the Volume builder.
5.2.8 Well Log Interpolator
This gridding step is used to populate a 3D volume using well logs by interpolating
along Z-slices.
Vertical Extension: Select the method of vertical extension from the following
options:
l Log extension if needed: Extend the logs (if required) to match the Selected zone
l Extract Between: Extract data from a marker-defined, depth-defined or time-defined
range. You may also toggle on the option to extract the data in time.
l Selected zone: Set the extraction zone using either markers or the start and end of
data (or a combination thereof).
l Distance above/below: Extend, if desired, the extraction zone above and below the
selected zone.
l Algorithm: Choose between inverse distance or triangulation.
l Search radius: For inverse distance only - set an optional maximum search radius
for the algorithm.
After selecting the well(s), log(s), parameters and algorithm, provide a name for
this step at the bottom and proceed to the Volume Builder by pressing 'OK'.
5.3 Cross-Plot
The Cross-plot tool is designed to create two-dimensional cross-plots between
2D/3D seismic data (attributes) and either other attributes or well data. The data
can be analysed in multiple manners, using different kinds of colour coding and
data selection tools. It may be launched from the Analysis > Cross-plot menu.
5.3.1 Cross-Plot Data Extraction
The crossplot data must first be extracted, either on (a subset of) a horizon or
along (deviated) well paths. 2D or 3D attributes can be used, as well as well logs
if the extraction is done along the well paths. The extracted data will first be
presented in a table before the features to cross-plot are actually selected.
The extracted data can be saved in the cross-plot table window and reopened
without repeating the data extraction, from the menu Analysis > Cross-plot >
Open.
5.3.1.1 Well-based Data Extraction
This window presents the attributes and/or logs that can be extracted along a well
path. The output will be presented in a table before being used for the cross-plot.
At least one well must be selected, and one attribute or one log. It is also possible
to select only attributes, or only logs.
The well track and time-depth model provide the locations where the data is
extracted. Values will be vertically extracted along a specially built measured depth
axis. This axis is such that the step between two consecutive depth samples is
constant but with a few jumps, such that the Z difference (time or depth depending
on the survey type) between consecutive depths is around the survey default
sampling rate. Therefore, at shallow levels one seismic sample can correspond to
4 meters, then 8 meters at intermediate depths, 12, 16 and so on.
l Attribute values are vertically interpolated along that created MD axis, since they are
unlikely to be along the Z axis defined by the survey geometry. A polynomial inter-
polation is performed.
l Log values are extracted in the depth domain around the depth to be computed, plus
or minus half of the distance to the previous and next depths. All collected values are
then processed (up-scaled) using a provided "Log resampling method" (see below).
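The up-scaling choices can be sketched as follows (illustrative only; upscale is not an OpendTect function, and the log samples are invented):

```python
from statistics import mean, median
from collections import Counter

def upscale(values, method):
    """Up-scale the log samples collected around one depth, mirroring
    the 'Log resampling method' choices (a sketch)."""
    if method == "average":
        return mean(values)
    if method == "median":
        return median(values)
    if method == "most frequent":
        return Counter(values).most_common(1)[0][0]
    raise ValueError(method)

litho = [1, 1, 2, 1, 3]       # discrete lithology codes: most frequent fits best
sonic = [90.0, 95.0, 100.0]   # continuous log samples: average fits best
print(upscale(litho, "most frequent"))  # 1
print(upscale(sonic, "average"))        # 95.0
```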
3D Data extraction for Well vs. Attributes Cross-Plot
2D Data extraction for Well vs. Attributes Cross-Plot
Extract between: It is used to limit the z-range (depth or time) of the data to extract.
There are three options supported: Markers, Depth and Time.
If Depth is selected in the extract between field, the start/stop (m) fields will be
toggled on. In these fields, the starting and stopping depths are typed in to
restrict the data extraction to an interval.
Similarly, if Time is selected in the extract between field, the start/stop (ms) fields
will be toggled on.
Finally, the step-out for extracting both attribute and well data samples has to be
defined. This can be defined in meters (default), feet (if the survey is in feet) or
milliseconds. The Extract in time check box is usually toggled on if you want to
define the data extraction step-out in TWT. It is advisable to check this box when
extracting data for cross-plotting against seismic volumes.
l Distance above/below: It is used to modify the vertical range of the extraction win-
dow using a relative distance from the provided well markers, in depth. A negative
number will decrease the extraction window, a positive number will increase it.
l Log resampling method: Logs will be up-scaled using this method. 'Average' should
be used for most logs. Median, most frequent and nearest sample are better suited
for discrete logs like lithology, but can occasionally be used for other types as well.
l Radius around wells: All traces that can be reached within the search radius will be
extracted. If several traces around a well are found, the same extracted log value is
posted in front of the collected attributes values. This option will only duplicate all
data if no attributes are extracted. The default value is the survey bin size, use value
"0" to extract only the nearest trace, i.e. one value per well per depth.
l Filter positions: See the location filters section in the same chapter.
5.3.1.2 Attribute-based Extraction Window
The attribute-based data extraction window is used to extract attribute data (stored
volumes or a defined attribute) within a volume defined by a range, polygon, sur-
faces, a body, or a well path with lateral extension. The same window (shown below)
can also be used to extract attribute data along a time slice or along a surface.
Note that at least one attribute must be selected prior to data extraction. The
"Attributes" list shows all attributes currently loaded in the window, with the stored
volumes in brackets. For multiple attribute selection, hold the left mouse button and
drag it up/downward. For 2D data extraction, one or more attributes along with
their corresponding "Line names" (at least one) should be selected.
3D data extraction for Attributes vs. Attributes Cross-plot
2D data extraction for Attributes vs. Attributes Cross-plot
l Range: Selected to extract the data on a regular 3D volume or a grid (if it is a time
slice). The steps are the increments in the corresponding range. To extract dense
data points for a cross-plot, smaller steps should be used; larger steps will decrease
the amount of extracted data. Cubes coarser than the requested grid will not be
interpolated; undefined values will be posted to the cross-plot table instead. For
2D data extraction this is the only possible option and only the time range can be
specified.
l Polygon: The lateral extent for cross-plot data can also be defined by a polygon.
Once the Polygon option is chosen, the desired polygon is chosen by pressing the
Select button. The inline/crossline steps are the increments in the inline/crossline
ranges within the polygon. The time range is an additional constraint that defines a
vertical restriction window for a polygonal type of volumetric cross-plot.
l Table: Only positions listed in a table will be used for the extraction. The table might
be an OpendTect pointset, or a column-sorted ASCII file with inline, crossline and Z
values in the first three columns respectively.
l Surface: Used for data extraction along a 2D/3D horizon, or between horizons.
Please note that the attributes will be interpolated if extracted along a horizon. If
the extraction is done between two horizons (volume-based extraction using a user-
defined Z step), the attributes will not be interpolated. The "Extra Z" values increase
or decrease the extraction window size, and work similarly to the attribute set time
gates (relatively). The left value applies to the top horizon and the right value applies
to the base horizon.
l Body: It is used to restrict the data extraction within a selected 3D body. The radio
boxes inside/outside are used to extract the data either inside or outside the selected
body. If it is outside, the further ranges are sub-selected in the Within bounding box
field.
l Well: It is used to extract attribute data along the selected well paths. The data would
be extracted vertically and according to the (TWT/depth dependent) survey setup.
The time gate is defined by providing a time range with time steps (vertical sampling
rate).
l Location filters can be added in order to add one or several restrictions to the area of
extraction.
5.3.1.3 Location Filters
The filters should be used to further limit the amount of data to be extracted for mak-
ing cross-plots. Several filters can be used simultaneously.
The first two filters, Random and Subsample, are not position related. The Random
filter passes a certain percentage of random samples selected in the main extrac-
tion window, while the Subsample filter will pass a finite (user-defined) number of
samples. For instance, for the Random option, if the value is 1, only 1% of all
extracted data would be selected for cross-plotting.
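The two filters can be sketched as follows (the exact sampling scheme OpendTect uses is not documented here; the functions and the seed are illustrative assumptions):

```python
import random

def random_filter(rows, percentage, seed=0):
    """Pass a given percentage of rows at random (Random filter sketch).
    A fixed seed is used here only to make the sketch reproducible."""
    rng = random.Random(seed)
    k = max(1, round(len(rows) * percentage / 100.0))
    return rng.sample(rows, k)

def subsample_filter(rows, count):
    """Pass every n-th row so that about 'count' rows remain
    (Subsample filter sketch)."""
    step = max(1, len(rows) // count)
    return rows[::step][:count]

rows = list(range(1000))
print(len(random_filter(rows, 1)))     # 1% of 1000 rows -> 10
print(len(subsample_filter(rows, 50))) # 50
```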
The last four - Range, Polygon, Table and Surface (see previous section for defin-
itions) are position based filters. These are used to define sub-areas that will
complement the extraction settings provided in the main extraction window. Mul-
tiple filters can be chosen out of these. Thus, the points satisfying the main extrac-
tion settings and all defined filters will be used for the extraction of attributes.
5.3.2 Cross-Plot Table
The crossplot table displays the extracted data. It is used to edit and plot the data
for a cross-plot. A row in the table corresponds to one extracted data point, annot-
ated by its position (X, Y and Z) and followed by the collected attribute values
(forming a vector; from left to right: logs, attributes, stored data). A star adjacent
to an attribute name indicates a sorted column. Empty cells represent attributes
that could not be extracted with the provided settings for data extraction.
The table enables manipulation and editing of the collected data prior to making
cross-plots. For instance, it is possible to sort the data on an attribute and delete
the first or last rows before plotting the data. Please note that the table window is
interactively linked with the cross-plot window: any editing done in the cross-plot
window is reflected in the table window, which remains open and active while
working in the cross-plot window.
Standard workflow: save, edit, click in a column and assign it to an axis, click in
another column and assign it to the other axis, then launch the cross-plot window.
You can select a column by either clicking on its title cell or by clicking on any
single cell.
Saves the data shown in the table to a file (simple text file or OpendTect object).
The OpendTect object is a special format used to retrieve (open) the cross-plot. The
format is called position vector data and the data is saved in the survey sub-
directory (/Features/*.pvds). The Text file selection outputs the data to an ASCII
(column-sorted) file that can later be used in 3rd party software, e.g. Excel.
Removes the selected rows in the table.
Moves the selection of the primary Y-axis one column to the left. The cross-plot
gets updated accordingly.
Moves the selection of the primary Y-axis one column to the right. The cross-
plot gets updated accordingly.
Adds an empty column to the table. In the pop-up window, the column name is
provided and a mathematical operation is defined to compute the data for the new
column. An example is acoustic impedance computed from the velocity and
density logs available in the cross-plot table. For further information on the
mathematical operators, please see the description of the Mathematics attribute in
the Appendix.
Variogram analysis: set parameters (left), main window (right): blue = real data,
green = model.
A synthetic variogram can be set by changing the sill, the range and the variogram
model (exponential, spherical, Gaussian). The objective is to get a synthetic
variogram that best describes the real variogram. The data can be analyzed for
each well or for all of them. The analysis results can be used when performing
inversion.
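The three variogram models have standard closed forms; the sketch below uses the common "practical range" convention (the factor 3 in the exponent), which is an assumption since the manual does not give the formulas:

```python
import math

def variogram(h, model, sill, vrange):
    """Classical variogram models used when fitting a synthetic
    variogram (sketch; nugget effect omitted for brevity)."""
    if model == "exponential":
        return sill * (1.0 - math.exp(-3.0 * h / vrange))
    if model == "gaussian":
        return sill * (1.0 - math.exp(-3.0 * (h / vrange) ** 2))
    if model == "spherical":
        if h >= vrange:
            return sill   # the spherical model reaches the sill exactly
        r = h / vrange
        return sill * (1.5 * r - 0.5 * r ** 3)
    raise ValueError(model)

print(variogram(500.0, "spherical", 1.0, 500.0))  # sill reached: 1.0
```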
5.3.3 Cross-Plot Window
The cross-plot window shows the previously extracted data as a cross-plot. The
window may start empty if insufficient data was selected in the table window.
Gives access to the main crossplot window properties: scaling, statistics, regres-
sion line, density plot parameters.
Creates multivariate output from the cross-plot data.
Colour-codes the points with respect to the wells they were collected from.
Used to toggle on/off the second Y-axis (Y2) scattered points. Note that when the
second Y-axis (Y2) is selected to be cross-plotted against Y1 and X, the number of
data points may become too large to display. In this case, the system will prompt a
warning to display a given percentage of the data (% points displayed).
Inactive if a cross-plot is created with one Y-axis. It allows making a selection of
the scattered points. The selection settings (select only Y1/Y2, or both) are
important for removing unwanted points from the extracted data. When dual
Y-axes are cross-plotted, the user can select individual or both Y-axis points by
changing this option.
Used to display the selected scattered points in an active scene. The selection of
the data points is done using the selection mode. With this option, the selected
scattered data can be saved as a pointset/body: right-click on a point in a scene
and, from the pop-up menu, select the 'save as a pointset' or 'create body' option.
Used to de-select the selected data points (using the selection mode tools).
The unwanted data points can be removed using the selection mode and this
trash button. In order to remove the data points of Y1, Y2 or both, use the selection
mode tools to select an area within which the data is to be removed, then press
this button to remove the data.
Used to select data from a cross-plot and display the corresponding selected data
in the spreadsheet (Well/Attribute data window).
This option restricts a selection using mathematical logic over a range, according
to the range set in the Refine Selection window (a pop-up window that appears
when this button is pressed). For instance, a user may want to remove data (X0)
in the range 3-4 from a cross-plot in which values 1-2 overlap. To do that, press
this button to launch the Refine Selection window. In the Enter Ranges field an
equation can be set, e.g. X0 > 3 and X0 < 4, where X0 is the desired data in the
cross-plot. A selection can then be made within the cross-plot to remove the
values inside the polygon that satisfy the equation.
5.3.3.1 Cross-Plot Properties
The properties window (accessed via the properties icon) can be used to adjust
the scale, view statistics, add regression lines, etc.
Scaling Tab: Sets the clipping state for each axis, or the amplitude range for the
chart. Default: 0, which means that the window is adjusted to fit the entire amp-
litude distribution.
Statistics Tab: Shows the parameters of a least square fit between the attributes
used as X and Y1 (values and errors). The regression line can be displayed in the
cross-plot window, as well as the correlation coefficient.
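The quantities reported in this tab (least-squares fit and correlation coefficient) can be reproduced in a few lines. A sketch, not OpendTect code, for X and Y1 sample vectors:

```python
import numpy as np

# Sketch: least-squares line y1 = a*x + b and the Pearson correlation
# coefficient, the two quantities shown in the Statistics tab.
def regression_stats(x, y1):
    a, b = np.polyfit(x, y1, 1)
    r = np.corrcoef(x, y1)[0, 1]
    return a, b, r

x = np.array([0.0, 1.0, 2.0, 3.0])
y1 = np.array([1.0, 3.0, 5.0, 7.0])   # exactly y1 = 2*x + 1
a, b, r = regression_stats(x, y1)
```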
User Defined Tab: Sets a user-defined regression line that can also be displayed
in the cross-plot window. A simple line can also be drawn in the cross-plot window,
in which case the corresponding coefficients will be displayed in this tab.
Display Properties Tab: Sets user-defined marker size, marker shape and the Y-
axis color.
Density Plot Tab: This tab sets the minimum number of points for the automated
density plot; scatter plots are not allowed below that number. The tab settings
define the bin size used prior to counting the number of occurrences.
The following picture shows an example of a density plot. Please note that an
additional colorbar has appeared; its units are the number of points corresponding
to each color.
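The binning behind a density plot can be sketched with a 2D histogram; the bin counts here stand in for the tab's bin-size setting (illustrative, not the OpendTect implementation):

```python
import numpy as np

# Count occurrences per (x, y) bin; these counts are what the extra
# colorbar of a density plot encodes.
def density_grid(x, y, nx=2, ny=2):
    counts, _, _ = np.histogram2d(x, y, bins=(nx, ny))
    return counts

x = np.array([0.1, 0.2, 0.9, 0.9])
y = np.array([0.1, 0.2, 0.9, 0.8])
counts = density_grid(x, y)   # 2x2 grid of point counts
```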
5.3.3.2 Probability Density Functions
Probability Density Functions (PDFs) can be created from the cross-plot toolbar
icon. This icon launches a pop-up dialog for selecting attributes from which to
create PDFs. The PDFs are stored in OpendTect format and can later be used
for running Bayesian classifications.
Please note that all attributes from the table can be selected. Attribute ranges are
generated automatically to fit the extracted data distribution, and can be edited
before creating the PDF.
5.3.3.3 Overlay from a Third Attribute
Scattered points can be color-coded with respect to the amplitudes of a third
attribute using this option. The pop-up window requires the selection of that third
attribute and the colorbar specifications (type and amplitude range).
5.3.3.4 Well-based Color Coding
The scattered points in the cross-plot window can be color-coded with respect to
the wells along which the data points were originally extracted. The following
utility window can be used to control the colour associated with each well:
5.3.3.5 Selection Settings
The selections made interactively in the cross-plot window can be further refined
and managed in this window.
The 'Refine' option ( ) utilizes mathematical logic to restrict the selection
according to the range set in the Refine Selection window (a pop-up window
invoked when this button is pressed). For instance, a user may want to remove
data (X0) in the range 3-4 from a cross-plot in which values 1-2 overlap. To do
that, press this button to launch the Refine Selection window. In the Enter
Ranges field an equation can be set, e.g. X0 > 3 and X0 < 4, where X0 is the
desired data in the cross-plot. A selection can then be made within the cross-plot
to remove the values inside the polygon that satisfy the equation.
The 'Manage Selection' option ( ) can be used to make multiple selections by
adding new groups (see below). It is launched using the manage selection button.
Multiple group selection allows you to select different clusters/trends on a
crossplot in the form of groups. Second and subsequent selections are made by
holding down the CTRL key and clicking on a group name; the corresponding
polygon (with a given colour) is then drawn over the crossplot display area. It is a
very useful tool for reservoir prediction and characterization.
The selected scattered points can then be displayed in the active scene by clicking
OK. This allows an interactive display of the cross-plot in a scene. The displayed
points (i.e. picks) can be saved in the OpendTect survey either as a pointset
or as a body. Right-clicking on any point in the display will launch a pop-up
menu (see below).
In the column name field, a new name should be given; it will be added to the
cross-plot data table. 'Show selectedness as an overlay', if checked, displays the
colour-coded selectedness (ranging between 0 and 1) as an overlay in the
crossplot area; the colours follow the chosen colortable. 'Show selectedness in
3D scene' displays the points within the selectedness range in the 3D scene.
The user can manage the groups (or selections) by saving them (Save groups)
and by re-opening previously saved groups (Open groups).
The Selectedness is a special data selection: a measure of how likely a point is
to belong to a particular selection. Points in and around the center of a selection
have higher selectedness values, whereas points in the border regions are less
likely to belong to that selection and thus have lower selectedness values. It is a
measure of which points best represent a particular selection. The value of
Selectedness ranges from 0 to 1; points outside the selection have undefined
values. It is added as a separate attribute in the table and can be seen in the
form of an overlay attribute. To mark selectedness, one group needs to be
selected using the selection mode.
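One way to picture selectedness is as a distance-based membership value. The formula below is purely illustrative (OpendTect does not document its exact computation), but it reproduces the behaviour described above: 1 near the centre, lower near the border, undefined outside:

```python
import numpy as np

# Hypothetical selectedness: linear decay with distance from the
# selection centre, clipped to 0-1; points outside become undefined.
def selectedness(points, centre, radius):
    d = np.linalg.norm(points - centre, axis=1)
    s = np.clip(1.0 - d / radius, 0.0, 1.0)
    s[d > radius] = np.nan    # outside the selection: undefined
    return s

pts = np.array([[0.0, 0.0], [0.5, 0.0], [2.0, 0.0]])
s = selectedness(pts, np.array([0.0, 0.0]), 1.0)
```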
5.3.4 Open Crossplot
This option allows you to open previously saved or imported crossplot data and
will directly open the data in the crossplot table.
5.4 Wells
The Wells element in the Analysis menu gives you access to three features: Edit
Logs..., Tie Well to Seismic... and Rock Physics...
5.4.1 Well Log Tools
The well log tools can be used to remove spikes, and to smooth and clip the
logs. They can also be accessed using the icon in the Manage Wells window.
Multiple wells can be selected at once, along with the various logs. The logs can
be extracted between:
Depth range: The logs can be directly extracted between a particular depth range:
Time range: The logs can be extracted between a time window. The extraction
may be done in the time domain by toggling on 'Extract in time'.
On pressing Go, the extracted logs are displayed, and 'smoothing', 'clipping' and
'spike removal' can be performed on these well logs:
Smoothing: A window size (in samples) should be defined, within which the
smoothing of the well log data will be performed.
Remove spikes: De-spiking of the logs can be done by specifying a window size
(samples) and the Threshold for the Grubbs algorithm. Further, the removed spike
values can be replaced by 'Undefined values', 'Interpolated values' or can be manu-
ally specified.
Finally, the edited logs can be saved with an extension or can be overwritten.
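The two edits can be sketched as follows. The despike test here is a simplified Grubbs-style check (deviation from the windowed mean in standard deviations), not OpendTect's exact implementation:

```python
import numpy as np

def smooth(log, size=3):
    """Running-mean smoothing over a window of 'size' samples."""
    kernel = np.ones(size) / size
    return np.convolve(log, kernel, mode="same")

def despike(log, size=5, threshold=3.0):
    """Flag samples deviating from their window neighbours and replace
    them with NaN (the 'Undefined values' replacement option)."""
    out = log.copy()
    half = size // 2
    for i in range(len(log)):
        window = log[max(0, i - half): i + half + 1]
        neighbours = np.delete(window, min(i, half))  # exclude the sample itself
        mu, sd = neighbours.mean(), neighbours.std()
        deviates = (sd == 0 and log[i] != mu) or \
                   (sd > 0 and abs(log[i] - mu) / sd > threshold)
        if deviates:
            out[i] = np.nan
    return out
```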
5.4.2 Tie Well to Seismic
Launch the Well-Seismic Tie window from the main menu or, optionally, the Well-
Seismic Tie window can be launched from the tree.
Tying a well to the seismic is a major task in interpretation projects. It is used to
correlate the well information (logs) with the 3D seismic volume or 2D seismic
lines. This enables the comparison (crossplots, ...) of well-based and 2D/3D
seismic data.
1. Data preparation
l Import the seismic volume or 2D line.
l Extract a wavelet.
l Import the wells: each well requires a track, a checkshot or time-depth curve, and a
sonic log.
l Import the additional data: 3D/2D Horizons, well markers, additional time-depth
well.
l The input fields must be selected.
l Based on the available data the density and sonic logs will be combined into
validated, before being converted into a new time-depth function that replaces the
previous one. No changes are being applied to the well logs.
l At each step of the tie, a deterministic wavelet can be estimated using the time-converted
reflectivity log and the composite seismic trace. This deterministic wavelet
may vary per well, and is known to link the well data to the seismic data more
reliably.
5.4.2.1 Well-Tie Selection Window
The Tie Well to Seismic window is used to select the necessary data for the
well-seismic tie workflow. Please have a look at the introduction to see how to
prepare the necessary data.
The well tie can be used to tie a well to a 3D seismic volume or a 2D seismic line.
Well to 2D seismic tie window
In both the 2D and 3D windows, there are additional features accessed via the fol-
lowing icons:
5.4.2.2 Well-Tie Checkshot Editor
In OpendTect, checkshot corrections are applied before launching the Well to
Seismic Tie window. If you have no depth/time model or have not selected an
existing one, you will be prompted to correct the sonic integrated depth/time
model, provided you imported a checkshot model for your well.
The choices given are "None", "Automatic" or "Use editor". In the first case, the
time depth curve will be computed directly from the sonic log without any correction
(note, this is also the default mode if you do not have any CheckShot). In the auto-
matic mode, the time/depth curve will be calibrated to the CheckShot without any
user interaction. In the last case, you will be allowed to edit the calibration yourself
using an editor window.
When clicking on 'Run', the 'Apply Checkshots correction' window pops up.
The above window is divided into two panels. In the left panel, the sonic
'Integrated Depth/Time Model' (red) and the 'Checkshot Curve' (blue) are plotted.
The right panel displays the drift curves. The original 'Drift Curve' (red) shows
the variations between the checkshot and the sonic integrated model.
By adding points to the right display you can additionally generate a new 'User
defined Drift Curve' (green). This is done by clicking the icon. Once this is done,
select the correction to apply, either from the Original or from the User Drift Curve
and push the Apply button. The newly computed depth/time model will appear in
green on the left panel. You can modify the drift curve and re-apply the corrections
until you are satisfied with the depth/time model. Push the OK button and the main
well tie window will appear using the new calibrated depth/time model.
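The relationship between the integrated sonic, the checkshot and the drift curve can be sketched numerically. Units and names below are illustrative, not OpendTect code:

```python
import numpy as np

# Integrate sonic slowness to a two-way time-depth curve, then take the
# difference with the checkshot times: that difference is the drift curve.
def integrated_td(depths, slowness):
    dz = np.diff(depths, prepend=depths[0])
    return 2.0 * np.cumsum(slowness * dz)      # two-way time (s)

depths = np.array([0.0, 100.0, 200.0])         # m
slowness = np.array([0.0005, 0.0005, 0.0005])  # s/m, i.e. 2000 m/s
twt = integrated_td(depths, slowness)
checkshot = np.array([0.0, 0.11, 0.21])        # s TWT
drift = checkshot - twt                        # the drift curve
```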
5.4.2.3 Well-Tie Display Panel
The display panel is the main window where the wells are tied to the seismic data.
This module is primarily used to update the current (loaded) time-depth curve.
Previous, intermediate and final TD curves can be exported to ASCII at any
moment using the following icon .
The raw logs are shown before upscaling. The vertical axis of all 3 frames is in
travel time.
Key points:
l The time-depth conversion and the computation of synthetic seismic traces are done
using the current time-depth curve and checkshot (if available).
l The checkshot data acts as a strong constraint, i.e. any input and output time-depth
curve will be forced to honour the checkshot.
l The time of the depth TVDSS=0 will always remain at TWT=0, even when applying a
bulk shift: the difference is absorbed between the point TVDSS=0 and the first sonic
log sample. The reference datum elevation definitions are summarized in the well
track import chapter.
Display panel for the well-seismic tie.
At the bottom right corner of this display panel, there are several tracking controls.
The options are used to pick an event to match the seismic and synthetic traces.
After picking the event, press Apply Changes to reflect the changes and update the
time to depth model.
Launches the Edit Time/Depth model window. In the pop-up window, press
Export button to export the time-depth model as an ASCII file.
Or save the synthetic trace as a seismic volume.
Takes a snapshot of the display panel.
Display settings/properties for the panel. The settings are similar to those of the
normal 2D viewer.
Zoom-in
Zoom-out
Pick seeds on the seismic or synthetic to update the time to depth model.
Change in depth/time model
l Choose a tracking mode (e.g. maxima, minima, zero-crossings etc.) and select events
in the synthetics/seismic displays by first selecting the Pick mode button. Events
cannot be picked separately: each event in the synthetics must be linked with another
event in the seismic.
l Once all the events are selected on both synthetics/seismic displays, press Apply
changes button. The depth/time model and the whole computational workflow will be
recomputed. If needed, repeat the operation.
l The Display additional information button will open the Cross-checking parameters
window, and provide useful cross-checking tools, such as correlation coefficient and
estimated wavelet in a specific depth range. The estimated wavelet displayed here
can also be saved:
l The depth/time table can be saved between each state by pushing the Save button, in
the toolbar to the right of the synthetics/seismic displays, and saving with an appro-
priate name. The View/Edit Model button allows the user to import a depth/time table.
l Once a good correlation has been established, click on Ok/Save and save the depth/-
time model.
5.4.2.4 Well-Tie Crosscheck Parameters
The cross-checking window is launched from the well ties display panel (Section-
Well Tie: Display Panel) by clicking on the Display additional information button.
The window contains the initial and estimated wavelet information. The wavelet
can be computed between the two levels (start-end of data in time/depth or mark-
ers) that are provided at the top of this window. The window contains further key
information: wavelets plot and the correlation coefficient. By changing the compute
data between option, the correlation coefficient is auto-updated, this is done by
using either Markers (Default) or Time/Depth. After achieving a high and positive
correlation coefficient, the estimated wavelet can be saved. Importantly, the neg-
ative correlation coefficient shows that the polarity of the estimated wavelet is
reversed. To avoid that the reference/initial wavelet's polarity has to be reversed.
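The sign convention can be sketched with a normalised zero-lag correlation between the synthetic and seismic traces (illustrative only, not OpendTect's exact computation):

```python
import numpy as np

# Normalised zero-lag correlation; a value near -1 suggests the wavelet
# polarity is reversed, a value near +1 a well-behaved tie.
def correlation(synthetic, seismic):
    s = (synthetic - synthetic.mean()) / synthetic.std()
    t = (seismic - seismic.mean()) / seismic.std()
    return float(np.mean(s * t))

syn = np.array([0.0, 1.0, 0.0, -1.0])
r_good = correlation(syn, syn)    # identical traces
r_flip = correlation(syn, -syn)   # polarity-reversed trace
```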
To save the estimated wavelet, press the icon in the well tie main window.
The wavelet properties ( ) can be shown as a graphical display of the wavelet,
its amplitude spectrum and phase spectrum.
Save the estimated wavelet
The user can also click on to open a slider interface for shifting the phase of the
wavelet:
Rotate Wavelet: Using the slider, the user can edit the phase of a wavelet.
Taper Wavelet: This option enables the user to taper a wavelet by clicking on .
The user will see the changes in the amplitude spectrum in real time.
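A constant phase rotation can be sketched with the analytic signal. The FFT-based Hilbert construction below is a self-contained illustration, not OpendTect code:

```python
import numpy as np

def rotate_phase(wavelet, degrees):
    """Constant phase rotation via the analytic signal (FFT-based
    Hilbert transform, kept self-contained)."""
    n = len(wavelet)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    analytic = np.fft.ifft(np.fft.fft(wavelet) * h)
    phi = np.deg2rad(degrees)
    return analytic.real * np.cos(phi) - analytic.imag * np.sin(phi)
```

For a pure cosine wavelet, a 90-degree rotation yields the corresponding negative sine, as expected from the analytic-signal definition.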
5.5 Layer Modeling
Pseudo-wells are stratigraphic columns with attached well logs, but without a
geographical location. Any pseudo-well can be seen as a possible realization of
a newly drilled well in the area. The pseudo-wells are generated following a
model that has to be defined. Before layer modeling, an extended well data
analysis has to be carried out: the stratigraphy must be defined, and the behavior
of the well logs has to be known in order to be used in the modeling. During the
modeling process, the stratigraphy description is fixed and cannot be edited.
The Layer Modeling is accessible from the Analysis menu and also from the
Manage Stratigraphy window.
Basic modeling can be achieved in OpendTect. More advanced modeling is
available in the SynthRock plugin.
1. Model definition: using the stratigraphy description, properties are assigned to the different
lithologies within each unit. These properties are fixed or can vary. The model
definition is used to generate the pseudo-wells.
2. Synthetic and log generation: the pseudo-wells are generated and their associated
properties can be displayed. With a wavelet extracted from the real seismic, zero-offset
synthetics are generated. Using a ray tracer, synthetics can be computed for different
offsets, and restricted angle stacks can be created. Their behavior with varying
offset can thus be analyzed.
3. Pseudo-well data analysis: the properties from modeled logs and synthetic seismic
can be compared and analyzed layer by layer, lithology by lithology.
5.5.1 Basic
Each layer in the stratigraphy column is characterized by different rock properties.
The model, based on the stratigraphy, assigns properties to each lithology,
layer by layer. The model is built using a blocky approach. The different properties
are selected from a list; their values can either be constant or vary within a given
range.
5.5.1.1 Layer Description
First of all, the Layer Succession has to be defined; it will be used to create the
pseudo-wells.
To start defining layer properties, the user has to click on "click to add" in the left
rectangle. Once one layer has been defined, click OK and the layer appears in
the left rectangle. To add a layer, right-click on the rectangle and select "Add
above/below": you can then define a new layer. "Edit Layer" is also accessible
from this right-click menu. The description can be saved by clicking on the icon
and later retrieved by clicking on the icon.
1. First of all, the properties to be defined for the modeling have to be selected from a
list. The properties in the list have been defined in the Layer Properties Manager,
which is accessible from the icon and can be edited.
The selection can always be edited by clicking on the icon . To be able to generate
synthetics, Density and P-wave velocity are selected by default. For the
moment, it is not possible to combine properties; for example, to get the
Acoustic Impedance you have to model the Acoustic Impedance log.
2. The layers have been defined in the Stratigraphy Manager. Each lithology of each
layer is assigned properties, together with whether, within the survey, each property
is expected to remain constant or to vary within a given range. The thickness of each
layer can also stay constant or vary; the variation is linear.
The thickness is a default property. When defining a thickness range, the starting
thickness can be set to a negative value: it will appear as a truncation in the
pseudo-wells.
The fluid content can also be specified. It must have been specified previously in
the Content Manager.
When clicking OK, the Layer Description will appear in the left rectangle. To
edit the properties of one or more layers, just click on them.
5.5.1.2 Synthetic- and Property-Log Generation
The property logs of the pseudo-wells only need the Layer Description to be cre-
ated. To generate synthetic traces for each pseudo-well, three elements have to be
provided:
l Layer properties: The density, P-wave and S-wave have to be given, computed or
Once the Layers are defined, the parameters to generate the pseudo-wells must be
defined (red rectangle)
The pseudo-wells are generated when clicking on Go, in the lower left side of the
window. The number of pseudo-wells to be generated is user-defined.
5.5.1.2.1 Synthetic Layer Properties
The synthetic seismic generation requires different quantities: density, P-wave
velocity and S-wave velocity. These quantities can be specified by clicking on the
icon : they can be computed using formulas and the appropriate modeled
quantities. If a quantity has been modeled, it can be used as a Defined quantity.
5.5.1.2.2 Wavelet
The wavelet can be selected from the ones already available in the project, or a
new one can be created in the Wavelet Manager, accessible from the icon .
Some workflows need the synthetic to have the same amplitude as the real
seismic. The purpose of the scaler is to scale the wavelet by comparing the
synthetic seismic computed at a given horizon with the real seismic extracted in
a defined time window around this same horizon. To do so, click on Scale.
First of all you need to select your reference seismic as Input Cube, then the
reference horizon for the extraction of the real seismic data. The reference for the
extraction in the synthetics is the reference stratigraphic level selected in the main
window. The extraction must be done at a level interpreted both in the pseudo-wells
and in the real seismic. It is possible to restrict the extraction to an area defined by
a polygon. Note that the reference level in the pseudo-wells does not necessarily
correspond to a specific event in all the wells, whereas horizons are most often
interpreted following a same event; it is therefore possible to snap the synthetics to
a specific event. Finally, the extraction window around this reference level has to
be specified. It will depend on the thickness of the interval of interest in your data.
Once all these parameters have been given, you can extract values.
The histograms for the synthetics and the real seismic are displayed side by
side so they can easily be compared. The same point is identified in the two
cases and the difference between the two amplitude values is used to determine
the scaling factor. The scaled wavelet can then be saved and used afterwards.
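The manual describes comparing the two amplitude values directly; a common variant derives one factor from RMS amplitudes of the two extractions. A hedged sketch of that variant (not OpendTect's exact scaler):

```python
import numpy as np

# One scalar factor bringing synthetic amplitudes to real-seismic level,
# derived from the RMS of the extracted amplitude samples.
def wavelet_scale_factor(real_amps, synth_amps):
    return np.sqrt(np.mean(real_amps ** 2) / np.mean(synth_amps ** 2))

real = np.array([2.0, -2.0, 2.0, -2.0])
synth = np.array([1.0, -1.0, 1.0, -1.0])
factor = wavelet_scale_factor(real, synth)   # multiply the wavelet by this
```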
5.5.1.2.3 Ray Tracing
The ray tracer, available via the icon, allows the creation of synthetics for dif-
ferent offsets and to perform different angle stacks. The source/receiver depths
have to be provided. The offset range has also to be specified. The arrival times
are calculated by ray-tracing through a horizontally layered isotropic earth model.
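For a horizontally layered isotropic model, the zero-offset times and a small-offset approximation can be sketched as follows. This is a hedged illustration (hyperbolic moveout with RMS velocities), not OpendTect's actual ray tracer:

```python
import numpy as np

# Zero-offset two-way times per interface, plus the small-offset
# approximation t(x)^2 = t0^2 + x^2 / Vrms^2 for non-zero offsets.
def layered_times(thicknesses, velocities, offsets):
    dt = 2.0 * thicknesses / velocities        # two-way time per layer
    t0 = np.cumsum(dt)                         # zero-offset TWT per interface
    vrms = np.sqrt(np.cumsum(velocities ** 2 * dt) / t0)
    x = np.asarray(offsets)[:, None]
    return np.sqrt(t0 ** 2 + (x / vrms) ** 2)  # times[offset, interface]

thick = np.array([1000.0, 1000.0])             # m
vel = np.array([2000.0, 2500.0])               # m/s
times = layered_times(thick, vel, [0.0, 1000.0])
```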
Ray Tracing parameters in the simple and advanced mode
In the advanced mode, the surface coefficient, if known, can also be defined, as
well as the spreading geometry.
When pressing Go, the synthetics for the different offsets are computed. The view
is set to Free view by default; you can then display a single offset, or a limited
offset stack by ticking the Stack option.
When one offset is displayed, it is possible to step through the offsets with a given
Step using the arrows .
From the icon , it is possible to display the gathers for the different models.
5.5.1.2.4 Display
There are several display options within the Layer Modelling feature:
By default, the property logs are displayed per block. When the icon is toggled
on, the representation is one color per lithology. The property displayed is
selected in the selection menu in the lower part of the window.
When the icon is on, zooming on the synthetics will not affect the property
logs view. The icon is on by default.
The stratigraphic level is a marker. The marker position has been modeled, so
its position within the pseudo-well can be displayed. In the real wells, markers
come from the log data and geological information, and do not necessarily
correspond to a given seismic event. On the synthetics from the pseudo-wells, it is
possible to snap a selected marker to a seismic event (peak, trough, zero-crossing...).
This has to be done carefully as some information can be lost: there may be lateral
variations of the rock properties that impact the phase of the seismic.
The "Wiggle Variable Area" section concerns the display of the synthetic log itself.
The "Variable Density" section concerns the background, i.e. the interpolation
between the synthetic traces.
5.5.1.2.5 Layer Properties
Once the simulation has been run, the pseudo-wells will have been generated and
their properties are displayed in the lower section. The synthetics are also
generated and displayed in the upper half of the window. When clicking on a
given pseudo-well, a line appears to show the selected pseudo-well, and right-clicking
on a particular layer of this selected pseudo-well gives a menu with various
options.
For this selected pseudo-well, the Properties... option gives access to the
characteristics of this specific layer in terms of thickness and modeled properties.
In the layer-based modeling (basic or stochastic), these values can be manually
modified. The fluid content can also be edited. Changes are saved when clicking
OK, and the display is automatically updated.
In the SynthRock plugin, in the Profile mode, a similar window is available by
right-clicking on any trace in the lower rectangle, where a selected property is
displayed for the different pseudo-wells. Selecting Inspect values opens the
window. In this case it is an informative window: the different property values
cannot be changed. The fluid content, however, can be edited. The lithology in
this case is unknown, as it is based solely on well log(s).
5.5.1.2.6 Remove Layer
The existing layers of a model can be removed at any time by right-clicking on
the left-hand pane containing the simulation information.
Thereafter, the regeneration of the pseudo-wells can be done by clicking on .
5.5.1.2.7 Export Synthetic Datasets
The synthetic seismic data (both poststack and prestack), the layer property syn-
thetics in Time (e.g. AI, Density etc.) and the stratigraphic levels/markers, from all
modeling modules (i.e. Basic, Profile and Stochastic) can be exported along 2D
lines. The stratigraphic levels/markers in the modeled pseudo-wells are essentially
exported as 2D horizons. This is achieved by clicking on the icon at the top
right of the modeling window (see below).
Export of the synthetics can be done onto an already existing 2D line or a new line
created on the fly.
If the 2D line is created on the fly, the Geometry for line has to be defined as well. It
can be done by defining a straight line between two X-Y coordinate pairs.
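Defining a straight line between two X-Y coordinate pairs amounts to linear interpolation of trace positions. A minimal sketch (names and coordinates are illustrative, not an OpendTect API):

```python
import numpy as np

# Evenly spaced trace positions along a straight line between two
# X-Y coordinate pairs.
def line_geometry(start, stop, n_traces):
    frac = np.linspace(0.0, 1.0, n_traces)[:, None]
    return np.asarray(start) + frac * (np.asarray(stop) - np.asarray(start))

coords = line_geometry((1000.0, 2000.0), (1400.0, 2000.0), 5)
```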
Creating a straight 2D line between two X-Y coordinate pairs
The 2D line can also be created, on the fly, along an existing polygon.
Finally, the 2D line can also be created along an existing random line.
Now, a selection on poststack data, 2D horizons and prestack data can be made
for exporting along the 2D line.
For poststack data, the user can select synthetic seismic and various layer property
synthetics (e.g. Acoustic Impedance, Density etc.).
Similarly for 2D horizons, various levels present in the pseudo-wells can be selec-
ted.
2D horizon data selection for export
Optionally, a prefix and/or postfix can be specified for various data items. Pressing
Ok will export the selected data items along the 2D line.
5.5.1.3 Cross-Plots
The cross-plot tool in the Basic layer modeling can be started from the icon . It
allows analyzing seismic and layer attributes from the modeled data. In the main
window, the user selects the attributes to be extracted and the extraction
parameters. The extraction window is related to a reference level and its length is
user-defined. The appropriate extraction window size has to be defined with
regard to the interval of interest. The user has to provide a step that corresponds
to the sample rate within the extraction window.
Once the attributes and the extraction parameters are defined, the crossplot
window opens; it is similar to the one available for the classic seismic/well analysis.
5.5.1.3.1 Wavelet Scaling
If the wavelet has not been scaled to the real seismic, a pop-up window will first
appear before the attribute selection window can be accessed.
Once the wavelet is scaled or marked as scaled, this is remembered and the
window will not appear again.
5.5.1.3.2 Seismic Attributes
The seismic attribute selection/definition is comparable to the main attribute
window: the same attributes are available. An attribute can be selected in the list
of Available types and added to the Defined attributes using the button. The
parameters of the attribute have to be specified. Synthetic seismic generated from
the models can be used as input data. Note that not all the listed attributes are
necessarily meaningful on synthetic seismic.
5.5.1.3.3 Layer Attributes
For each modeled property, data can be extracted either along the log using a
defined extraction window or by layers:
Sliding: the property value is extracted within the extraction window as a
calculation window slides along the well. The size of the calculation window is
defined by the step provided by the user in the first crossplot window. The output
is the nearest sample, the average, the median, the RMS or the most frequent
value. In the end the attribute has [number of pseudo-wells * round-up(extraction
window size / step)] samples.
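The sample-count bookkeeping quoted above can be written out directly (illustrative; the numbers below are arbitrary):

```python
import math

# [number of pseudo-wells * round-up(extraction window size / step)]
def sliding_sample_count(n_wells, window_size, step):
    return n_wells * math.ceil(window_size / step)

n = sliding_sample_count(50, 100.0, 8.0)   # 50 pseudo-wells, 100 ms window, 8 ms step
```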
In both cases, it is possible to transform the attribute value by applying a power,
log or exponential function. The attribute is extracted on each pseudo-well.
6 Processing
6.1.1 Attributes
In this module, any (attribute) volume can be calculated and saved to disk. In the
case of 2D attributes, the output is a new data set. The volume output module can
be run in batch mode, allowing the user to continue working in the main window
while the processing is running.
This module creates, for example, attribute cubes, neural network cubes, or filtered
data cubes.
6.1.1.1 2D - Create Output
A data set attribute can be created through the following procedure: Processing >
Create Seismic Output > Attribute > 2D.
First, select the output quantity: it can be either a stored 2D volume or an attribute
from the active 2D attribute set.
Note that only attributes from the current attribute set can be selected in the "Select
quantity to output" window. To output an attribute from another attribute set, you
must select this attribute set in the attribute module.
Though the default is set to 'All Lines', a selection of lines can be specified by click-
ing 'Select':
Optionally, the output can be scaled with a Shift and a Factor:
6.1.1.2 3D - Create Output
To create a 3D seismic output from an attribute, follow the path Processing >
Create Seismic Output > Attribute > 3D.
Note that only attributes from the current attribute set can be selected in the 'Select
output quantity' window. To output an attribute from another attribute set, you must
select this attribute set in the attribute module.
The job priority can also be set here (if deemed necessary). This is equivalent to
the 'nice level' on Linux, determining how much priority the process will take on
the remote machine:
Storage: OpendTect can store data internally in 8-, 16-, 32-, and 64-bit seismic
data formats. 8-bit signed has a data range between -127 and +127. 8-bit unsigned
ranges between 0 and 255. Similarly, 16-bit signed ranges between -32767 and
+32767 (unsigned 0 - 65535). The data is stored in the same byte-format as the
input by default (Storage is set to 0 - auto). This is chosen when specifying the
Format/Scaling.
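The effect of choosing a smaller store can be sketched by scaling float amplitudes into the -127..127 signed 8-bit range quoted above (illustrative only, not OpendTect's exact scaler):

```python
import numpy as np

# Scale float samples into the signed 8-bit range -127..127 and keep
# the factor needed to restore approximate amplitudes on read-back.
def to_int8(samples):
    peak = np.max(np.abs(samples))
    scaled = np.round(samples / peak * 127.0)
    return scaled.astype(np.int8), peak / 127.0

data, factor = to_int8(np.array([0.0, 0.5, -1.0]))
```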
Adjust Z range to survey range: this option adjusts the new scaled seismic to the
survey range.
Optimize horizontal slice access: For better performance when loading time
slices, set this option to 'Yes'. This compromises some speed in loading
crosslines, but it loads time slices significantly faster.
6.1.1.3 Multi Attribute
Multi-attribute output enables the user to create a volume with several attributes.
'Available Attributes' lists all possible components of the multi-attribute output,
and any combination can be selected. Once moved across into the 'Selected
Attributes' list, these attributes are processed into a single volume.
It is not necessary to check the boxes next to the attributes in the Selected
Attributes list in order for them to be included in the output volume: all attributes
in the 'Selected' list are included by default.
Also note that a Spectral Decomposition attribute will show a list of all possible out-
puts in the 'Available attributes' field:
On loading the volume onto an inline (for example), the user is prompted to select which of the available attributes to display:
The component of the multi-attribute cube that is actually being displayed in the
scene is appended on the volume name in the tree:
6.1.1.4 Multi-Cube Data Store
Multi-component cubes can be created for some attributes (e.g. Spectral Decomposition, Steering attributes).
Create the volume output: Go to Processing > Create Seismic Output > Attribute >
Multi-cube data store and select a volume that contains multiple components (here,
Spectral Decomposition as example):
Select a component of the cube and assign it a 'pseudo-offset' value. Repeat this
process of pseudo-offset assignment for all of the components that you wish to be
present in the output, name it and press 'Go'.
Multi-component cubes can be exported as SEG-Y or simple ASCII files, but only one component per output. The user chooses which component to export during the export process.
How to display the Multi Component Volumes?
Once several components are selected and displayed for an element (e.g. an
inline), place the mouse in the scene, and use the keyboard's PAGE-UP/PAGE-
DOWN keys to view the next/previous slice in real-time.
6.1.1.5 Along Horizon
To create a seismic output in a time interval relative to a single horizon, the quant-
ity to output has first to be selected from the list of stored data or attributes from the
current attribute set. Specify the horizon and the Z interval relative to this horizon. A
sub-area can be specified.
The Value outside the computed area is the undefined value. The standard
undefined value in OpendTect is 1e30, but any other value can be specified.
Optionally, the horizons can be interpolated. The interpolation can be full or partial.
6.1.1.6 Between Horizons
To create a seismic output between two horizons, first the quantity to output has to
be selected from the list of stored data or attributes. Specify the horizons that form
the upper and lower boundaries of the output volume. A specified Z shift can be applied to the upper boundary and/or to the lower boundary. A sub-area can be specified.
The Value outside the computed area is the undefined value. The standard
undefined value in OpendTect is 1e30 or Undef, but any value can be specified.
In Extra options, horizons can be interpolated. The interpolation can be full or partial. When partial, only the gaps smaller than a user-defined number of traces are filled. A fixed Z interval length can be added to the leading horizon when the second surface is missing or in case of conflict during the interpolation. To constrain the interpolation, the Z limits for the output cube can also be specified.
6.1.2 2D to 3D Conversions
There are several ways to convert data between 2D and 3D:
6.1.2.1 Create 2D Grid
The Create 2D from 3D option is an interactive tool for creating a 2D lattice from a 3D volume.
This option can be used to create a 2D grid with a fixed grid spacing. When selec-
ted, the Create 2D Grid window is launched (see below). Here, specify the input
3D seismic volume and the output data set name. The output grid is generated
according to the dip (parallel) and strike (perpendicular) direction of the selected
volume. The prefix labels are used as prefixes to the output line names, stored to
the specified new data set name. The grid spacing is the constant spacing
between the two lines. At the bottom, the total number of parallel and perpendicular
lines will be updated according to the grid spacing. By pressing Ok, a batch pro-
cess will start to generate the 2D grid. When the batch program is finished, the lines and data can be displayed in the scene.
Another option is to define the 2D grid geometry based on a Random Line. The
grid can be created both parallel and perpendicular to the direction of the random line. Set the fixed spacing in the line spacing (m) fields.
6.1.2.2 Extract 2D from 3D
Extract 3D data onto selected 2D lines. Input data is required in the form of a stored
3D volume. One or more 2D lines can be selected for the 3D data to be extracted
onto. The output data set requires naming:
If just one line is selected, you may also sub-select a trace range.
6.1.3 Angle Mute Function
This module creates angle-based Z-Offset functions from velocity volumes. The
primary input is a velocity model that provides the time-depth relation.
If you have no velocity volume available, click 'Create'. This will bring up a window
in which you can select a volume and tag it with a velocity type:
Vertical 1D ray-tracing is performed assuming a fully flat, isotropic earth model.
The travel-time 0ms corresponds to the depth of the Seismic Reference Datum,
defined in the survey definition window.
The offset range must be provided since the angle mute computer is not aware of
any prestack datastore. It does not necessarily need to match the prestack data.
The output function will have one point at the start and stop of the Z range, and one
point at each offset specified by the offset range parameters.
The main output parameter is the incidence angle in degrees at which the mute function must be computed. By default the functions will be computed on a relaxed grid, at every 20th inline/crossline. This can be changed by selecting other "Volume subselection" parameters. In general it is not necessary to decrease that stepout.
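The straight-ray geometry behind such Z-Offset functions can be sketched as follows. This is a hypothetical illustration, not OpendTect's actual ray tracer: for a flat, isotropic model, a mute angle theta at reflector depth z is reached at offset x = 2·z·tan(theta):

```python
import math

def simple_mute_offset(angle_deg, depth):
    """Straight-ray sketch: offset at which the given incidence
    angle (degrees) is reached for a reflector at 'depth' (m)."""
    return 2.0 * depth * math.tan(math.radians(angle_deg))
```

For example, a 45-degree mute at 1000 m depth corresponds to a 2000 m offset; larger angles map to larger offsets, which is why the function needs one point per offset in the specified range.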
6.1.3.1 Advanced Angle Mute Parameters
The ray-tracing can be performed in two ways:
Simple: The ray is going directly from the source to the depth of the target layer,
and up to the receiver in the same way. This does not account for ray bending, or
velocity inversions.
Advanced: Will honour the ray bending according to Snell's law and thus velocity inversions as well. To reduce the processing time, the layers may be blocked: consecutive layers with similar Vp (and density, Vs if present) values are concatenated together. The ray is propagated in a straight line inside a concatenated layer.
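The two ingredients of the Advanced mode can be illustrated with a minimal sketch (hypothetical function names; the blocking tolerance is an assumed parameter, not an OpendTect setting):

```python
import math

def snell_angles(v_layers, ray_parameter):
    """Snell's law keeps p = sin(theta)/v constant along the ray,
    so the propagation angle in each layer follows from its
    velocity. Returns None past critical incidence."""
    angles = []
    for v in v_layers:
        s = ray_parameter * v
        angles.append(math.degrees(math.asin(s)) if abs(s) <= 1.0 else None)
    return angles

def block_layers(v_layers, tol=50.0):
    """Blocking sketch: merge consecutive layers whose Vp differs
    by less than 'tol' (m/s); one velocity per resulting block."""
    blocks = [v_layers[0]]
    for v in v_layers[1:]:
        if abs(v - blocks[-1]) >= tol:
            blocks.append(v)
    return blocks
```

Within each block the ray is a straight segment, so fewer blocks mean fewer Snell's-law refractions to evaluate.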
6.1.4 Bayesian Classification
Bayesian classifications are used to link several attributes based on one or several
Probability Density Functions.
First, one or several PDF(s) need to be provided. Clicking 'More' allows additional PDFs to be selected.
After clicking on Next >>, the PDF(s) can then optionally be normalized based on a priori weights per PDF. The a priori weights can be provided by attribute volumes, in which case they vary at every sample location.
The processing will be based on (inverted) stored volumes that should correspond
to the variables used for generating the PDF. Please note that OpendTect cannot
make this check.
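The classification itself follows Bayes' rule. A minimal per-sample sketch (hypothetical names; OpendTect's actual implementation works on whole volumes):

```python
def bayes_classify(likelihoods, priors=None):
    """Per-sample sketch: posterior_i is proportional to
    prior_i * PDF_i(sample), normalized to sum to 1. The
    classification is the class with the highest posterior."""
    if priors is None:
        priors = [1.0] * len(likelihoods)
    weighted = [l * p for l, p in zip(likelihoods, priors)]
    total = sum(weighted)
    post = [w / total for w in weighted]
    return post, post.index(max(post))
```

The posterior probabilities correspond to the probability output volume(s), and the winning class index to the classification volume.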
If only one PDF was used as input, the Bayesian classification will generate two volumes:
If more than one PDF was used as input, the Bayesian classification will generate:
6.1.5 Create from Well Logs
The Create seismic output from wells option writes loaded logs as seismic
volumes.
The dialog is accessed from the Processing menu in the main toolbar; choose the Create from Wells... option.
Select the well(s) in the left list and the log(s) in the right list; note that several wells and logs can be selected at once.
The log extraction can be done either between markers or by selecting a depth range. For the latter, select start/stop depths in meters. For extraction between markers, select the wanted markers and add a distance above/below (optional) for including intervals above and/or below the marker depths.
The logs need to be resampled in order to display correctly on seismic sections
and volumes. Choose a resampling method from the dropdown list:
Next, choose how many traces should be duplicated around the well track. Essentially this will determine the dimension and geometry of the output cube. Finally, give a name (suffix) to the CBVS volume (seismic volume); the volume itself will automatically be named according to the logs selected in the list.
6.1.6 Pre-Stack Processing
Prestack processing can be applied using different methods. This is the only place in OpendTect where prestack data can be output based on another prestack datastore.
Open the Pre-Stack Processing window: Processing > Create Seismic Output > Pre-Stack Processing.
The processing can be done in a number of sequential steps. Either select a predefined setup (as above, which may be further edited via 'Edit') or press 'Create'...
Which brings you to the following window...
The following sections describe the different steps available.
6.1.6.1 Mute
Mute functions may be applied to Prestack gathers. This window will allow you to
choose the mute definition, as well as to specify settings such as:
Taper length (in samples)
6.1.6.2 Automatic Gain Control
Automatic Gain Control (AGC) is one of the processing methods available for
Prestack gathers. It will adjust the amplitude level using a sliding window of user-
defined size (window width). Optionally, part of the lowest energy may be dis-
carded from the amplitude level computation (in percent of the amplitude dis-
tribution).
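A sliding-window AGC of this kind can be sketched as follows. This is a simplified illustration with a hypothetical function name; the exact amplitude-level measure used by OpendTect may differ:

```python
def agc(trace, window_samples, low_energy_pct=0.0):
    """AGC sketch: divide each sample by the mean absolute
    amplitude in a centred window of the given width, optionally
    discarding the lowest 'low_energy_pct' percent of amplitudes
    from the level computation."""
    half = window_samples // 2
    out = []
    for i in range(len(trace)):
        win = sorted(abs(v) for v in trace[max(0, i - half):i + half + 1])
        discard = int(len(win) * low_energy_pct / 100.0)
        win = win[discard:] or win  # keep the high-energy part
        level = sum(win) / len(win)
        out.append(trace[i] / level if level > 0 else 0.0)
    return out
```

A wider window preserves more of the relative amplitude behaviour; a narrow window equalizes more aggressively.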
6.1.6.3 Super Gather
A Super Gather may be used to laterally stack the traces in order to increase sig-
nal-to-noise ratio of Prestack gathers. The stack is controlled by an inline/crossline
stepout, and the Shape (Cross or Rectangle). The computation is similar to a (non-
steered) volume statistics attribute with a zero time-gate.
6.1.6.4 Angle Mute
This processing method computes and applies a mute function. See the processing method documentation. The only difference is that this method reads the offset range from the input prestack datastore, such that there is no need to specify it. See here where and when the input velocity source needs to be edited.
The application of the computed mute function is strictly identical to the application
of a stored mute function.
Simple: The ray is going directly from the source to the depth of the target layer,
and up to the receiver in the same way. This does not account for ray bending, or
velocity inversions.
Advanced (not in the GPL version): Will honour the ray bending according to Snell's law and thus velocity inversions as well. To reduce the processing time, the layers may be blocked: consecutive layers with similar Vp (and density, Vs if present) values are concatenated together. The ray is propagated in a straight line inside a concatenated layer.
6.1.7 SEG-Y Scanned Re-Sort
The SEG-Y Scanned re-sort uses a scanned SEG-Y file, outputs it as a new file, and re-writes the file header. This tool is useful in case the information in the header is poor or the file is poorly sorted.
In the Type field select the type of volume, either Pre-Stack or 3D volume.
Next, select the scanned input file and optionally an area sub-selection. Note that a SEG-Y file must be scanned prior to resorting. Choose a name for the output file and (optionally) restrict the number of inlines to be written per file. In case the latter option is used, multiple files will be written to disk, using either sequential numbers or the inline ranges included in the separate files.
6.1.8 Velocity
Under 'velocity' sit two velocity-based conversion options:
6.1.8.1 Time-Depth Conversion
To create a time-depth converted output, follow: Processing > Create Seismic Output > Time-Depth Conversion...
It is also possible to convert from Depth to Time. Instead of an input Time volume, an input Depth model has to be provided.
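The conversion can be illustrated with interval velocities: each two-way-time interval dt contributes v·dt/2 of depth. A minimal sketch (hypothetical function name; the datum is assumed at zero time):

```python
def time_to_depth(twt_s, v_int):
    """Time-to-depth sketch: 'twt_s' are two-way-time samples (s),
    'v_int' the interval velocity (m/s) between consecutive samples
    (one fewer entry than twt_s). Depth at the first sample is 0."""
    depths = [0.0]
    for i, v in enumerate(v_int):
        dt = twt_s[i + 1] - twt_s[i]
        depths.append(depths[-1] + v * dt / 2.0)  # /2: two-way time
    return depths
```

The depth-to-time direction inverts the same relation, which is why a velocity model is the required input in both cases.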
6.1.8.2 Velocity Conversion
This tool is started from Processing > Create Seismic Output > Velocity
conversion. It can be used to convert interval velocity volumes to RMS velocity
volumes and vice versa. The conversion is applied using Dix's formula. Please
note that for this reason it can only be applied in the time domain.
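Dix's formula relates RMS and interval velocities in the time domain. A minimal sketch of both directions (hypothetical function names; assumes the first time sample is at zero time):

```python
import math

def rms_to_interval(t, v_rms):
    """Dix's formula: the interval velocity between t[i-1] and t[i]
    is sqrt((v_i^2*t_i - v_{i-1}^2*t_{i-1}) / (t_i - t_{i-1}))."""
    out = []
    for i in range(1, len(t)):
        num = v_rms[i] ** 2 * t[i] - v_rms[i - 1] ** 2 * t[i - 1]
        out.append(math.sqrt(num / (t[i] - t[i - 1])))
    return out

def interval_to_rms(t, v_int):
    """Inverse direction: accumulate v^2*dt and renormalize by the
    total time (valid because t[0] is taken as zero time)."""
    acc, out = 0.0, []
    for i, v in enumerate(v_int):
        acc += v ** 2 * (t[i + 1] - t[i])
        out.append(math.sqrt(acc / t[i + 1]))
    return out
```

Because both expressions divide by time differences, the formula has no meaning in the depth domain, consistent with the restriction noted in the text.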
6.1.9 Volume Builder Output
The Volume Builder output window can be launched from Processing > Create Seismic Output > Volume Builder. It is used to create the output volume that has been defined in a volume builder setup. Optionally, if the initial volume builder setup is not defined, press the Edit button to define it. Press 'Ok' to launch a batch processing window.
6.2 Create Horizon Output
This menu is used to create 2D/3D-grid based output. The data is stored as hori-
zon data (or attribute) to the selected horizon.
6.2.1 Attribute
Some attributes, e.g. curvature, spectral decomposition, etc., consume significant calculation time, which also depends on the size of the input seismic volume. Creating horizon data on the fly in a scene may therefore take significant time. Via Processing > Create Horizon Output > Attribute (2D or 3D), the attribute calculation is instead processed in the background (batch processing), so other tasks can be done at the same time. By using a Horizon output, horizon attributes can be created on the desired horizons independently.
Select the quantity to output (2D/3D) from the list of stored data or attributes. The
attribute is (by default) saved with its own name, but it can be edited. Select the
horizon on which the selected attribute will be calculated.
The parameter file, which is automatically created in the Store Processing Specification field, can have any name (a default name is provided) so that the calculation process can be easily re-started if needed. The batch processing can be done using a single machine or multiple machines. More information on this window is provided in the single machine batch processing section. After the batch processing is finished, the result will be available as horizon data: right-click on a blank horizon attribute in the tree to select the horizon data (Select Attribute > Horizon Data).
If the option for 'Fill undefined parts' is toggled on, then the 'Settings' button can be
used to enter the interpolation settings:
For the 'Execution Options', please refer to the following topic: Batch Execution
Parameters
6.2.2 Stratal Amplitude
Stratal Amplitude is a processing tool available to compute statistics (min, max,
rms, etc.) from an attribute along a horizon or between two horizons. The window
can be launched from Processing > Create Horizon output > Stratal Amplitude.
The output will be stored as horizon data (grid) saved on the top or base horizon.
This feature operates based on a single-trace calculation only (i.e. no step-out). For multi-trace calculations (using a step-out), you are advised to use the Volume Statistics attribute.
In this window, select the input attribute from which the values will be extracted.
The extraction may be guided by a single horizon (Single Horizon) or between two
horizons (Double Horizon).
l Single Horizon is used to extract amplitude along a horizon within fixed window rel-
ative to the point.
l Double Horizon is an option of amplitude extraction between two horizons. In this
case, the Z-offset parameters (see below) may be defined to increase or decrease the
area defined by the horizons.
l Z-offset is the offset window specification above (negative values) or below (positive
values) a horizon to restrict or extend the calculation interval. For example:
Settings to extract the average amplitude between 16 and 24 ms above ('-' values)
the selected horizon.
Settings to extract the average amplitude between 8 and 32 ms below (for positive
values there is no need to prefix with a '+' sign) the selected horizon.
Settings to extract the average amplitude 8ms equally around the selected hori-
zon.
l Area subselection is used to specify the area within which the attribute is output.
l Amplitude options are the available statistics for amplitude extraction. Five amplitude statistics are available: Min, Max, Average, RMS, and Sum.
l Output fold as an extra attribute optionally outputs the data fold, i.e. the number of points used for the processing, as separate horizon data.
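The single-trace extraction described above can be sketched as follows (hypothetical function name; the top/base values are assumed to already include any Z-offsets):

```python
import math

def stratal_amplitude(samples, z_vals, z_top, z_base, stat="Average"):
    """Single-trace sketch: collect the samples whose z lies between
    top and base, return the chosen statistic and the fold (number
    of samples used)."""
    win = [s for s, z in zip(samples, z_vals) if z_top <= z <= z_base]
    stats = {
        "Min": min, "Max": max,
        "Average": lambda a: sum(a) / len(a),
        "RMS": lambda a: math.sqrt(sum(v * v for v in a) / len(a)),
        "Sum": sum,
    }
    return stats[stat](win), len(win)
```

The fold value is what the 'Output fold as an extra attribute' option stores as separate horizon data.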
6.2.3 Isopach
In OpendTect isopach maps can be quickly calculated. The Create isopach win-
dow is launched either from the Processing > Create Horizon Output > Isopach or
from the right click menu of any horizon loaded in the tree: Workflows > Create
Isopach. In this window, select the two horizons between which the isopach has to be computed. The isopach map will be saved as horizon data of the first horizon selected in the window.
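The computation itself is a node-by-node thickness between the two horizon grids. A minimal sketch (hypothetical function name, using OpendTect's 1e30 undefined value mentioned earlier in this chapter):

```python
UNDEF = 1e30  # OpendTect's standard undefined value

def isopach(z_top, z_base):
    """Isopach sketch: thickness per grid node; a node is undefined
    in the output wherever either input horizon is undefined."""
    return [b - t if t != UNDEF and b != UNDEF else UNDEF
            for t, b in zip(z_top, z_base)]
```

The resulting values are in the survey's Z unit (ms or m), hence strictly speaking a time isopach is an isochron map.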
6.3 (Re-)Start Batch Job
Batch jobs in OpendTect are stored under a job name in a file containing the
inputs, parameters, log file and other relevant information. This information can be
read by clicking on the 'Information' icon:
Jobs may also be removed using the remove icon.
'Execute remote' can be toggled on to send the job to a remote machine. The job priority can be changed (-19 lowest to +19 highest).
If a job is selected that was created in OpendTect prior to the 5.0 upgrade, a warning will pop up, stating "Pre 5.0 Job". These jobs cannot be (re-)processed. Attempting to do so will bring up the error message: "Can not run selected job".
6.4 GMT
GMT (Generic Mapping Tools) is an open source collection of more than 60 tools for manipulating geographic and Cartesian data sets. It can produce Encapsulated PostScript illustrations ranging from simple x-y plots, via contour maps, to artificially illuminated surfaces and 3D perspective views. OpendTect supports an open source plugin that uses GMT tools to create scaled maps.
6.4.1 Initial Setup
To launch the GMT tools, click on the GMT icon in the OpendTect main toolbar. The first time you launch the GMT mapping tools, a warning message will pop up if GMT is not already installed on your computer. GMT can be downloaded from the GMT website.
After successful installation of the package, the GMT user interface will be
launched:
6.4.2 Create Postscript Maps
Several tabs are arranged to specify the respective settings. The latter part of this section shows a typical OpendTect example of a PostScript map.
l Basemap: This tab is used to set the scale of the map and other map settings.
l Contours: It is used to create a horizon contour map.
l Faults: It is used to post the intersection of faults with constant times or the inter-
section with a surface.
l Wells: It is used to post wells in the map.
l Locations: It is used to post pointset data in the map overlay.
l Polyline: It is used to add polygons (e.g. lease boundaries) in the map overlay.
l Random Lines: It is used to post Random Line(s) in the map.
l 2D Lines: It is used to post 2D-Line(s) in the map.
l Coastline: It is generally used to draw coastal lines.
l Clipping: It is used to set up polygonal clip paths.
l Advanced: It is used to use customized GMT commands.
For all the sections it is possible to Reset the parameters and thus go back to the default ones. For all the sections (except Basemap), Add will add the defined object to the map overlays and Replace will update it if the object has been previously defined.
The Map overlays panel lists all the elements that have been defined to be displayed on the final Basemap. You can modify their order, or remove an object, using the corresponding icons. The map will be created only when clicking on Create Map.
Basemap settings
The Basemap tab is filled with default parameters, including the X/Y range from the Survey setup. You can go back at any point to the default X/Y range by clicking on Reset to Survey.
The map can be renamed and the scale modified. Scale, map width and height are linked: any change to the scale, map width or height will affect the other two parameters.
The label interval can also be modified. The grid lines can be shown by toggling on Draw Gridlines. Optionally you can also add Remarks.
Once the different parameters are defined, give an appropriate name to the output file, specify the disk location, and press the Create Map button. View Map will display the map.
Create a Contour Map
In the Contour Map tab, first select the horizon on which you want to create contours. The different parameters are then filled in by default. It is possible to edit the value range and/or the number of contours; this will change the step. If you modify the step, it will automatically change the number of contours.
It is possible to change the display parameters. The contours can appear as simple
contour lines or the space between the contours can be filled using a selected col-
ourbar.
Once the parameters are all defined, press the Add button: the defined contours will appear in the left Map overlays panel.
'Attribute' allows the user to select either Z-values (default option) or any of the Hori-
zon Data saved to this horizon.
Insert faults
3. Press the Add button... the selected Wells will appear in the left Map overlays panel.
Insert locations
Create a Polyline
1. Select Polygon
2. Give a Name to the Polyline
3. Optionally, edit the settings (symbols, size, color etc)
4. Press the Add button... the selected Polygon will appear in the left Map overlays panel.
Insert 2D lines
In 2D Lines tab, specify:
1. Select 2D line(s)
2. Name the line(s) (group).
3. Edit the settings (symbols, size, color etc)
4. Press the Add button... the selected 2D line(s) group name will appear in the left Map overlays panel.
Insert coastline
Clipping
The final map will be restricted to the inside or outside of a given polygon.
Typical output:
6.5 Madagascar
A generic user interface exists for Madagascar, an open source seismic processing package that is very popular in seismic R&D communities. In the builder, seismic pre- and poststack input and output files are either OpendTect or Madagascar formatted. The processing flow is constructed as a sequence of Madagascar programs, using their parameters. These programs are selected from a list of available programs (presently over 300), with a search field included to guide the user.
1. First Madagascar must be installed in order to use this interface between OpendTect
and Madagascar.
2. It is not possible to view Madagascar plots directly from the OpendTect user interface on Windows. Users who want to see the plots have to make their own arrangements, such as starting an X server.
6.5.1 Madagascar Installation
Madagascar is an open-source, standalone software package. To be used with OpendTect, Madagascar must first be installed; otherwise, when starting Madagascar, the next window will display an error message and missing program boxes.
The Madagascar package needs to be installed (see install) and the RSFROOT
variable has to be set to the installation directory. In order to get the full UI, ensure
that the text doc is installed. This can be done with:
$RSFROOT/bin/sfdoc -t $RSFROOT/doc/txt
On Windows, please ensure the following to be able to use the Madagascar link in OpendTect:
1. In Advanced System Settings -> Environment Variables, the variable RSFROOT must
be set to the Madagascar installation folder. Setting this variable only in the Cygwin
environment is not enough.
2. The variable PATH must include the Cygwin bin folder (e.g. C:\cygwin\bin).
6.5.2 Madagascar Processing Window
The Madagascar processing window can be launched from the OpendTect toolbar
by pressing the Madagascar icon.
Select the input cube to be processed, and then choose a program or combination of programs. Programs are organized into groups. Once a program is selected, a description of the program's function is shown in the neighboring frame.
The different steps, as well as a synopsis, of the computation are provided. The
descriptions of each program are available on the Madagascar website.
6.5.3 Toolbar
The toolbar is composed of the file option and three shortcut items.
6.5.4 Processing Input
The first step is to select an input cube.
6.5.5 Madagascar Processing Output
The final step is to choose an output volume type.
6.6 Batch Processing
Though Batch Processing does not actually appear in the pop-out menu under Processing, this is a convenient place in this User Documentation to present the information on these processes.
6.6.1 Single Machine Batch Processing Window
In single mode processing, the data can be processed either on a local machine or
on a remote host. All relevant information on the progress of the calculation will be
stored (see Job information file).
'Execute remote' can be toggled on to send the job to a remote machine. The job priority can be changed (-19 lowest to +19 highest).
6.6.2 Multi-Machine Batch Processing Window
The multi-machine batch processing window controls on which machines a
volume output or SteeringCube batch job will be processed. Jobs are distributed
over the Used hosts on a line-by-line basis (the number of inlines per job can be
specified). Hosts can be Added and/or Stopped at all times. Processed results are
stored in a Temporary storage directory.
At the end of the processing sequence, OpendTect will merge all processed lines
and store the data in the output file that was specified in the Volume output or
Create SteeringCube window, and it will delete the temporary disk files. If for any
reason OpendTect fails to perform this data transfer, this can also be done manu-
ally in the File - Manage module. The temporary data store appears with a name
starting with Proc_. Select this item and copy it to a new cube.
It is possible that at the end of a multi-machine batch job not all data was pro-
cessed successfully. Some jobs may have failed (e.g. because one of the hosts
was temporarily not available). OpendTect will come back with a warning message
stating which jobs (i.e. which inlines) have not been processed yet. It is then
advised to re-submit these jobs until all data are processed. The Auto-fill option
automatically scans and fills gaps in the processed volume.
The Nice level sets the priority the process gets. With the nice level set to 19, for example, the process has very low priority and other processes that run on the same computer get priority. If the nice level is set to 1 the process gets the highest priority.
The Processes menu allows setting the Run, Pause, or Go - Only between options. The Go - Only between option pauses and runs the processes at user-defined times.
OpendTect calls the system utilities of the 'hostent' (sethostent, gethostent, etc.)
type to get a table of hosts that can be selected. How the Operating System builds
the lists is dependent on the particular system setup; most likely /etc/hosts and/or
the NIS tables are consulted. OpendTect supports multi-threading which means
that all processors of multi-processor machines will be used.
Though we support multi-threading, not all calculations can be run this way due to some of the algorithms involved (i.e. recursive calculations). See the following table:
Multi-machine processing on Windows OS
The new system works with a daemon service running in the background on every remote machine to be used for processing. The communication works over TCP/IP and requires some configuration to make things work.
OpendTect installation: You need to have OpendTect installed on all hosts, and all hosts must have access to the same survey at the same time. For example, if machine B is using F3_Demo and wants to process something in F3_Demo, then the other two PCs must also use the same survey folder for as long as the processing is needed.
BatchHosts file: Add the IP addresses of C and D in the BatchHosts file inside the
application data folder.
Start the daemon: If launching the process from B to the other two machines, then B is the local machine and C and D are the remote machines. In this case the daemon service (odremoteservice) has to be launched only on the remote machines, not on the local machine (B). Please note that odremoteservice.exe should not be run directly; instead, use the launching tool od_remote_service_manager found in the bin win32/win64 folder. Launching the daemon will also add a notification icon to the system tray. Once the service starts, the remote machines are ready.
Start processing: Select the PCs B, C and D from the list of machines in the multi-machine launch window and start processing:
Select machines to use for processing from the list
Multi-machine batch processing progress window
For more information, please refer to the OpendTect YouTube Channel for the
webinar on: Multi-Machine Processing Set-Up
6.6.3 Batch Log File
A batch log file is produced for every volume output run. The information is
streamed to a file if the batch job is executed on a remote computer. If the pro-
cessing is done locally the log file is either streamed to a new, dedicated window,
or to the standard output window. Every N traces the program will output a symbol
to reflect the progress. There are five symbols the program can use. Which symbol
it uses depends on the speed in number of traces times N per sec (given towards
the end of a line in brackets after the percentage of traces processed) and the
estimated remaining time until completion. The symbols indicate the following:
The following options are available:
6.6.4 Cluster Processing
Batch jobs can be submitted from OpendTect to cluster management tools. So far dGB has successfully tested SLURM, which is easy to install and even easier to use.
A new window will pop up listing a number of directories used for the storage of temporary files. The jobs will be split using a user-defined number of inlines.
The field named 'Cluster Processing command' represents the name of the binary
from the cluster management tool used to run a process.
You can run the "Main script file" (default: ~/yoursurvey/Proc/clusterprocscript) from a command line; this will run each job one-by-one using the above command, and will also launch the UI to show progress and do the post-processing merging of temporary data.
7 Scenes
The OpendTect main window can have multiple scenes, most of them opened
using this menu. The scenes behave like sub-windows within the main window:
Each scene has its own tree and can be minimized, maximized, reduced or enlarged in size, without ever going out of the main window. The trees of different scenes can be moved on top of each other and sorted as tabs, or completely separated from the main window (they are utility windows).
The Cascade option will restore a default size for each scene and sort them start-
ing on the upper left corner of the main window.
The Tile option is a shortcut to maximize each scene by sharing the space of the
main window equally:
l Auto: The scenes are sorted automatically along the best fitting grid.
l Horizontal: The scenes are arranged along a single line.
l Vertical: The scenes are arranged along a single column.
If all scenes are maximized the active scene will be annotated on the left in the
Scenes menu. Clicking on another scene will make that one active.
7.1 Time- and Depth-Converted Scenes
OpendTect can display time data in the depth domain and depth data in the time
domain.
This is done using a user-selected velocity volume and computing the new Z range (depth or time) based on the original Z range (time or depth respectively). In all transformed scenes, each and every display element is re-positioned on the fly.
Pressing 'Create' will pop up a dialog that allows you to specify the velocity type for
a given volume:
The only exception is the 3D volume, for which the on-the-fly transformation can be slow. Therefore, time volumes can be depth converted (i.e. they become stored volumes) using an additional option in the right-click menu of inlines and crosslines in the transformed scene:
Please note that depth-stored volumes can also be imported via SEG-Y by setting the appropriate tag in the SEG-Y import wizard.
7.2 Flattened Horizon Scenes
This option generates a new scene flattened along the selected horizon. The Z range has the same unit as the original scene, but it is now relative to that horizon and no longer absolute.
7.3 Wheeler Scenes
The Wheeler Scene is a transformation (flattening) of a HorizonCube into relative geological time (RGT). Therefore, before adding a Wheeler Scene, a HorizonCube needs to be selected. You will be prompted for this if one is not already selected.
8 View
8.1 Work Area
The Work area dialog is opened from the 'View' menu. The Work area sets the area of interest within the survey box. Displayed items will be cropped automatically to fit the set inline, crossline and Z ranges.
Set ranges to full survey – to fully maximize the inline, crossline and Z ranges.
Save subselection
8.2 Z-Scale
The Z-scale option allows scaling of the survey box vertically.
There are three options for Z-scaling: the slider bar, setting the scaling according to the current scene ('Fit to scene'), and resetting the scaling ('To Home'). You can also save any position/orientation of the scene as the default Home view.
8.3 Stereo Viewing
The Stereo viewing menu allows switching stereo viewing on/off. Note that in
order to use Red/Cyan stereo, appropriate glasses are needed. The offset between
the red and cyan view can be manipulated with the Stereo Offset menu. The Quad
buffer option has special hardware requirements in order to get passive stereo
view on a screen with dual and polarized projection.
8.4 Toolbars
All elements available in the main OpendTect window can be switched on/off here.
See Toolbars for the various actions of the buttons on the toolbars.
9 Utilities
9.1 Settings
The settings for Fonts, Mouse, Keyboard, etc., can be changed from Utilities > Settings.
9.1.1 Look and feel
This option brings up an interface containing several tabs for defining various set-
tings in OpendTect:
General
The default icon size is 32. For systems with smaller screens (especially laptops) it may be useful to reduce this value to 28 or even 24. In combination with a reduction of the font size, this can prevent windows from being 'oversized' for the screen.
Visualization
In the Visualization tab, you may enable OpenGL shading (also for volume rendering) or switch this option off. You may also set the texture resolution factor to one of these three settings:
Users facing data visualization issues may significantly improve their results by
turning off the shading and setting the resolution to Standard.
Fonts
Clicking on any of the listed buttons brings up a standard font definition window:
Processing
Please see the following sections for full details of Batch Processing and Cluster
Processing.
Horizons
Using this option, one may set both the default resolution and default colortable for
horizons. This is an especially helpful option for orientation in the early stages of a
project when many horizons are loaded.
As with many of the other settings, a restart is required to apply these defaults.
9.1.2 Mouse Controls
The mouse buttons can be set differently. System administrators can implement user-defined mouse button actions. See the Application Management Documentation for more details.
9.1.3 Keyboard Shortcuts
The user can define his/her own keyboard shortcuts to move a slice forward/backward. One key can be used (set the first key to no-button), or a combination of the Control or Shift key plus another key, which can be selected from a long list.
9.1.4 Advanced Settings
The ADVANCED user settings are used to change the default settings of specific
keywords. The user can specify his/her own settings and/or set up or edit the Survey defaults settings.
These settings can be found in the .od/ directory from the user home directory.
Depending on the platform used, this is located in:
To list the type of Personal settings and/or Survey defaults you have, run the following in a Linux console: ls ~/.od/*DTECT_USER*
9.1.4.1 Personal Settings
A user can update the Personal Settings by specifying a keyword (a variable name) and its value. The value column becomes editable when a cell is double-clicked with the left mouse button. A brief description of the available variables (and their values) is given below:
Most of the changes take effect after restarting OpendTect; others are active directly.
Pre-defined Keywords:
Press the 'Select existing' button to pop up the pre-defined keywords. After selecting a variable, its value can be updated. The values are applied by pressing the 'OK' button.
l Company: Write the company name. Keyword - Company, Value: Text (e.g. dGB
Earth Sciences).
l Default DATA directory: Enables a user to set up a default OpendTect data directory. The value is a path pointing to a data directory, e.g. /disk/ODData/.
l Fonts: The fonts for the OpendTect controls and graphics are set by selecting the Font.* or Graphic.* keywords. The wildcard character (*) refers to the text following the Font/Graphic word. All of these keywords refer to a font dialog that may have a specific value. For instance, a value is written as Times New Roman`12`Normal`No [Font, Size, Style, Others]. The following keywords can have similar values:
l Icons.size: The size of the OpendTect icons. Its value is a number, e.g. 20, 24 or 32.
l In-line byte: This will enable the user to set up the default inline byte location
l Prestack Viewer Settings (3D PS Viewer.*): With value=1, the prestack viewer switches to auto-width mode
l Bold font: With the value set to 'Yes', the font is set to bold
l Cross-line byte: This will set up the default crossline byte location
l Display UnitID: This will enable the user to display the help window ID
l MultiMachine.Nr inline per job: A user may overrule the number of jobs (i.e. the number of inlines) for each machine while running multi-machine batch processing. By default the value is 3
l Nr Processors: A number specifying how many processors to use for processing. For instance, '0' means use all processors
l dTect.Measure LineStyle: The standard setting for the distance measurement tool available in the Graphics toolbar
l Seg-Y headers: Default file location to dump SEG-Y headers. The value is the output path together with a file name (e.g. /disk/SGY_dump.txt).
l SEG-Y.Examine.Number of traces: Set a minimum numeric value (e.g. 100) as the default number of traces to examine in a SEG-Y file
l Ui.ToolTips.enable: Enables the tooltip guide. Set Yes/No as the value
l dTect.Ask.*: Controls warning-message prompts (Value: Yes/No) for several windows. This includes prompts when closing OpendTect, closing the Attribute Set window, storing pointsets and storing sessions
l dTect.Auto Attribute set: Sets an attribute set name that will be auto-loaded when OpendTect is launched
l dTect.Average even median:
l dTect.Color table.Name: A default color table for the user. The default is Red-White-
Blue. The value is a color table name
l dTect.ColorBar.show on top: Set a value to Yes/No to set the color table on top of
OpendTect or not
l dTect.ColorBar.show vertical: Set a value to Yes/No to set the default orientation of
the color table (vertical - Yes)
l 2D Viewer Settings (SeisView.*): 2D Viewer Display Properties of Wiggle Variable
Area
l dTect.DTECT_SHOW_HELP: This will allow the user to show ID on the help win-
dow icon
l dTect.Disp.Default clip perc: Set a default clipping value (1-100 %) to clip the seis-
mic data
l dTect.Default symmetry zero: This will set the colorbar symmetry around zero
l dTect.Dont load plugins: This controls whether the plugins are loaded
l dTect.Dump OI Menu: A special setting to export an OpendTect scene. Set the value Yes/No to make this option available
l dTect.Icons.size: Set a default numeric value for OpendTect icon's size
l dTect.ProcessVolumeBuilderOnOK: This setting deals with the Volume Builder setup. By default the value is Yes, i.e. the setup processes the volume based on the available volume builder setup
l dTect.Show inl progress: Set Yes/No to show or hide the inline progress bar while
displaying the data
l dTect.Use VolRen shading: Set Yes/No for volume rendering shading
l dTect.Use shading: Set Yes/No for shading
l dTect.Zoom factor: Set a value for wheel zooming factor. Default value is 1
l dTect.KeyBindings.Default:
l dTect.Select Plugins: This feature allows the user to select which plugins to load
l dTect.VolRen shading: Set Yes/No for shading
l dTect.Use startup session: This option enables the user to directly start a session
when opening a specific survey in OpendTect
l dTect.Maximum Visualization Texture Size: Set the maximum size for the indi-
vidual visualization data 'chunks'. Default is 4096 (though this may be decreased to
2048, 1024, 512...)
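As an aside, the backtick-separated font value format shown above (Times New Roman`12`Normal`No) is easy to split programmatically. A minimal sketch, with the field order taken from the [Font, Size, Style, Others] example (the function name is illustrative, not part of OpendTect):

```python
def parse_font_value(value):
    """Split a backtick-separated font setting into its four fields,
    following the [Font, Size, Style, Others] order from the manual."""
    name, size, style, others = value.split("`")
    return {"font": name, "size": int(size), "style": style, "others": others}
```

For instance, parse_font_value("Times New Roman`12`Normal`No") yields the font name, a numeric size of 12, the 'Normal' style and 'No' for the remaining field.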
9.1.4.2 Survey Defaults
This option enables the user to setup/edit the Survey default settings. The value
can be changed by double clicking in a cell.
Depending on the user, several settings are available. Let us describe a few:
l 3DAttrset.Auto ID: This setting shows the ID value of the default auto-load attribute
set
l Default.Cube: This will show you the default cube ID. If there is no value, this will
mean that no Default Cube was set in that survey. To set a Cube as default, go to
"Manage Seismic Data" in the main OD interface then click on the 'set as default'
icon after a cube has been selected
l Default.2D Cube: This will show the ID value of the default 2D Data Set
l Depth in feet: Change the value to 'No' if the depth is in meters
l SEG-Y Rev.1 policy: When importing SEG-Y, if the value is set to 1, a window will pop up asking the user to confirm Rev. 1. With the value set to -1, no confirmation window pops up
l Show depth in feet: This option shows the depth in meters if the value is set to 'No'. Setting the value to 'Yes' will display the depth in feet
l Z Scale: This shows the value of the survey Z-scale
l Z Stretch: This setting shows the ID value of the stretch
9.2 Tools
9.2.1 Batch Programs
To run the batch program go to: Utilities > Tools > Batch programs
Choose the batch program you need to run; the available programs are: cbvs_browse, cbvs_dump, lmhostid, glxinfo, ivfileviewer. The text box will show comments and details.
If another OpendTect batch program is chosen, fill in the required and (if needed)
the optional parameters (indicated by the square brackets "[ ]").
The batch program will start in a new xterm window. For example, if the batch program is cbvs_browse, as shown below, the cube to browse must be selected before running it.
9.2.2 Position Conversion
Position Conversion is a utility that converts position pairs from inline/crossline to X/Y, and vice versa. It can be launched either directly from Utilities > Position Conversion ... or from the survey selection menu (Survey > Select/Setup).
In the position conversion window, two modes are available for coordinate conversion: Manual and File. In Manual mode, the user specifies an inline/crossline pair (or X/Y pair), then presses the corresponding arrow button to obtain the position in the other domain.
In File mode, the user browses to the input file and creates a new output file. By specifying the conversion type (XY to IC or IC to XY) and pressing the GO button, the desired conversion is written to the output file.
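Under the hood, such a conversion is an affine transform between the inline/crossline grid and X/Y map coordinates. The sketch below illustrates the idea with a hypothetical survey geometry; the origin and bin-step vectors are made-up values, not taken from any real survey:

```python
import numpy as np

# Hypothetical survey geometry (illustrative assumptions only)
ORIGIN_IC = np.array([100.0, 300.0])          # inline/crossline of a reference bin
ORIGIN_XY = np.array([605000.0, 6073000.0])   # X/Y of that reference bin
INL_VEC = np.array([0.0, 25.0])               # XY step per inline increment
CRL_VEC = np.array([25.0, 0.0])               # XY step per crossline increment

def ic_to_xy(inl, crl):
    """Inline/crossline -> X/Y via the affine survey transform."""
    step = np.array([inl, crl]) - ORIGIN_IC
    return ORIGIN_XY + step[0] * INL_VEC + step[1] * CRL_VEC

def xy_to_ic(x, y):
    """X/Y -> inline/crossline (inverse of the transform above)."""
    a = np.column_stack([INL_VEC, CRL_VEC])   # 2x2 bin-step matrix
    step = np.linalg.solve(a, np.array([x, y]) - ORIGIN_XY)
    return ORIGIN_IC + step
```

A round trip through both functions returns the original inline/crossline pair, mirroring what the arrow buttons in the window do in both directions.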
9.2.3 Create Plugin Development Environment
With this option the user can create the OpendTect developer environment to
develop a plugin for OpendTect. The source code and all other relevant files are
copied into a user specified directory (chosen from the Package Manager during
the installation setup). More information can be found in the Programmer manual
from the Help menu.
9.2.4 Command Driver
The Command Driver offers automated control of the current OpendTect applic-
ation from a command script. The command script is a replacement for a series of
keyboard and mouse interactions performed by the user. It can be used to auto-
mate parts of the workflow, and helps to speed up executing repetitive tasks or giv-
ing automated demonstrations in OpendTect.
The Command Driver was created as a tool to make automated testing of the
OpendTect releases possible. That means it is not optimized for use as a scripting tool. It is clear, however, that power users have started to use the Command Driver for this purpose.
The list of available commands, their syntax, and semantics can be found in the
Command Driver Manual.
9.2.5 Presentation Maker Plugin
Introduction
Python-pptx is a Python library for creating and updating PowerPoint (.pptx) files.
The OpendTect 'Presentation Maker' plugin uses this library to create a Power-
Point presentation from scene, window or desktop screenshots.
Python-pptx installation
Windows
Preparation
On Windows, three packages need to be installed: first Python itself, then lxml, and finally python-pptx. The three packages have to be installed in the given order; installation instructions are given below. Before you start, create a new folder to store the Python installation, e.g. C:\apps.
Python
Install Python itself from the download page. Download the 'Windows x86-64 MSI
installer' for 64bit systems or 'Windows x86 MSI installer' for 32bit systems. Click
on the link. During installation, choose C:\apps as the installation folder and select
the option to add Python to the PATH environment variable. If you don't get this
option, or missed it, edit the PATH environment variable and add the following:
C:\apps\Python27\;C:\apps\Python27\Scripts; More information on setting environment variables in Windows can be found on the Computer Hope webpage.
lxml
The package python-pptx requires the lxml library. Install a binary for lxml from the lxml page; for example, for version 3.4.4, click on the version and download the Windows 64-bit Python 2.7 package (at the time of writing: lxml-3.4.4.win-amd64-py2.7.exe). After downloading, double-click on the exe file and follow the on-screen instructions.
Python-pptx
Open a Command Prompt (press Windows+R, type in cmd, and hit enter) and run:
pip install python-pptx.
Linux
Start by checking if python has been installed. Open a command line window and
enter: python. If it's installed, you'll see some information in the terminal, like the
version number. Note that currently python-pptx requires Python 2.6, 2.7, 3.3 or 3.4.
If python is not installed, install using your distribution's package manager, or ask
your system administrator. python-pptx is hosted on PyPI, so installing with pip is simple. First install lxml: pip install lxml. Then install python-pptx: pip install python-pptx, and you should be ready to go.
l Scene: the scene will be captured in an image. If you have multiple scenes, choose the scene you'd like to add. The name of the selected tree item will be the name of the slide.
l Window: image of the selected window will be added. Note that windows on top of
the selected window will also be captured in the image.
l Desktop: an image of the full desktop will be added.
9.3 Installation
9.3.1 Update (Installation Manager)
Some improvements in the installation manager:
The Installation Manager is available for download via the appropriate platform link
on the download page of the OpendTect website.
The installation manager is a wizard to install/upgrade the existing OpendTect (Current / Previous) releases. The release type field is used to select the release that needs to be installed/upgraded. The installer gives you the choices seen below:
The information following in this section deals with online installation or upgrade. For creating offline installation packages, please see Offline Installation. The figure above shows the selection of the OpendTect package type. To read more about OpendTect package types, please refer to our web page on licensing types.
9.3.1.1 Package Manager
The last window of the wizard is the OpendTect Package Manager (see the figure above). Multiple items can be selected from the list by checking their boxes. Optionally, a relevant package combination can also be selected from the top list box.
The installation manager will automatically recognize the previously installed ver-
sion at the selected path and will prompt it in the Installed version field.
To read more about a particular item in the list, select the item by clicking on it and read the description in the panel to the right. For example, Dip-Steering:
If, for any reason, you should choose to abort the installation mid-download, you
will see the following window appear:
This gives you various options, including increasing the time-out from its default
setting, changing the download server or changing the Proxy settings.
9.3.1.1.1 Utilities Menu
On the top left corner of the package selection window there is a Utilities menu,
which offers some useful functions for the installation manager:
9.3.1.1.1.1 Export Download List
This option allows the user to download the list of URLs of the individual packages
from the download site. This list is stored in a text file which can be used later to
download these files directly without the help of the installer program. After downloading, the user can run his/her own unzipping scripts to install the packages manually. This facility was developed primarily for Linux users. Windows users can also use this feature, provided they can prepare their own installation scripts.
9.3.1.1.1.2 Rollback
The Rollback tool allows you to restore your previous version of the installation. If, after updating the software, you feel uncomfortable with some of the new features and want to go back to your previous installation, you can use this tool. As this tool changes your entire installation, use it cautiously.
9.3.1.1.1.3 Show Log File
The installation manager keeps track of all the actions it executes in a log file. This log file can be viewed with this tool, which is useful for debugging purposes. If you face any trouble during the installation process, you can send this file to OpendTect support if needed.
9.3.1.1.2 Offline Installation
You may also choose to create packages for offline installation. These packages
are created in such a way as to function cross-platform. For example, you may
download the Linux 64bit package onto a Windows machine and then transfer and
install it onto the Linux system or vice-versa.
You will need to select the OpendTect version and toggle to 'Prepare offline install-
ation packages'. You may either choose your download directory or leave the
default.
You will first be prompted to select the package type of OpendTect. To read more about OpendTect package types, please refer to our web page describing licensing types:
On completion of the download, you will be reminded of the location in a pop-up
window and informed of how to launch the installation package:
Linux offline pop-up info
The 'Platform' option refers to the intended installation platform, and not the plat-
form of the machine currently being used to download the packages (if different).
For information on how to verify packages installed offline, please see the link below:
9.3.1.1.2.1 Package Verification
We generate signature files for all packages. Normally, a package is a zip file
downloadable from our website:
For each zip-file, there is a zip.sig file containing a digital certificate that can be
used to verify that the package has not been tampered with during transit.
To verify a package, download the corresponding zip.sig file and place it in the
same directory as the package file. You need gpg (or pgp), an encryption program. These programs are normally installed on most Linux systems, and can be found at the GNU license page.
dGB's public key has to be downloaded to your keyring. This is only necessary for
the initial verification. To obtain the key, use gpg itself:
You may choose any keyserver you want, as they all share data. Once you have located our key, import it into your keyring.
Secondly, to avoid warning messages, edit the key and tell gpg that you trust it:
and then type "trust" as command. Once you have our key installed, you are ready
to verify the packages. This is done by gpg:
This will check that demosurvey.zip has not changed since the file demos-
urvey.zip.sig was generated in our office. A positive output may look like this:
l gpg: Signature made Thu Oct 4 08:46:01 2012 CEST using DSA key ID A02F407E
l gpg: checking the trustdb
l gpg: 3 marginal(s) needed, 1 complete(s) needed, PGP trust model
l gpg: depth: 0 valid: 1 signed: 0 trust: 0-, 0q, 0n, 0m, 0f, 1u
l gpg: next trustdb check due at 2022-03-13
l gpg: Good signature from "dGB Earth Sciences B. V. (Software package signing key)
"
9.3.2 Auto-Update Policy
The auto-update policy can be defined and changed by the user. By default the option is set to [Inform] when updates are available. On Windows, this can be changed to [None] (never check for updates) should you prefer.
9.3.3 Connection Settings
To use a proxy, the correct proxy server information must be added in the Connection Settings before running the installation. This is done in the following dialog, which is also available directly from the Installation Manager by clicking the Proxy Settings button.
9.3.4 Plugins
The plugins window lists the plugins that are currently loaded (or not) into
OpendTect, and provides relevant license information.
Developers might want to use the option "Load a plugin" to manually load their plu-
gin. The developers documentation describes how to add a plugin to the automatic
loading procedure.
In OpendTect, there are several commercial plugins available. Each plugin adds
extra functionality to OpendTect. To load a new plugin, browse to the appropriate
file. More information on plugin design is available in the Programmer manual.
In general most plugins are loaded automatically at startup, based on the chosen
options:
If you toggle off the option "Show this dialog at startup", all plugins will be loaded at startup. It is recommended to install only the plugins for which you have a license and to load them all automatically at startup.
9.3.5 Setup Batch Processing
In order to utilize OpendTect's capability for Multi-Machine Processing (MMP), a
BatchHosts file must be created and used. This file contains the list of remote
machines (host machines or nodes), some relevant details about these machines, and the path to the Survey Data Root. OpendTect will use this file to communicate with the remote hosts and launch processes remotely on them. Follow the example format (shown below) to add the list of remote machines and their details in the respective fields.
BatchHosts file: This field is not editable in the User Interface. It is set as a user
environment variable:
IP address: IP address of the node machine(s)
Display name: Free-text field. Text entered here appears in the Multi-Machine Pro-
cessing window.
Survey data root: Location of the survey (the path to the survey data root folder
from the host machine)
Advanced Settings: Here you may change the first port value (in case it is blocked or in use). Linux users may decide to change the shell command from the default ssh to rsh. The Nice level sets the priority on the host machines (19 being nicest and 1 being least nice). Finally, the Default Data Root can be set per platform:
Add new host.
Test hosts. Performs tests to ensure that the server and nodes can communicate to the extent necessary to perform MMP (i.e. can the nodes find the data root folder and read/write into it).
For more information on this topic, please refer to OpendTect's Youtube Channel
where you may find the webinar: Multi-Machine Processing Setup.
9.3.6 Licenses
Under Utilities--> Installation--> Licenses you will see two sets of options, differing
per platform:
For information about floating or server-based licenses, please refer to the flexnet
installation guide page
For more general information about OpendTect licensing options, please see the
support licenses page
A more complete explanation of OpendTect license Installation can be found in the
License Installation Webinar, available on OpendTect's Youtube Channel or via
the webinar page
9.3.6.1 Install demo/node-locked license
Plugins to OpendTect can be run either by using a license server or by using demo
(evaluation) licenses. The second case is called "node-locked license installation".
Use the following window to specify the path to the node-locked (demo/evaluation)
license files that were given to you:
Here you can install each of the licenses by simply clicking ‘Select’, choosing the
appropriate license file and clicking ‘Ok’ in the file selection window. Once you
have selected all the licenses you are evaluating, click ‘Ok'. Your installation will
be confirmed and you will be prompted to re-start OpendTect:
In addition to the core functionality, you may click 'Request Evaluation License' to
bring you to the following web-form on the dGB website: request a demo page
And, on Windows only, use the 'License Manager Tools' button to pop up the Flexera LM Tools window for more direct access to its features:
9.3.6.2 Clear License Installation
This option (Windows only) will clear:
l Demo or node-locked licenses installed via any route, including the 'Install demo
license' option.
l Floating (or 'server') licenses that may have been installed (without stopping the
license server).
Users of Linux systems wishing to clear their license installation will need to do the
following:
This method applies to both demo/node-locked and floating licenses, and will not stop the server.
9.3.6.3 Show HostID
Clicking this option will pop up a simple dialog showing the HostID of the machine:
Additionally, on Windows, accessing the HostID of the machine can be done via
the LM Tools (available via the Start Menu or directly from
..\OpendTect\5.0.0\bin\win64\lm.dgb\lmtools.exe):
The option 'Save HOSTID Info to a file' will simply save the information displayed
above into a .txt file for reference.
9.4 Show Log File
The user can check the log file from Utilities > Show log file. This shows a log of low-traffic messages, e.g. warnings if a plugin (or license) is not properly loaded.
10 Help
From this drop-down menu, some of the various help aids can be accessed. This
includes the OpendTect User Documentation (F1) and the dGB Plugins User Docu-
mentation (F2).
11 Appendix A - Attributes and
Filters
OpendTect attributes and filters are divided into 'dip-steering attributes' and 'no-
steering attributes'.
A comprehensive list of all the attributes and filters available in the free, open
source OpendTect core software is shown below (the so-called 'no-steering attrib-
utes'):
Note that other attributes and filters are only available if the corresponding plugin has been installed. For information on commercial dGB plugins, please read the dGB Plugins Documentation.
Attributes related to Earthworks and ARK CLS plugins are documented in their corresponding UserDocs. These can be found by using the following link:
l Prestack -- The prestack attribute can be used either to extract statistics on the gath-
ers and their amplitudes, or to extract AVO attributes
l Reference -- Attribute that returns the definitions of the extraction position
l Reference shift -- Attribute that moves the extraction position in 3D space
l Sample Value -- Attribute that returns the input value at the sample location
l Scaling -- Attribute used for scaling of amplitude
l Similarity -- Multi-trace attribute that returns trace-to-trace similarity properties
l Spectral decomposition -- Frequency attribute that returns the amplitude spectrum (FFT) or wavelet coefficients (CWT)
l Texture -- Group of attributes that return statistical properties of a Grey-Level Co-occur-
rence Matrix (GLCM)
l Texture - Directional -- a multi-trace attribute that returns textural information based on
a statistical texture classification.
l Velocity Fan Filter -- Attribute that returns energy with apparent velocities/dips inside
a specified Min/Max range
l Volume statistics -- Attribute that returns statistical properties
11.1 CEEMD - Spectral Decomposition
Name
Description
1. The number of extrema and the number of zero-crossings must either be equal or dif-
fer at most by one
2. At any point the mean value of the envelope defined by the local maxima and the
envelope defined by the local minima is zero.
These conditions ensure that IMF’s contain only one mode of oscillation per cycle
whereby a cycle is defined by the zero crossings. No riding waves are allowed.
Riding waves can result in negative instantaneous frequencies, a major problem in
any application that relies on instantaneous frequencies. An important char-
acteristic of IMF’s that is utilized in EMD is that instantaneous frequency can be
defined everywhere.
To decompose the signal into the IMF components the algorithm performs a pro-
cess called sifting. In sifting the local mean of the signal is subtracted from the sig-
nal. The local mean is computed from the envelopes (below). If the difference
signal fulfils the IMF conditions defined above the first component is found. This
will be the component with locally the highest frequencies. This component is sub-
sequently subtracted from the original signal and the process is repeated until all
components have been found. The last component contains the lowest fre-
quencies, or represents the trend.
The sifting process. Envelopes of the signal are constructed by fitting a polynomial
function through picked minima and maxima. From the envelopes the local mean
of the signal is computed, which is subtracted from the input signal. It is then
checked whether the difference signal (bottom) meets the IMF conditions. If not, the
sifting process is repeated until the IMF conditions, or the sifting stopping criteria,
are met.
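The sifting loop described above can be sketched in a few lines. This is a minimal illustration, not OpendTect's implementation: it uses linear interpolation for the envelopes instead of the polynomial fit mentioned above, and the function names and exact stopping ratio are illustrative assumptions.

```python
import numpy as np

def local_extrema(x):
    """Indices of local maxima and minima of a 1-D signal."""
    d = np.diff(x)
    before = np.hstack([0.0, d])   # slope arriving at each sample
    after = np.hstack([d, 0.0])    # slope leaving each sample
    maxima = np.where((before > 0) & (after < 0))[0]
    minima = np.where((before < 0) & (after > 0))[0]
    return maxima, minima

def sift_once(x):
    """One sifting step: subtract the local mean of the two envelopes."""
    maxima, minima = local_extrema(x)
    if len(maxima) < 2 or len(minima) < 2:
        return x, True  # too few extrema: residual trend, stop sifting
    t = np.arange(len(x))
    upper = np.interp(t, maxima, x[maxima])  # envelope through the maxima
    lower = np.interp(t, minima, x[minima])  # envelope through the minima
    return x - 0.5 * (upper + lower), False

def extract_first_imf(x, max_sifts=10, sift_threshold=0.25):
    """Repeat sifting until the change per step is small (cf. the Sift
    threshold parameter; the manual's exact ratio definition may differ)."""
    h = np.asarray(x, dtype=float).copy()
    for _ in range(max_sifts):
        h_new, done = sift_once(h)
        if done:
            break
        ratio = np.std(h - h_new) / (np.std(h) + 1e-12)
        h = h_new
        if ratio < sift_threshold:
            break
    return h
```

On a two-tone signal, the first extracted IMF is dominated by the highest-frequency oscillation; subtracting it from the input and repeating the procedure yields the remaining, lower-frequency components.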
EMD is a relatively slow decomposition method and it has a problem called mode
mixing. This is defined as either a single IMF consisting of widely disparate scales,
or a signal of similar scale captured in different IMF’s.
To overcome mode mixing two noise assisted methods have emerged, both of
which are supported in the OpendTect attribute.
CEEMD solves the mode mixing problem and provides an exact reconstruction of the input signal. CEEMD is, however, a CPU-intensive process and in the current implementation rather slow.
Input Parameters
To evaluate the results along one horizon, e.g. with RGB blending, it is
faster to run a batch process on a time slice that encompasses the horizon-
slice than to run the job interactively using the evaluate attribute option.
- EMD – slow and possibly suffers from mode-mixing (see Description above)
- EEMD – slower but partly solves mode-mixing; however, the signal cannot be reconstructed exactly
- CEEMD – slowest but solves the mode-mixing problem, and the signal can be reconstructed exactly from the components
Maximum No. of IMFs is the maximum number of components into which a signal
can be decomposed.
IMF threshold is a value below which the decomposition process is stopped. The
value is computed as the standard deviation of the component divided by the stand-
ard deviation of the input signal.
Maximum no. of sifts and Sift threshold are stopping criteria for the sifting process. The Sift threshold is defined as the standard deviation of the signal after the sifting step divided by that of the signal before the sifting step. The typical range is between 0.2 and 0.3.
Display Time Frequency Panel decomposes the selected trace and displays the result in a time-frequency plot with frequencies ranging from 0 Hz to Nyquist.
Time-frequency plot of the synthetic test trace below (zoomed in to show only frequencies from 0 to 150 Hz).
Output
Peak Frequency is the frequency with the largest amplitude in all IMF com-
ponents. It captures information from the spectral decomposition into a single attrib-
ute that is related to tuning effects at varying thicknesses.
IMF Component outputs the IMF components (below). This corresponds to the real parts of the decomposed signal. The first component corresponds to the highest frequency oscillations in the signal. A decomposition may result in an unknown number of components. When running a job in batch mode the output is stored in a multi-attribute file (cbvs format) with N+1 attributes, where N is the maximum number of IMFs. If a trace is decomposed into fewer than N components, the remaining attributes are filled with zeroes. The (N+1)th attribute contains the average of the input trace (the DC component) that was removed at the start of the decomposition.
Synthetic trace (top) decomposed with CEEMD into its IMF components (bottom).
Examples
Line 425 of the F3 Demo is the input for the decomposition with CEEMD. The IMF components, Peak Frequency, and Peak Amplitude are shown below.
IMF components (CEEMD) of Line 425.
Peak Amplitude (CEEMD) of Line 425.
References
Han J. and Van der Baan M., 2013. Empirical mode decomposition for seismic
time-frequency analysis. Geophysics, 78 (2), O9-O19.
Huang, N.E., Shen, Z., Long, S.R., Wu, M.C., Shih, H.H., Zheng, Q., Yen, N.C.,
Tung, C.C. and Liu, H.H., 1998. The empirical mode decomposition and the Hilbert
spectrum for nonlinear and non-stationary time series analysis: Proceedings of the
Royal Society A: Mathematical, Physical and Engineering Sciences, 454, no.
1971, 903-995.
11.2 Convolve (2D & 3D)
Name
Description
The input data is convolved with a three-dimensional kernel specified by the Filter type and associated parameters. Lowpass, Laplacian, and Prewitt are well-known filters in image processing.
Input Parameters
Lowpass
Laplacian
If all sample values are equal and non-zero (either positive or negative), the output of this operation is zero.
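This zero response to a constant input follows because the kernel coefficients sum to zero. A minimal sketch, assuming the common 6-connected form of the 3D Laplacian (the kernel actually used may differ):

```python
import numpy as np

def laplacian3d_at(v, i, j, k):
    """3D Laplacian at (i, j, k) using the 6-connected, sum-to-zero stencil:
    6 * center minus the six face neighbours."""
    return 6.0 * v[i, j, k] - (v[i - 1, j, k] + v[i + 1, j, k]
                               + v[i, j - 1, k] + v[i, j + 1, k]
                               + v[i, j, k - 1] + v[i, j, k + 1])
```

A constant cube yields 0 everywhere; an isolated spike yields a strong response.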
Prewitt
Prewitt is a contrast enhancement filter. This filter computes the gradient in different directions from a 3x3x3 input cube. The output is returned in the specified direction (inline, crossline, or Z-plane). A 3x3x3 Prewitt kernel to calculate a horizontal gradient is given by:
Note that the inline gradient returns the difference in amplitude in the inline dir-
ection. This is best visualized on a crossline. Similarly, a crossline gradient is visu-
alized best on an inline.
A 3x3x3 Prewitt kernel that returns the vertical gradient is given by:
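A commonly used form of such a kernel (an assumption here; the manual shows the exact kernels as figures) consists of three constant planes of -1, 0 and +1 along the gradient direction. A sketch:

```python
import numpy as np

def prewitt3d_kernel(axis):
    """3x3x3 Prewitt-style kernel: planes of -1, 0, +1 along the given axis
    (0 = inline, 1 = crossline, 2 = vertical). Assumed form."""
    k = np.zeros((3, 3, 3))
    sl = [slice(None)] * 3
    for plane, w in zip(range(3), (-1.0, 0.0, 1.0)):
        sl[axis] = plane
        k[tuple(sl)] = w
    return k

def prewitt3d_at(v, i, j, k, axis=0):
    """Correlate the kernel with the 3x3x3 neighbourhood centred at (i, j, k)."""
    return float(np.sum(prewitt3d_kernel(axis)
                        * v[i - 1:i + 2, j - 1:j + 2, k - 1:k + 2]))
```

On a volume whose amplitude increases linearly along the inline direction, the inline-gradient kernel returns a constant positive value, and the crossline-gradient kernel returns zero.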
Wavelet
This option enables the user to convolve the data with a wavelet. In this context, a wavelet is the time series response of a filter. The wavelet should first be imported into OpendTect, or it can be created in OpendTect (from Wavelet management or the ARKCLS Spectral blueing attribute).
11.3 Delta Resample
Name
Description
This attribute applies vertical shifts inside seismic volumes. By defining an input cube and a delta cube, which contains the shifts that should be applied to the input data, a correctly aligned output volume can be generated. The delta cube can be generated using the attribute "Match Delta". Note that you must use the Z-unit of the input cube. Also note that you must apply a negative shift to move the cube down and a positive shift to move the cube up. The "Delta Resample" attribute is very useful when working on, for example, multi-azimuth volumes, which frequently show some degree of misalignment. This technique can also be very useful for time-lapse seismic data and NMO-corrected data. Only after alignment can a correct comparison be made between two different volumes.
Input Parameters
The delta cube can be generated using the Match Delta attribute. An advanced option in the attribute engine is that, when the input is periodic and contains phases, it is possible to define a (maximum) period. The box "Input is periodic" should then be checked and the period defined in the box. However, such a situation is very exceptional and, in 99 out of 100 cases, the shift can be applied without using this option.
One of the side-effects of this residual alignment is that existing horizons are not
consistent with the data any more. The horizons should be re-snapped to the
aligned seismic data by using the option "Algorithms" when right-clicking a horizon
in the tree. The third option "Snap to event" enables the user to make the horizon
consistent again. Only after snapping the horizons can the user, for example, calculate horizon-steered attributes or perform waveform segmentation.
Another useful application for this attribute is to flatten or unflatten a cube. The
example given below outlines how you would unflatten a cube that you have
flattened (for example, using the Flattening option under the Horizon Workflows
fold-out):
You would create a cube that would have, as a function of the position only, the Z
difference between flattened and unflattened volume. For example, you may have
set the TWT of the flattened cube to 1000 ms. Using the Horizon attribute you
would read the TWT values of the horizon, let's say 1300ms for a particular trace.
The shift is thus defined with a mathematics attribute as:
Applying this shift attribute as 'Delta Cube' for your flattened volume (your 'Input
Cube') will undo the flattening (in this case, with a negative value, thus shifting the
cube down).
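Per trace, the resampling described above can be sketched as follows. This is a simplified illustration following the sign convention stated earlier (positive delta moves events up, negative delta moves them down); deltas are given in samples here:

```python
import numpy as np

def delta_resample(trace, delta_samples):
    """Shift a trace vertically by reading each output sample from the input
    at (sample + delta): positive delta moves events up (shallower),
    negative delta moves them down."""
    n = len(trace)
    src = np.arange(n, dtype=float) + delta_samples  # positions to read from
    return np.interp(src, np.arange(n), trace, left=0.0, right=0.0)
```

For the unflattening example above, the delta would be 1000 ms minus the horizon time (-300 ms for a trace where the horizon is at 1300 ms), converted to samples.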
11.4 Energy
Name
Description
This attribute calculates the squared sum of the sample values in the specified time-gate divided by the number of samples in the gate. The Energy is a measure of reflectivity in the specified time-gate. The higher the Energy, the higher the Amplitude. This attribute enhances, among others, lateral variations within seismic events and is therefore useful for seismic object detection (e.g. chimney detection). The response energy also characterizes acoustic rock properties and bed thickness.
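The computation per sample can be sketched as follows (a minimal illustration; gate limits in samples relative to the evaluation sample):

```python
import numpy as np

def energy(trace, i, gate):
    """Energy at sample i: sum of squared amplitudes in the gate
    [i + gate[0], i + gate[1]] divided by the number of samples."""
    lo = max(0, i + gate[0])
    hi = min(len(trace), i + gate[1] + 1)
    g = np.asarray(trace[lo:hi], dtype=float)
    return float(np.sum(g ** 2) / len(g))
```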
Input Parameters
Output
Examples
11.5 Event
Name
Description
The event attribute is a useful tool for determining the quality of horizons in seismic data; it can also be applied to in-lines, cross-lines, or z-slices. A sketch of the various event attribute applications is shown below:
Input Parameters
In the single event mode, the algorithm searches for the extremum and quantifies
the shape around the event in terms of either Peakedness, Steepness or Asym-
metry.
- Peakedness: the ratio between the extremum value and the distance between the next and previous zero crossings (ZC)
- Steepness: the slope of the tangent to the seismic trace at a zero crossing
- Asymmetry: the asymmetry of the event. Mathematically it can be presented as (L-R)/(L+R), where L is the distance between the previous ZC and the extremum and R is the distance between the next ZC and the extremum
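For instance, the asymmetry measure can be sketched as (positions in samples or ms; a minimal illustration):

```python
def event_asymmetry(prev_zc, extremum, next_zc):
    """(L - R) / (L + R), where L is the distance from the previous zero
    crossing to the extremum and R from the extremum to the next one."""
    left = extremum - prev_zc
    right = next_zc - extremum
    return (left - right) / (left + right)
```

A symmetric event returns 0; an event whose extremum sits closer to the next zero crossing returns a positive value.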
In the multiple event mode, the event type needs to be specified from the Event
type drop-down list:
- Extremum
- Maximum
- Minimum
- Zero Crossing
- Negative to Positive Zero Crossing
- Positive to Negative Zero Crossing
- Maximum within the gate
- Minimum within the gate
Output
The output is either the distance between the chosen event and the next/previous
similar event or the amplitude of the event. The output is determined by the check-
box below the event type drop-down list.
For the event types Maximum within the gate and Minimum within the gate, a time-
gate (in milliseconds) needs to be specified. The algorithm computes the distance,
within the specified time-gate from the current point, e.g. on a horizon, to the
nearest maximum or minimum.
The Event attribute is, for example, useful for quality-checking horizon grids. The attribute can also aid in finding the distance between two events, which can serve as an estimate of relative thickness changes between them. Combining Volume Statistics with the Event attribute can further highlight such relative changes and help derive geologically meaningful information.
11.6 Fingerprint
Name
Description
This attribute computes the similarity between a user-defined vector of attribute val-
ues and the equivalent vector taken at each sample position inside the cube. The
reference vector can be constructed from one or more positions. A statistical prop-
erty (average, median, variance, minimum or maximum) is calculated after the con-
struction of the vector. Also, it is possible to construct the vector manually by
editing the attribute values of the fingerprint vector. The similarity between the fin-
gerprint vector and the equivalent vector at the evaluation point is computed as the
normalized Euclidean distance between the two vectors and ranges between 0
(vectors are not identical at all) and 1 (vectors are 100% identical).
If you want to insert/remove reference attributes, right-click in the empty area of the Reference attribute window.
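The manual does not spell out the normalization, but a similarity of this kind can be sketched as follows (assumed form: per-attribute scaling by the attribute ranges, then 1 minus the scaled Euclidean distance; the exact formula in the software may differ):

```python
import numpy as np

def fingerprint_similarity(reference, candidate, ranges):
    """Similarity in [0, 1]: 1 means identical vectors. Each attribute is
    scaled by its expected (min, max) range before the distance is taken."""
    ref = np.asarray(reference, float)
    cand = np.asarray(candidate, float)
    lo, hi = np.asarray(ranges, float).T
    d = (ref - cand) / (hi - lo)          # per-attribute normalized difference
    return 1.0 - float(np.linalg.norm(d)) / np.sqrt(len(ref))
```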
Advanced options
In order to compare vectors, you can use the "Advanced" option to obtain the ranges for the input values. The ranges are automatically calculated from random points when you press the "Calculate parameters" button. You can also use a pointset to find the ranges, or enter them manually. In the "Advanced options" window, a weight can be assigned to each individual attribute. The default weight is 1.
Example result of applying the fingerprint attribute
11.7 Frequency Filter
Name
Frequency Filter -- Attribute that returns filtered data using FFT or Butterworth filter
types
Description
The specified Input Data is bandpass filtered with the commonly used Fast Four-
ier Transform or Butterworth filter.
Input Parameters
The difference between using the FFT or Butterworth filtering method is that, for the
FFT, one considers the complete trace while for the Butterworth filter, only a
"small" segment, depending on the selected number of poles, is taken into
account. The user should keep in mind that using the Butterworth Filter results in a
small shift in the seismic data.
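A comparable bandpass can be sketched with SciPy. This is an illustration, not the software's internal implementation; the causal filter exhibits the small shift mentioned above, which zero-phase filtering (e.g. scipy.signal.filtfilt) would avoid:

```python
import numpy as np
from scipy.signal import butter, lfilter

def butterworth_bandpass(trace, dt_s, f_lo, f_hi, npoles=4):
    """Bandpass a trace between f_lo and f_hi (Hz); dt_s is the sample
    interval in seconds. npoles controls the steepness of the roll-off."""
    nyquist = 0.5 / dt_s
    b, a = butter(npoles, [f_lo / nyquist, f_hi / nyquist], btype="band")
    return lfilter(b, a, trace)  # causal filter: introduces a small shift
```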
The curves of the Butterworth filter for various numbers of poles
Frequency Taper:
Frequency taper settings in the attribute definition (for filter type: 'BandPass').
Once the Frequency Taper is displayed (red line in a pop-up window), a spectrum of any inline/crossline can be viewed by pressing the Preview Spectrum button. It will prompt you to select an inline/crossline ("Select line from Data"). In that dialog, select either the inline or crossline radio button and press Next. Subselect the part of the inline/crossline and proceed. A blue-coloured amplitude spectrum will be displayed (as shown below). Now adjust the parameters (Slope or Start/Stop Frequency) and finalize the frequency taper settings.
Interactive display of the frequency taper parameters (for filter type: 'BandPass').
11.8 Frequency
Name
Description
Input Parameters
The specified time-gate is transformed to the Fourier domain and the requested out-
put is calculated. The time-gate is tapered with the specified Window/Taper prior to
Fourier Transform. The shape of the various tapers is shown in the figure below.
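A sketch of how such outputs can be derived from a tapered gate (illustrative only; the software's exact definitions and taper may differ):

```python
import numpy as np

def gate_frequencies(gate, dt_s):
    """Dominant and median frequency (Hz) of a tapered time-gate."""
    spec = np.abs(np.fft.rfft(gate * np.hanning(len(gate))))
    freqs = np.fft.rfftfreq(len(gate), d=dt_s)
    dominant = freqs[int(np.argmax(spec))]      # largest spectral amplitude
    cum = np.cumsum(spec)                       # cumulative spectral "area"
    median = freqs[int(np.searchsorted(cum, 0.5 * cum[-1]))]
    return dominant, median
```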
Output
Dominant frequency
Returns the dominant frequency from the frequency spectrum, i.e. the frequency with the highest amplitude.
Average frequency
Median frequency
Returns the weighted median value of the frequency spectrum, which is the fre-
quency at half the spectral area on each side. The median frequency might be
somewhat more robust than average frequency, at the cost of lower precision.
Maximum spectral amplitude
Returns the maximum amplitude of the frequency spectrum, i.e. the amplitude of the dominant frequency.
Spectral area beyond dominant frequency
Returns the spectral area beyond the dominant frequency (see figure below).
11.9 GapDecon
Name
Description
The type of multiple removal algorithm chosen for this application is the well-known inverse filtering method, also known as gap deconvolution. This filter aims to attenuate a user-defined part of an auto-correlation function. The underlying idea is that multiples in the data are secondary reflections, i.e. repetitions of the primary reflections, which show up in the auto-correlation function at a time that corresponds to the extra travel time. The filter can be applied on-the-fly or in batch mode to produce a filtered output cube.
- The energy attribute is first calculated in the user-specified window (by default [0,0])
- A Hilbert transform is applied to this energy amplitude
- The output of the above step is multiplied by -1 to obtain a phase rotation of +90 degrees
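The lag and gap parameters are chosen from the data's auto-correlation; computing a normalized auto-correlation can be sketched as:

```python
import numpy as np

def autocorrelation(trace, max_lag):
    """Normalized auto-correlation for lags 0..max_lag; multiples show up
    as secondary peaks at the lag of their extra travel time."""
    x = np.asarray(trace, float) - np.mean(trace)
    full = np.correlate(x, x, mode="full")
    ac = full[len(x) - 1: len(x) + max_lag]
    return ac / ac[0]
```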
The user defines the GapDecon attribute from the list and specifies a number of
input parameters.
- Output is: the user may set the output to zero phase, in which case the inverse 90-degree rotation is applied after the filtering.
- Check parameters: check whether the (multiple) energy has indeed been removed. As a QC of the parameters, the GapDecon filter with the selected parameters, lag 44 and gap 100 (see example below), is applied on a user-defined line and the auto-correlation of the filtered data is displayed in a 2D viewer to check whether the parameters are correct. If they are not, the parameters can still be changed.
Spiking Deconvolution
The Gap Decon attribute can also be used for Spiking Deconvolution or Whitening.
The goal of Spiking Deconvolution is to flatten the output spectrum. This is
achieved by shortening the embedded wavelet and attempting to make it as close
as possible to a spike (zero-lag spike). One should keep in mind that the frequency bandwidth of the data might limit the extent to which this whitening is possible. At higher frequencies, Spiking Deconvolution might cause an increase in noise.
Example
In the attribute defined above, we define an auto-correlation window between 300 ms and 1200 ms. After pressing 'Examine' we can see that the following parameters could give us the desired result: a lag of 44 ms and a gap of 100 ms.
Pressing 'Check Parameters' shows the effect this would have if we were to output
this attribute with the current parameter settings:
11.10 Grubbs Filter
Name
Grubbs Filter -- Attribute that removes outliers from normally distributed data.
Description
Grubbs' test (also known as the maximum normed residual test) is a statistical test used to detect outliers in a univariate data set assumed to come from a normally distributed population. It is based on the assumption of normality: one should first verify that the data can be reasonably approximated by a normal distribution before applying the test. The test detects one outlier at a time. This outlier is expunged from the dataset and the test is iterated until no outliers are detected. Please note: multiple iterations change the probabilities of detection, and the test should not be used for sample sizes of six or less, since it frequently tags most of the points as outliers.
For a full definition, including formulas, please see the Wikipedia entry.
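A single iteration of the two-sided test can be sketched as (illustrative; see the referenced definition for the full formula):

```python
import numpy as np
from scipy import stats

def grubbs_outlier(data, alpha=0.05):
    """Return the index of the detected outlier, or None. One iteration of
    the two-sided Grubbs test at significance level alpha."""
    x = np.asarray(data, float)
    n = len(x)
    g = np.abs(x - x.mean()) / x.std(ddof=1)      # normed residuals
    i = int(np.argmax(g))
    t = stats.t.ppf(1.0 - alpha / (2.0 * n), n - 2)
    g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t * t / (n - 2 + t * t))
    return i if g[i] > g_crit else None
```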
Input Parameters
11.11 Horizon
Name
Description
This attribute is designed to extend the use of horizon data and can be used for several purposes. When applied to a horizon only, it gives similar results to calculating attributes on the horizon.
However, when using the output of this attribute as input to a mathematical expression, the output can vary from a simple thickness calculation to a highly advanced combination of several attributes. You can think of this attribute as different from the other attributes in OpendTect in the sense that it takes horizon data and converts it into seismic data, which can be applied as input in volumes.
Input Parameters
As shown above, there are two possible outputs, namely Z (depth) and horizon
data. For each position on the horizon, the value (Z or horizon data) is used as out-
put along the complete trace.
11.12 Instantaneous
Name
Description
The imaginary part of the complex trace is computed via the Hilbert transform.
Possible outputs are:
Instantaneous Amplitude (Trace Envelope)
Outputs the instantaneous amplitude (or envelope) of the selected data volume at
the sample location.
Instantaneous Phase
This attribute is of central importance since it describes the location of events in the
seismic trace and leads to the computation of other instantaneous quantities.
The instantaneous phase makes strong events clearer and is effective at highlighting discontinuities of reflectors, faults, pinch-outs, angularities and bed interfaces. Seismic sequence boundaries, sedimentary layer patterns and regions of onlap/offlap patterns often exhibit extra clarity.
- Cosine phase: cosine of the instantaneous phase, also called normalized amplitude. It has the same uses as instantaneous phase with one additional benefit: it is continuously smooth. By removing the +/-180 degree discontinuity that occurs with instantaneous phase, the cosine of instantaneous phase can be further processed (e.g. filtered and stacked) using conventional seismic processing tools. Amplitude peaks and troughs retain their position, but strong and weak events now exhibit equal strength
- Envelope weighted phase: instantaneous phase, weighted by the envelope over the given time window
- Rotate phase: phase output is rotated through a user-specified angle
Instantaneous Frequency
Its uses include:
Hilbert (Quadrature Amplitude)
The quadrature trace is the imaginary part of the complex seismic trace (see image
above), and can be computed from the real trace via the Hilbert transform.
Both the real trace and its quadrature counterpart share the same amplitude spec-
trum; the quadrature however is phase rotated by 90 degrees. Zero-crossings on
the real trace transform to peaks and troughs on the quadrature trace and peaks
and troughs on the real trace transform to zero-crossings on the quadrature trace.
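These quantities can be sketched from the analytic (complex) trace (an illustration, not the software's implementation):

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_attributes(trace, dt_s):
    """Envelope, phase (radians), frequency (Hz) and quadrature of a trace,
    computed from the complex (analytic) trace."""
    z = hilbert(trace)                    # real trace + i * quadrature
    envelope = np.abs(z)
    phase = np.angle(z)
    freq = np.gradient(np.unwrap(phase)) / (2.0 * np.pi * dt_s)
    return envelope, phase, freq, z.imag
```

For a pure sine the envelope is flat, the instantaneous frequency recovers the sine's frequency, and the quadrature is the 90-degree rotated trace.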
11.13 Localized Velocity Fan Filter
Name
Velocity Fan Filter -- Attribute that returns energy with apparent velocities/dips
inside a specified Min/Max range.
Description
The velocity fan filter passes energy with apparent velocities (for Time surveys) or
apparent dips (for Depth surveys) inside the specified Min/Max velocity/dip range.
The filter supports three options: low pass, high pass, and interval velocity/dip.
Therefore, this attribute can be used to filter out or enhance certain dip/azimuth
events.
Input Parameters
The Filter size is the size of the 3D kernel. A filter size of 3 means the data is convolved with a 3x3x3 kernel. To reduce edge effects it is recommended to apply a cosine-square taper. A Taper length of N means (100-2N)% of the specified velocity range will be flat. Azimuth filter is a special option that only passes dipping energy inside the specified Azimuth to pass direction.
The different shapes of the filter (low pass, high pass, interval velocity/dip) are shown below:
Note: Please be aware that in Time surveys, flat events have infinite velocity and vertical events have zero velocity. The opposite is observed in Depth surveys, where dips are used instead of velocities: horizontal events have zero dip, while vertical events have a 90-degree dip.
This option allows you to display a two-dimensional Fourier transform over time
and space where F is the frequency (Fourier transform over time) and K refers to
wave-number (Fourier transform over space).
Examples
11.14 Log
Name
Description
This attribute takes the value from an input well log and returns this value throughout the volume at the corresponding Z value.
Input Parameters
Firstly, the 'Input Well' is chosen and, secondly, one of the associated logs from the
auto-filled 'Select Log' list.
Output
Examples
Below is a screenshot of the Gamma Ray log from well F03-4 displayed on inline 441 (F3 Demo dataset), as 'Take Nearest Sample':
11.15 Match Delta
Name
Match Delta -- Attribute that extracts time shifts between similar events in different
seismic volumes
Description
This algorithm extracts the time difference delta t between peaks in different seismic volumes. A search window is set to avoid loop-skips. After extracting all delta t values, they are interpolated. The resulting cube is the delta cube in ms or meters/feet, which can be used for the Delta Resample attribute.
Extraction of delta t from two neighboring traces.
In a little more detail, the algorithm proceeds along these lines:
After using this attribute, it will be clear that there are many 'jumps' and 'skips' in the displayed sections. Because we know that the deltas should not vary that quickly, neither from trace to trace nor from sample to sample, these can be effectively removed by applying a double filter: first a rather large median filter over many traces and 2 to 3 samples, then an average or FFT-based filter in all directions. The resulting output should now be relatively 'noise-free'.
Input Parameters
Besides the reference and the match cube, a maximum time-window must be
defined in order to avoid loop-skips.
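One common way to extract such a delta within a search window is a cross-correlation scan over lags (a sketch under that assumption; the software's peak-matching method may differ):

```python
import numpy as np

def best_lag(reference, match, max_lag):
    """Lag (in samples) within the +/-max_lag search window that best aligns
    `match` to `reference`, by maximizing the cross-correlation."""
    lags = np.arange(-max_lag, max_lag + 1)
    scores = [float(np.dot(reference, np.roll(match, lag))) for lag in lags]
    return int(lags[int(np.argmax(scores))])
```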
11.16 Mathematics
Name
Description
The Mathematics attribute is a way to combine stored data and attributes into new data.
The input data can be volumes, 2D lines or well logs. The output data always has the same dimension (3D, 2D, 1D respectively) as the input, with the exception that 3D and 2D attributes can also be computed along surfaces (horizons and faults). The mathematical expression is calculated at each sample position.
Input Parameters
A mathematical expression is specified in the Formula text field with variables, con-
stants, numerical values and/or recursive operators.
- Constants must start with the letter C and be followed by a number: C1, C5, C9... The numbering does not necessarily start at 0 and does not have to be consecutive. Constants can be evaluated just like stepouts and time gates for the other attributes. They are not available for log creation
- Variable expressions can be any string: seis, energy, sim, ... They can be called with a number between brackets like seis[-1] or seis[4], in which case the value represents a shift in number of samples. seis[-1] represents seis one sample above, seis[4] represents seis four samples below
- Recursive expressions are a way to call back the result of the mathematical formula on a sample above for computation. This result is called with the expression OUT[-i] (case insensitive) where i is a number of samples
There is no limit on the number of variables/constants used in an expression. There are special constants for which it is not necessary to provide a numerical value: DZ is the sampling rate of the input data, and Inl and Crl are respectively the inline and crossline numbers of the current trace. These special constants are case sensitive. Parentheses ( ) are allowed. Once the mathematical expression has been entered you must press Set. A table will then appear where you can assign data to the variables and values to the constants and recursive settings: the start time (or depth) of the recursive function and its value at this time.
Supported operators are: +, -, *, /, ^, >, <, <=, >=, ==, !=, && (and), || (or), |x| (absolute value), and cond ? true_stat : false_stat.
Supported functions are: sin(), cos(), tan(), asin(), acos(), atan(), ln(), log(), exp(), sqrt(), min(), max(), avg(), sum(), med(), var(), rand(v) and randg(std), where avg is the average, med is the median, and var is the variance of the input parameters. The input parameters in parentheses should be separated by commas.
The function rand(v) gives a random number from a uniform distribution between 0
and v. The function randg(std) generates a random number from a Gaussian (nor-
mal distribution) with standard deviation std and expectation 0.
The three parts of the conditional can be any set of variables and/or constants. IF..THEN..ELSE statements can be nested, as here:
Other options
Predefined functions, constants, and operators can be inserted into the 'Formula' field using a combination of the drop-downs, followed by the 'Insert' button. These are grouped as follows:
The grouping 'Other' contains the above.
Examples of expressions
Let's say you have one input cube, 'Cube1', and one attribute already defined, Energy40, which is the Energy per sample calculated in a [-20,20] ms window around the current sample. Then you could define a 'Damped amplitude' as:
Additional examples
1. Centred differentiation example: centred differentiation can be coded using the formula (seis[+1]-seis[-1])/(2*DZ), where DZ is the sampling rate. Please note that for lateral shifts the reference shift attribute must still be used.
2. Recursive filters can be created using the syntax OUT[-i]. The most general form of a recursive equation is the following:
- y[n] = a0*x[n] + a1*x[n-1] + a2*x[n-2] + ... + b1*y[n-1] + b2*y[n-2] + ...
- where x[] is the input volume, y[] is the output volume and the a's and b's are the coefficients
- where OUT[-1] stands for y[n-1] and OUT[-2] stands for y[n-2]
- For each instance of OUT[-i] a starting value and attached time/depth must be provided.
- Two examples of low pass and high pass recursive filters are provided.
Best results are achieved when providing an input of impedance or velocity type.
3. The phase rotation is an attribute available in the Evaluate attribute set and in the dGB Evaluate attribute set.
- This attribute allows the user to apply a phase rotation of any angle to the data.
- c0 is in degrees.
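The recursive low-pass mentioned in example 2 can be sketched in Python (illustrative coefficients):

```python
import numpy as np

def recursive_lowpass(x, a0=0.3, b1=0.7, y_start=0.0):
    """y[n] = a0 * x[n] + b1 * y[n-1]; OUT[-1] plays the role of y[n-1],
    with y_start as the required starting value."""
    y = np.empty(len(x))
    prev = y_start
    for n, xn in enumerate(np.asarray(x, float)):
        prev = a0 * xn + b1 * prev
        y[n] = prev
    return y
```

With a0 + b1 = 1 the filter has unit gain at zero frequency, so a constant input converges to itself.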
11.17 Position
Name
Position -- Attribute that returns any attribute calculated at the location where another attribute has its minimum, maximum or median within a small volume.
Description
The input attribute is the criterion used to determine the position at which the output attribute is calculated. The stepouts and time-gate define the volume in which the input attribute is evaluated. In case of a 2D attribute as input, the inline/crossline stepout is replaced by a single trace stepout.
The Operator determines which position is returned from this analysis: the position of the minimum, maximum, or median of the input attribute. The output attribute is then calculated at this position.
Examples
The position attribute can be used for several purposes. For example, one can
determine where in a small volume the energy is minimal and output the frequency
at the location of this lowest energy. Also, application of the position attribute is an
important step for Fault Enhancing Filtering. In this case, the user takes the Min-
imum Similarity as Input attribute and as Output, for example, filtered data (using
Max as Operator). The stepout is set to, for example, 1 in both the inline and crossline directions, and the time-gate is defined as [0,0]. The result below shows a sharply defined fault:
Left: before applying the position attribute. Right: after applying the position attribute.
11.18 Prestack
Name
Prestack -- The prestack attribute can be used either to extract statistics on the gath-
ers and their amplitudes, or to extract AVO attributes.
This attribute requires prestack data as input, but will output poststack data. The fol-
lowing workflow is used in the above example:
The pre-processing of the gathers is optional. When used, it must be set up in a separate window, either by selecting an existing setup or creating a new one. The offset range is used to define the extraction window. It requires absolute offset values, in meters or feet (depending on the survey unit definition). Optionally, if angles were used instead of offsets while loading, then angle ranges can be used. Once again, they have to be in the same unit as in the input trace (SEG-Y) headers.
Statistics:
The above list of statistics can be returned. The highlighted output option provides the fold of each bin, while average or RMS may be used to generate full or partial stacks (partial stacking occurs if the defined offset range is not full). At the foot of this list are also: Sum, SquareSum and MostFrequent.
Least Square:
Before the crossplot, the amplitudes can be transformed using the following axis transformation (left-hand side):
Similarly, the offset values that are associated with those amplitudes can also be transformed (right-hand side). This is useful, e.g. to have the sin^2 of the angle as the x-axis.
AVO attributes will become AVA attributes if the X axis becomes Azimuth (i.e.
angle). Please note that the azimuths (angles) must be provided in the trace head-
ers when extracting AVA attributes.
The extractable AVO (Least-Square) attributes are Intercept, Gradient, their standard deviations and the Correlation Coefficient. It is also possible to have angle/offset-constrained AVO attributes, by specifying the required angle/offset range instead of using the full range. Please note that further transformations can be achieved by using the output as input to another attribute.
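The least-squares fit itself can be sketched as follows (illustrative; amplitudes against sin^2 of the incidence angle, the common two-term AVO form):

```python
import numpy as np

def avo_intercept_gradient(amplitudes, angles_deg):
    """Fit A(theta) ~ intercept + gradient * sin^2(theta) in a least-squares
    sense and return (intercept, gradient)."""
    x = np.sin(np.radians(np.asarray(angles_deg, float))) ** 2
    gradient, intercept = np.polyfit(x, np.asarray(amplitudes, float), 1)
    return float(intercept), float(gradient)
```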
11.19 Pseudo Relief
The pseudo relief attribute is applied to seismic data in order to create a more consistent image for easier interpretation of faults and horizons. It is particularly useful when applied in 2D.
11.20 Reference
Name
Description
The X, Y, Z, Inline number, Crossline number, Sample number, Inline index, Crossline index and Z index positions of the reference (extraction) point are returned.
Output
- Z index: outputs the sample number from the top of the survey, starting at 1. (Note: this output is identical to the sample number if the survey has a negative starting time or starts at 0.)
The Reference attribute replaces the old Reference time attribute. Attribute sets
containing the old Reference time attribute will be automatically updated to the
Reference attribute, with Z as output, which gives an identical output.
11.21 Reference Shift
Name
Description
The Input attribute is extracted at the shifted position. The original reference (extraction) point has inline/crossline coordinates (0,0) and Time 0. A relative number of 1 means the next inline or crossline, respectively. In case of a 2D attribute as input, the inline/crossline shift is replaced by a single trace shift. The vertical shift is specified in milliseconds using the Time option. There is also the option to use Steering while calculating the Reference shift.
The attribute will take the value from the shifted position and display it at the original reference point. Say that an original position (inl:0, crl:0, Time:0) has a shift of (25,25,100) applied to it; then the output value at (25,25,100) is displayed at (0,0,0). (In case of a 2D attribute as input, the inline/crossline shift is replaced by a single trace/time shift, i.e. 25,100.)
It is important to remember that the vertical element of the shift is 'positive upwards'
and 'negative downwards'.
l Think of a time-shift of +100:
l Original TWT = 500
l The value from TWT = 600 will be displayed at TWT = 500, giving the impression of an upward shift.
l Think of a time-shift of -100:
l Original TWT = 500
l The value from TWT = 400 will be displayed at TWT = 500, giving the impression of a downward shift.
l Think of a horizontal shift of (Inl:5, Crl:8):
l Original position = 100,150
l The value from position 105,158 will be displayed at 100,150.
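The lookup behavior described above can be sketched as follows. This is an illustrative numpy sketch of the shift semantics, not OpendTect's actual implementation; the axis order and the NaN handling for out-of-volume lookups are assumptions.

```python
import numpy as np

def reference_shift(cube, shift):
    """Display the value found at the shifted position at the original
    position. `cube` has axes (inline, crossline, z); `shift` is
    (d_inl, d_crl, d_z) in relative line numbers and samples. A positive
    vertical shift reads from deeper samples (later z), so the result
    appears shifted upwards. Lookups outside the volume return NaN."""
    out = np.full(cube.shape, np.nan)
    dst, src = [], []
    for n, d in zip(cube.shape, shift):
        lo, hi = max(0, -d), min(n, n - d)  # destinations with a valid source
        dst.append(slice(lo, hi))
        src.append(slice(lo + d, hi + d))
    out[tuple(dst)] = cube[tuple(src)]
    return out
```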
11.22 Sample Value
Name
Sample Value -- Attribute that returns the input value at the sample location
Description
'Sample value' gives the value of the input volume at the sample location.
11.23 Scaling
Name
Description
Input Parameters
Output
The output amplitudes are always the ratio of the input amplitudes over a weight-
ing function w(i), i being the sample index.
1. Z^n scaling
The weight function is defined by Z^n where Z is the time/depth of the current
sample and n is a user-defined exponent:
The exponent is a floating-point number and can thus be negative, positive, or zero (the unity operator). An exponent larger than zero applies a correction proportional to depth, while an exponent smaller than zero applies a correction inversely proportional to depth. The output amplitude is the input amplitude weighted by Z^n.
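As a sketch (assuming a simple multiplicative weight; the exact normalization used by OpendTect is not spelled out here):

```python
import numpy as np

def zn_scaling(trace, z_values, n):
    """Weight each sample by Z^n: n > 0 boosts amplitudes proportionally
    with time/depth, n < 0 applies the inverse correction, and n = 0
    leaves the trace unchanged (the unity operator)."""
    return np.asarray(trace, dtype=float) * np.asarray(z_values, dtype=float) ** n
```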
2. Window(s) scaling
The weight function is a step function: w(i) is constant over a static time/depth window, equal to the "basis" value that is computed from the input amplitudes using the following mathematical definitions:
Please note that the window times/depths are floats and do not need to fall exactly on a sample; they are rounded to the nearest sample when defining the extraction window. Unlike most window definitions in the attribute engine, this scaling attribute requires absolute time/depth values, not values relative to the actual sample. A weight of 1 (no scaling) is given to samples not covered by a user-defined time gate. The weights are cumulative: if several windows overlap, the output weight is the sum of the "basis" outputs for the samples belonging to multiple windows.
6. Automatic Gain Control scaling
The AGC is a special case of window scaling. Here the window is defined relative to the actual sample, and the "basis" is the energy value in that sliding window. The window width is a total size, i.e. the relative width corresponds to +/- half of the total window width. The low energy mute will mute output samples whose energy is lower than a ratio of the trace energy distribution: the energy of the input trace is computed and the output values are sorted by increasing energy value.
Given 1000 samples, the energy of sample 250 (for a low energy mute at 25%) corresponds to the mute level: if the energy computed in the AGC window is lower than this level, the value 0 is output. Otherwise the sum of the squares over the number of (valid) samples is output. Undefined values are not used in the computation, and zero is output if all values in a time window are undefined. In other words, the energy of all elements within the defined window is calculated and ranked, and the (user-defined) percentage of the lowest energy levels is muted out.
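A conventional AGC with a low-energy mute can be sketched as follows. This is illustrative only; the exact OpendTect weighting and edge handling may differ.

```python
import numpy as np

def agc(trace, width, mute_fraction=0.25):
    """Sliding-window AGC: the 'basis' is the mean energy in a window of
    total length `width` (i.e. +/- half the width around each sample).
    Samples whose window energy falls at or below the mute level, taken
    from the sorted energy distribution, are set to 0."""
    x = np.asarray(trace, dtype=float)
    half, n = width // 2, len(x)
    energy = np.empty(n)
    for i in range(n):
        win = x[max(0, i - half):min(n, i + half + 1)]
        energy[i] = np.mean(win ** 2)
    level = np.sort(energy)[int(mute_fraction * (n - 1))]  # low energy mute level
    out = np.zeros(n)
    keep = energy > level
    out[keep] = x[keep] / np.sqrt(energy[keep])
    return out
```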
7. Squeeze
The purpose is to put a limit to the value range of the input. Rather than clipping
the value (which would be equivalent to a simple Mathematics formula like 'x0 > c0
? c0 : x0'), the value can be squeezed into a range.
The first parameter is the 'Value range'. It defines the hard limits of the value range. One of these limits may be left empty, signifying 'unlimited'.
The second parameter is the 'Untouched range'. If no limits are entered there,
Squeeze will degrade to a simple clipping operation. If specified, it will squeeze
rather than clip, constraining the squeezing to the ranges outside this range.
For example, take a value range of [0,10] and an untouched range of [2,8]. Values outside the [2,8] range will be modified to fit within [0,10]. This means the values in the range [-infinity,2] will be squeezed into the range [0,2] via a hyperbolic function. That function is continuous in value (and first derivative) at 2. Similarly, values higher than 8 will be mapped to somewhere between 8 and 10.
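A sketch of such a squeeze mapping follows. The specific hyperbolic function is an assumption, chosen only to satisfy the stated conditions: continuity of value and first derivative at the untouched-range edges, approaching the hard limits asymptotically.

```python
def squeeze(x, lo=0.0, hi=10.0, u_lo=2.0, u_hi=8.0):
    """Map x into [lo, hi]: values inside the untouched range
    [u_lo, u_hi] pass through unchanged; values outside are squeezed
    toward the hard limits along a hyperbola that matches the identity
    in value and slope at u_lo / u_hi."""
    if u_lo <= x <= u_hi:
        return x
    if x > u_hi:
        span = hi - u_hi
        return hi - span * span / (x - u_hi + span)
    span = u_lo - lo
    return lo + span * span / (u_lo - x + span)
```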
8. Gain Correction
This attribute is used to correct for any undesirable gain applied previously, or to apply a new gain function to the seismic data. It is applied by first selecting the input data for gain correction and clicking the Analyse button.
The Select Data window that pops up requires specifying a number of random traces in the 'Nr of Traces for Examination' field for visually analyzing and defining the gain behavior in time/depth. The volume from which these random traces are selected is outlined by modifying the inline, crossline and time ranges. Finally, OK is pressed to begin the examination of the random traces.
The Analyse Gain window has the 'Z' range (in seconds) of the seismic data as the horizontal axis, while the left vertical axis shows the 'Scale Factor' and the right vertical axis the 'RMS Amplitude'. The amplitude scale can be set to 'Linear' or 'dB' (i.e. decibel) for visualization purposes. Further, the 'Scale Range' can be changed to use a different scale, and the 'Gridline step' can be changed to modify the gridline steps.
Finally, a 'Gain correction trend' can be defined by moving the red curve such that for any particular 'Z' interval, a specific 'Scale Factor' range is used to scale the seismic amplitudes in that interval. To define the boundary points of these intervals, the user can double-click on the red curve and move the curve as desired.
11.24 Semblance
Name
Description
Whereas the Similarity attribute works by comparing pairs in given positions, Semblance uses all points within the window to create the score.
Input Parameters
The size of the correlation window is defined by the Time Gate and Stepout, with
the 'Extension' defining its shape.
Steering may be used to enhance the results if the DipSteering plugin is installed (with a valid license).
Time-Gate:
Defines the size of the vertical element in the correlation window (i.e. the trace length to be used in the calculation).
Stepout:
Applies for 'Extensions' of shape Full block, Cross, All Directions and Diagonal.
Defines the extent of the horizontal element in the correlation window (see 'Exten-
sion' below for more detail).
Trace Position:
Applies for 'Extensions' of shape None, Mirror 90 degrees and Mirror 180 degrees.
Defines the positions of the traces to be used in the correlation window (see 'Exten-
sion' below for more detail).
Extension shapes:
l None: Semblance is calculated using only the traces defined in 'Trace positions'.
l Mirror 90 degrees: Semblance is calculated using two additional traces: the defined
trace pair (as in 'None') and the trace obtained by 90° rotation. (Not available for 2D
data)
l Mirror 180 degrees: Semblance is calculated using two additional traces: the defined trace pair (as in 'None') and the trace obtained by 180° rotation.
l Full Block: Semblance is calculated using all possible traces in the column defined
by the time-gate/step-out. Beware of potentially long processing times with large time-
gates/stepouts.
l Cross: Semblance is calculated using all possible traces in the '+' -shaped column
bounded by the time-gate/step-out.
l All Directions: Semblance is calculated using all possible traces in the '✴' -shape bounded by the time-gate/step-out. This is the extension found to be most useful: it gives a degree of accuracy almost equal to that of 'Full Block' but with significantly less processing time (depending on the step-out, up to a factor of 10).
l Diagonal: Semblance is calculated using all possible traces in the 'x' -shaped column
bounded by the time-gate/step-out.
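For reference, the classic semblance coefficient over the traces collected in the correlation window can be sketched as below. This is the standard Neidell-Taner form, shown for illustration; OpendTect's exact windowing and implementation details are configured via the parameters above.

```python
import numpy as np

def semblance(traces):
    """Semblance of an (n_traces, n_samples) window: the energy of the
    stacked trace divided by the total energy times the fold. Equals 1
    for identical traces and approaches 0 for incoherent ones."""
    x = np.asarray(traces, dtype=float)
    num = np.sum(np.sum(x, axis=0) ** 2)
    den = x.shape[0] * np.sum(x ** 2)
    return num / den if den > 0 else 0.0
```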
References
11.25 Similarity
Name
Description
Similarity is a form of "coherency" that expresses how much two or more trace segments look alike. A similarity of 1 means the trace segments are completely identical in waveform and amplitude; a similarity of 0 means they are completely dissimilar.
In OpendTect, we favor a different approach. We first try to find the direction of best
match at every position, which is a result by itself: the dip. By using this dip we can
then calculate the best Similarity between adjacent traces. Similarity is based on
fundamental mathematics: the samples of the trace are seen as components of a
vector, and the Similarity is defined in terms of distance in hyperspace.
The advantage of using Similarity is that it is mathematically simple; it is very clear what is going on. By combining different kinds of similarities and other attributes, you can often get much better results with far less computing time.
The trace segments are defined by the time-gate in ms and the positions specified in relative coordinates. When using input from 2D data, the trace positions are defined by a trace step-out only, not by inline and crossline stepouts. The Extension
parameter determines how many trace pairs are used in the computation. This is
visualized in the image below.
Definition of trace positions relative to the reference point at (0,0).
Input Parameters
Extension Definitions
l None: Only the similarity between the pair of traces defined in 'Trace positions' is cal-
culated.
l Mirror 90 degrees: Two similarities are computed: one for the defined trace pair (as in
'None') and one for the pair obtained by 90 degree rotation. (Not available for 2D
data)
l Mirror 180 degrees: Two similarities are computed: one for the defined trace pair (as
in 'None') and one for the pair obtained by 180 degree rotation.
l Full Block: Similarities between all possible trace pairs in the rectangle defined by the
step-out are computed.
l Cross: Similarities between all possible trace pairs in the '+' -shape defined by the
step-out are computed.
l Diagonal: Similarities between all possible trace pairs in the 'x' -shape defined by the
step-out are computed.
l All Directions: Similarities between all possible trace pairs in the '✴' -shape defined by the step-out are computed. This is the extension found to be most useful: it gives a degree of accuracy almost equal to that of 'Full Block' but with significantly less processing time (depending on the step-out, up to a factor of 10).
The attribute returns the statistical property specified in Output statistic. The Steering option enables the user to follow the local dip to find the trace segments that should be compared, instead of comparing two horizontally extracted trace segments. The Steering option supports five different modes of data-driven steering: None, Central, Full, Constant direction, and Browse dip.
Similarity with "None" steering: this option is used when no SteeringCube is employed. This is acceptable when the layering is mainly horizontal (with little dip). In very complex geology, however, the similarity result with "None" steering deteriorates, and Full steering should be used instead. The Dip-Steering plugin is then required.
Another steering option is "Browse dip". This is a similarity feature acting as a 'Coherency' attribute. It enables the calculation of 'Similarity' by comparing one trace with the next trace.
A value between 0 (not similar at all) and 1 (completely similar) is then awarded. In order to compare traces, two variables must be specified:
The 'Maximum dip' represents the maximum dip in microseconds per metre (μs/m), relative to an event in one trace, within which the algorithm will look for similar events along the neighbouring trace. The default is 250.
The 'Delta Dip' is a variable representing the window in microseconds per metre (μs/m) that is shifted along the neighbouring trace to detect similar events within the earlier specified 'Maximum Dip'. The closer the value is to 1, the more precise the results will be. The default value is 10, which gives a good balance between calculation time and quality of the results; this also depends on the quality of the data itself.
Mathematical description
Consider two trace segments as vectors X = (X_1, ..., X_15) and Y = (Y_1, ..., Y_15). The similarity is 1 minus the Euclidean distance between the vectors, divided by the sum of the lengths of the vectors. Please note that the length of a vector is its L2 norm, also called RMS value:
similarity(X,Y) = 1 - ||X - Y|| / (||X|| + ||Y||)
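The definition above can be written as a short sketch:

```python
import numpy as np

def similarity(x, y):
    """Similarity of two trace segments: 1 minus the Euclidean distance
    between the segment vectors divided by the sum of their L2 norms.
    Treating two all-zero segments as identical is an assumption."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    denom = np.linalg.norm(x) + np.linalg.norm(y)
    if denom == 0.0:
        return 1.0
    return 1.0 - np.linalg.norm(x - y) / denom
```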
Examples
11.26 Spectral Decomposition
Name
Description
Spectral Decomposition unravels the seismic signal into its constituent frequencies, which allows the user to see phase and amplitude tuned to specific wavelengths. The amplitude component excels at quantifying thickness variability and detecting lateral discontinuities, while the phase component detects lateral discontinuities.
It is a useful tool for "below resolution" seismic interpretation, sand thickness estimation, and enhancing channel structures.
Input Parameters
l FFT, the Fast Fourier Transform. The FFT requires a short window (time-gate) and a step-size between the analyzed frequencies. This step can be interpreted as the frequency resolution.
l CWT, the Continuous Wavelet Transform. The CWT requires a wavelet type.
When choosing the CWT, you can set the wavelet type:
When choosing the CWT, you can set the wavelet type:
l Morlet
l Gaussian
l Mexican Hat
In FFT only, the signal within the time-window will be transformed into frequency
domain. The given step determines the output resolution, if necessary zeros will be
added to acquire this resolution. The amplitude spectrum is calculated for the
requested frequency. The time-window slides from top to bottom to cover the com-
plete signal. In an ideal situation, the time-window encompasses one seismic
event, which may be a superposition of multiple geological events which interfere
in the seismic trace.
The CWT is defined as the sum over the signal multiplied by a scaled and shifted
wavelet. The wavelet is shifted along the signal and at each position the cor-
relation of the wavelet with the signal is calculated. The result is called a wavelet
coefficient. The given frequency corresponds to the central wavelet frequency. The
step determines the output resolution, which is especially interesting when eval-
uating this attribute.
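The FFT variant can be sketched as below. This is illustrative: the boxcar window, zero padding to reach the requested frequency step, and nearest-bin evaluation are assumptions, not necessarily OpendTect's exact choices.

```python
import numpy as np

def fft_spectral_decomposition(trace, dt, window_len, freq, df):
    """At each sample, take a short window, zero-pad so the FFT bin
    spacing matches the requested frequency step `df`, and return the
    amplitude spectrum at the requested frequency. dt in seconds,
    freq and df in Hz."""
    x = np.asarray(trace, dtype=float)
    n = len(x)
    half = window_len // 2
    nfft = max(window_len, int(round(1.0 / (df * dt))))  # pad for df resolution
    freqs = np.fft.rfftfreq(nfft, d=dt)
    bin_idx = int(np.argmin(np.abs(freqs - freq)))       # nearest output bin
    out = np.zeros(n)
    for i in range(n):
        seg = x[max(0, i - half):min(n, i + half + 1)]
        out[i] = np.abs(np.fft.rfft(seg, n=nfft))[bin_idx]
    return out
```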
Spectral Decomposition (CWT) applied to a horizon. Notice that each frequency describes specific parts of the channels (NNE-SSW oriented). Thinner and thicker parts of the horizon along the channels are also clearly highlighted.
Time-frequency spectrum
The output frequency is best determined using the time-frequency spectrum panel.
This panel displays the spectral decomposition output for all frequencies between
0 and the Nyquist frequency of the data, computed with a step of 1Hz. One must
first select a position for this single trace analysis:
The time-frequency is then displayed in a 2D panel like this:
Color blended display:
A color blended map view (image on right) of the spectral decomposition (red: 10 Hz, green: 20 Hz, blue: 40 Hz). Compare the results with the coherency map (image on left). Note that the yellowish-colored fault-bounded region is thicker compared to the surrounding regions. The fault throws (red) are also clearly observable. Coherency/similarity together with color blended spectral images can reveal better geological information.
11.27 Texture
Name
Description
GLCM texture attributes come from image processing and were developed to capture the roughness/smoothness of an image. The attribute response is calculated in two steps: first the GLCM is computed for an area (volume) around the evaluation point; second, a statistical property of the GLCM is returned. The GLCM is a 2D matrix that captures how often the neighboring values A and B occur in an image. Look at the GLCM as a matrix of N x N dimensions that captures the amplitude of the reference position in the columns and the amplitude of the neighboring position in the rows, where N is the range of all values the data can have. Say we have a data set in which amplitudes can have the values 0, 1, 2, or 3 (the GLCM matrix is 4 x 4). We then fill the matrix by comparing each amplitude in the input area (volume) with its direct neighbor and increasing the occurrence count of the corresponding matrix cell. The matrix is made symmetrical by comparing neighbors in both directions (reference vs neighbor and neighbor vs reference), and it is normalized by dividing by the total number of occurrences. The normalized GLCM matrix is a kind of probability matrix that tells us how probable it is to find pairs of neighboring amplitudes in the area (volume) around the evaluation point.
In OpendTect the GLCM is computed on re-scaled data. The input data is re-
scaled linearly to 4-bits (values ranging from 0 to 15; GLCM 16 x 16), or to 5-bits
(values from 0 to 31; GLCM matrix 32 x 32). To re-scale the data the user must give
the clipping range of the input data. Neighbors are compared in the inline and
cross-line directions. The matrix is further filled by looping over the user-defined
time-gate. Note that when dip-steering is used the input extraction area (volume)
follows the local stratigraphy, which leads to better responses in dipping strata.
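The construction described above can be sketched as follows. For brevity this sketch compares neighbors along the last axis only, whereas the text describes comparisons in both the inline and crossline directions; the rounding during rescaling is also an assumption.

```python
import numpy as np

def glcm(data, lo, hi, levels=16):
    """Symmetrical, normalized GLCM: data is clipped to [lo, hi],
    linearly rescaled to `levels` grey values, and neighbor pairs are
    counted in both directions, then normalized to probabilities."""
    g = np.clip(np.asarray(data, dtype=float), lo, hi)
    g = ((g - lo) / (hi - lo) * (levels - 1)).round().astype(int)
    m = np.zeros((levels, levels))
    a, b = g[..., :-1].ravel(), g[..., 1:].ravel()
    for i, j in zip(a, b):
        m[i, j] += 1     # reference vs neighbor
        m[j, i] += 1     # neighbor vs reference -> symmetrical matrix
    return m / m.sum()   # normalize to a probability matrix
```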
This can be done in a supervised approach (MLP network), or in an unsupervised
approach (UVQ network).
Input Parameters
GLCM-size is the size of the GLCM matrix. 32 x 32 may give somewhat sharper
outputs at the expense of more CPU time.
Input Data Minimum and Maximum define the clipping range of the data, which is needed to rescale the data to 4 bits (16 x 16) or 5 bits (32 x 32). The input data range can also be calculated automatically via the compute option: in the "analysis" window, a number of traces must be selected, from which the scaling range is computed.
Output
1. Contrast group: Measures related to contrast use weights related to the distance from
the GLCM diagonal along which neighboring values are equal. Attributes in this
group: Contrast, Dissimilarity, Homogeneity
2. Measures related to orderliness. Attributes in this group: Angular Second Moment
(ASM), Energy, Entropy
3. Group using descriptive statistics of the GLCM texture measures. Attributes in this
group: GLCM Mean, GLCM Variance, GLCM Standard Deviation, GLCM Correlation
In all equations given below N denotes the size of the GLCM matrix; i refers to the
column and j to the row. P is the GLCM Probability matrix.
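For reference, the standard Haralick-style definitions of these measures are given below (the original equations are not reproduced in this text, so these are assumed forms), with N the GLCM size, i the column, j the row, and P the probability matrix:

```latex
\begin{align*}
\text{Contrast} &= \sum_{i,j=0}^{N-1} P_{i,j}\,(i-j)^2 &
\text{Dissimilarity} &= \sum_{i,j=0}^{N-1} P_{i,j}\,\lvert i-j\rvert \\
\text{Homogeneity} &= \sum_{i,j=0}^{N-1} \frac{P_{i,j}}{1+(i-j)^2} &
\text{ASM} &= \sum_{i,j=0}^{N-1} P_{i,j}^2 \\
\text{Energy} &= \sqrt{\text{ASM}} &
\text{Entropy} &= \sum_{i,j=0}^{N-1} P_{i,j}\,\bigl(-\ln P_{i,j}\bigr) \\
\mu &= \sum_{i,j=0}^{N-1} i\,P_{i,j} &
\sigma^2 &= \sum_{i,j=0}^{N-1} P_{i,j}\,(i-\mu)^2 \\
\text{Correlation} &= \sum_{i,j=0}^{N-1} P_{i,j}\,\frac{(i-\mu)(j-\mu)}{\sigma^2}
\end{align*}
```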
Contrast
When i and j are equal, the cell is on the diagonal and (i-j)=0. These values rep-
resent amplitudes entirely similar to their neighbor, so they are given a weight of 0.
Dissimilarity
In Dissimilarity, the weights with which the GLCM probabilities are multiplied increase linearly away from the diagonal (along which neighboring values are equal).
Homogeneity
Dissimilarity and Contrast result in larger numbers for more contrasting windows. If
weights decrease away from the diagonal, the result will be larger for input areas
(volumes) with little contrast. Homogeneity weights values by the inverse of the
Contrast weight, with weights decreasing exponentially away from the diagonal.
ASM and Energy use the GLCM probability as a weight for itself. The name for
ASM comes from Physics, and reflects the similar form of Physics equations used
to calculate the angular second moment, a measure of rotational acceleration.
High values of ASM or Energy occur when the input area (volume) is very orderly.
Energy
See above.
Entropy
The equation used to calculate physical entropy is very similar to the one used for the texture measure.
GLCM Mean
The left-hand equation calculates the mean based on the reference pixels, i. The right-hand equation calculates the mean over the neighbor pixels, j. These two values are identical because OpendTect computes a symmetrical GLCM, in which each amplitude is counted once as a reference and once as a neighbor.
GLCM Variance
Variance is a measure of the dispersion of the values around the mean. It is similar
to entropy. It answers the question "What is the dispersion of the difference
between the reference and the neighbour pixels in this input area (volume)?"
GLCM Variance performs the same task as the common descriptive statistic called variance: it relies on the mean, and the dispersion around the mean, of cell values within the GLCM. However, because it uses the GLCM, it deals specifically with the dispersion around the mean of combinations of reference and neighbor amplitudes, so it is not the same as the variance of the input amplitudes that can be computed with the "Volume Statistics" attribute.
Variance calculated using i or j gives the same result, since the GLCM is sym-
metrical.
There is no particular advantage to using Standard Deviation over Variance, other
than a different range of values.
GLCM Correlation
GLCM Correlation is quite a different calculation from the other texture measures
described above. As a result, it is independent of them (gives different information)
and can often be used profitably in combination with another texture measure. The calculated values also have a more intuitive meaning: 0 is uncorrelated, 1 is perfectly correlated.
GLCM Correlation can be calculated for successively larger window sizes. The
window size at which the GLCM Correlation value declines suddenly may be
taken as one definition of the size of definable objects within an image.
If the input is completely uniform the GLCM variance is 0 and the correlation func-
tion is undefined. OpendTect will in that case return the value 1.
Examples
Example displays: Seismic; Contrast: 3 x 3 x [-8,8], DS; Dissimilarity: 3 x 3 x [-8,8], DS; Entropy: 3 x 3 x [-8,8]; GLCM Mean: 3 x 3 x [-8,8], DS; GLCM Variance: 3 x 3 x [-8,8], DS.
References
l Chopra, S. and Alexeev, V., 2005. Application of texture attribute analysis to 3D seis-
mic data. CSEG Recorder, Sep. 2005 pp 29-32.
l Hall-Beyer, M. GLCM Texture Tutorial. Available online [Accessed 9 Oct. 2012].
11.28 Texture - Directional
Name
Description
Texture-Directional uses the grey level co-occurrence matrix (GLCM) and its derived attributes, which are tools for image classification initially described by Haralick et al. (1973). In principle, the GLCM is a measure of how often different combinations of pixel brightness values occur in an image. It is a method widely used in image classification of satellite images (e.g. Franklin et al., 2001; Tsai et
used in image classification of satellite images (e.g. Franklin et al., 2001; Tsai et
al., 2007), sea-ice images (e.g. Soh and Tsatsoulis, 1999; Maillard et al., 2005),
magnetic resonance and computed tomography images (e.g. Kovalev et al., 2001;
Zizzari et al., 2011), and many others. Most of these GLCM applications are for
classification of 2D images solely. The application of GLCM for seismic data has
been a minor topic in comparison to common seismic attributes such as coher-
ence, curvature or spectral decomposition. Today, a high percentage of the avail-
able seismic data is 3D seismic. Therefore, it is important for the classification of
seismic data to adapt the GLCM calculation to work in the three- dimensional
space. Few authors have described the application of GLCM for 3D seismic data
with various approaches to this topic (Vinther et al., 1996; Gao, 1999, 2003, 2007,
2008a, 2008b, 2009, 2011; West et al., 2002; Chopra and Alexeev, 2005, 2006a,
2006b; Yenugu et al., 2010; de Matos et al., 2011; Eichkitz et al. 2012, 2013,
2014).
Figure 1: Example for the calculation of grey level co-occurrence matrix-based
attributes using eight grey levels for a randomly generated 2D grey-scale image
(a). The grey-scales of the image can be represented by discrete values (b). The
number of co-occurrences of pixel pairs for a given search window are counted
and a grey level co-occurrence matrix (c) is produced. Based on this co-occurrence
matrix, several attributes can be calculated. In this example, the grey level co-occur-
rence matrices are determined for the horizontal (d), the vertical (e), the 45° diag-
onal (f), the 135° diagonal (g), and for all directions at once (h). The first step in
calculation is the determination of co-occurrences (column 2). Zero entries are
marked in light grey and the highest value of each matrix is marked in dark grey. It
is evident that calculations in single directions lead to sparse matrices. The GLCM
is normalized by the sum of the elements to get a kind of probability matrix (column
3). Finally, the probabilities are used for the calculation of GLCM-based attributes.
In column 4 the results for Energy, Contrast, Homogeneity, Entropy, and Cluster Tendency are shown.
In the case of 3D data the number of possible directions increases to 13. In Figure 2 a simple Rubik's cube is used to explain the 13 possible directions for a 3D dataset. This Rubik's cube is built up of 27 small cubes. The small cube in the center (the turning point of a Rubik's cube) is the point of interest for which the calculations are performed. This center point is surrounded by 26 neighboring cubes. If we now take the center point and draw lines from it to all neighboring cubes, we get 13 directions along which the neighboring samples are placed.
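The counting argument above can be verified with a small sketch:

```python
from itertools import product

def principal_directions_3d():
    """Of the 26 offsets from a sample to its neighbors, opposite
    offsets define the same direction, leaving 13 unique directions."""
    dirs = []
    for d in product((-1, 0, 1), repeat=3):
        if d == (0, 0, 0):
            continue
        if tuple(-c for c in d) in dirs:
            continue  # the opposite offset is already counted
        dirs.append(d)
    return dirs
```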
Figure 2: The number of principal neighbors for one sample point can be best
explained by looking at a Rubik’s cube (a). The center of the Rubik’s cube (core
mechanism for rotating the cube, red box in (b)) has in total 26 neighboring boxes
(including diagonal neighbors). These boxes are aligned in 13 possible directions.
Analogous to this, a sample point within a seismic sub-volume has 26 neighbors
aligned in 13 directions (c). In the developed workflow it is possible to calculate the
GLCM along single directions, along combinations of directions (e.g. inline dir-
ection, crossline direction, …), or all directions can be calculated at once (after
Eichkitz et al., 2013).
Input Parameters
Input Data
The input for the GLCM-based attribute calculation can be any seismic amplitude
3D cube/2D section. In the process of GLCM calculation this amplitude cube is con-
verted to a grey level cube.
For the transformation of the amplitude cube to a grey level cube the range of the
amplitude values is needed. This can either be inserted manually, or be computed.
In the case of computed amplitude range, the amplitude range will be symmetrical
around zero.
The number of grey levels controls the transformation of the amplitude cube to the grey level cube. Higher numbers generally improve the quality of the GLCM output. Common numbers of grey levels are 16 to 256.
Number of Traces
The number of traces defines the horizontal analysis window. This horizontal ana-
lysis window is always symmetrical around the center trace. Number of traces
equal 1 means 1 trace left and right of the center trace (thus 3 traces).
The vertical search window defines the number of samples included in the search window. The vertical size of the analysis window should correspond to the average wavelength (Gao, 2007). This is typically in the range of 15 samples (+/- 7 samples).
GLCM Attribute
Direction of Calculation
Figure 3: Definition of directions.
Steering
GLCM attribute calculation can be done with or without steering. The integration of
dip steering generally improves the signal-to-noise ratio in calculated attributes.
Figure 4: Texture attribute window within OpendTect.
Mathematical description
In this equation G(x,y) is the center sample point and G(x+dx, y+dy) are the neighboring sample points. Usually the distance between center and neighboring samples is one, but in general greater distances can also be used in the calculation. It is in principle also possible to combine the four principal directions to form an average GLCM; with this approach the spatial variations can be eliminated to a certain degree (Gao 2007). In the case of 3D data the number of possible directions increases to 13. The 3D case implies a modification of the above equation:
The second attribute group is the orderliness group, which includes attributes such
as energy and entropy. Attributes in the orderliness group measure how regular
grey level values are distributed within a given search window. In contrast to the
first group all attributes from this group are solely a function of the GLCM prob-
ability entries.
The third attribute group is the statistics group, which includes attributes such as
Haralick et al.’s (1973) measure of mean and variance. These are common mean
and variance calculations applied onto the GLCM probabilities.
Examples
References
l Chopra, S., and V. Alexeev, 2005, Application of texture attribute analysis to 3D seis-
mic data: 75th SEG meeting, Houston, Texas, USA, Expanded Abstracts, 767-770.
l Chopra, S., and V. Alexeev, 2006a, Application of texture attribute analysis to 3D seis-
mic data: The Leading Edge, 25, no. 8, 934-940.
l Chopra S., and V. Alexeev, 2006b, Texture attribute application to 3D seismic data:
6th International Conference & Exposition on Petroleum Geophysics, Kolkata, India,
Expanded Abstracts, 874-879.
l de Matos, M.C., Yenugu, M., Angelo, S.M., and K.J. Marfurt, 2011, Integrated seismic
texture segmentation and cluster analysis applied to channel delineation and chert
reservoir characterization: Geophysics, 76, no. 5, P11-P21.
l Eichkitz, C.G., Amtmann, J. and M.G. Schreilechner, 2013, Calculation of grey level
co-occurrence matrix-based seismic attributes in three dimensions: Computers and
Geosciences, 60, 176-183.
l Eichkitz, C.G., de Groot, P. and Brouwer, F., 2014. Visualizing anisotropy in seismic
facies using stratigraphically constrained, multi-directional texture attribute analysis.
AAPG Hedberg Research Conference “Interpretation Visualization in the Petroleum
Industry”, Houston, USA.
l Franklin, S.E., Maudie, A.J., and M.B. Lavigne, 2001, Using spatial co-occurrence tex-
ture to increase forest structure and species composition classification accuracy: Pho-
togrammetric Engineering & Remote Sensing, 67, no. 7, 849-855.
l Gao, D., 1999, 3-D VCM seismic textures: A new technology to quantify seismic inter-
pretation: 69th SEG meeting, Houston, Texas, USA, Expanded Abstracts, 1037-1039.
l Gao, D., 2003, Volume texture extraction for 3D seismic visualization and inter-
pretation: Geophysics, 68, no. 4, 1294-1302.
l Gao, D., 2007, Application of three-dimensional seismic texture analysis with special
reference to deep-marine facies discrimination and interpretation: Offshore Angola,
West Africa: AAPG Bulletin, 91, no. 12, 1665-1683.
l Gao, D., 2008a, Adaptive seismic texture model regression for subsurface char-
acterization: Oil & Gas Review, 6, no. 11, 83-86.
l Gao, D., 2008b, Application of seismic texture model regression to seismic facies
characterization and interpretation: The Leading Edge, 27, no. 3, 394-397.
l Gao, D., 2009, 3D seismic volume visualization and interpretation: An integrated work-
flow with case studies: Geophysics, 74, no. 1, W1-W24.
l Gao, D., 2011, Latest developments in seismic texture analysis for subsurface struc-
ture, facies, and reservoir characterization: A review: Geophysics, 76, no. 2, W1-W13.
l Haralick, R.M., Shanmugam, K., and I. Dinstein, 1973, Textural features for image classification: IEEE Transactions on Systems, Man, and Cybernetics, 3, no. 6, 610-621.
l Kovalev, V.A., Kruggel, F., Gertz, H.-J., and D.Y. von Cramon, 2001, Three-dimen-
sional texture analysis of MRI brain datasets: IEEE Transactions on medical imaging,
20, no. 5, 424-433.
l Maillard, P., Clausi, D.A., and H. Deng, 2005, Operational map-guided classification
of SAR sea ice imagery: IEEE Transaction on Geoscience and Remote Sensing, 43,
no. 12, 2940-2951.
l Soh, L.-K., and C. Tsatsoulis, 1999, Texture analysis of SAR sea ice imagery using
gray level co-occurrence matrices: IEEE Transactions on Geoscience and Remote
Sensing, 37, no. 2, 780-795.
l Tsai, F., Chang, C.-T., Rau, J.-Y., Lin, T.-H., and G.-R. Liu,, 2007, 3D computation of
gray level co-occurrence in hyperspectral image cubes: LNCS 4679, 429-440.
l Vinther, R., Mosegaar, K., Kierkegaard, K., Abatzi, I., Andersen, C., Vejbaek, O.V., If,
F., and P.H. Nielsen, 1996, Seismic texture classification: A computer-aided
approach to stratigraphic analysis: 65th SEG meeting, Houston, Texas, USA, 153-
155.
l Wang, H., Guo, X.-H., Jia, Z.-W., Li, H.-K., Liang, Z.-G., Li, K.-C., and Q. He, 2010,
Multilevel binomial logistic prediction model for malignant pulmonary nodules based
on texture features of CT image: European Journal of Radiology, 74, 124-129.
l West, B.P., May, S.R., Eastwood, J.E., and C. Rossen, 2002, Interactive seismic
facies classification using textural attributes and neural networks: The Leading Edge,
21, no. 10, 1042-1049.
l Yenugu, M., Marfurt, K.J., and S. Matson, 2010, Seismic texture analysis for reservoir
prediction and characterization: The Leading Edge, 29, no. 9, 1116-11.
l Zizzari, A., Seiffert, U., Michaelis, B., Gademann, G., and S. Swiderski, 2001, Detection of tumor in digital images of the brain: Proc. of the IASTED International Conference Signal Processing, Pattern Recognition & Applications, Rhodes, Greece, 132-137.
11.29 Volume Statistics
Name
Description
This attribute extracts data from a cube or line using a small 3D probe and returns
a statistical property from the samples collected.
Input Parameters
The probe is defined using relative times/depths and trace stepouts with respect to
the actual location. The probe shape can be set by the shape field. Supported
probes are:
l A [0,0] time gate: The probe becomes a flat rectangle or disk. Only one sample per
trace is used at the actual time/depth.
l A 0 stepout: Only one sample is used in one or both directions.
Output
l Average
l Median
l Min, Max, Sum
l Variance
l Norm Variance
l Most Frequent
l RMS: Root Mean Square
Undefined values collected by the probe are not used in the computation. In the case of a stack, only the valid values define the number of samples over which the statistic is computed.
12 Appendix B - Command
Driver Manual
l Introduction
l Command Driver control window
l Execution from the command line
l Window management
l Search keys
l Identifiers and expressions
l Command specifications
l Expressional specifications
l Repetitive task example
l Standard test scripts
l User history recording
Introduction
Around 2006, it became clear that OpendTect was growing so fast that extensive testing of every part was becoming a very big task. This is a well-known issue in the Agile development literature. The usual approach is to implement automated testing.
When designing the implementation, the main issues we had to cope with were:
l A dynamic user interface that is defined by relations between fields. Therefore, the
exact layout of windows can change considerably by small changes
l Small changes or fixes in algorithms can produce slightly different outputs for the
same input
l A test system should be programmable, not just be a replay of previously run tasks
Examples:
Include "$SCRIPTSDIR$/EvalutionEnergyAttrib.odcmd"
Almost immediately we noticed that finding out the names of the fields was very hard without some tool. Therefore, we made the 'Tooltip name guide', which will put OpendTect in a state where the name of a field is shown as a tool tip (overriding any existing tool tip). When it became clear that our automated test facility would be used as a scripting system for power users, we implemented a tool to record actions.
Provided that the CmdDriver plugin has been (auto)-loaded, one can access the
Command Driver control window in two ways. Firstly, it can be launched from the
menu bar of OpendTect Main Window:
Secondly, this so-called Command Controller also pops up when pressing Ctrl-R
with the mouse pointer inside any OpendTect window or dialog. This is particularly
useful if the Command Driver menu bar is greyed out or menu access is blocked
after a modal dialog has been popped up. Only a few Qt-borrowed QDialogs like
the QMessageBox, QFileDialog and QColorDialog will nevertheless prevent any
user input to the Command Controller, and have to be clicked away first.
"Command Controller (Ctrl-R)"-window in run-mode and record-mode respectively.
The Command Controller has a combobox to switch between two appearances: one for running and one for recording a command file. The TooltipNameGuide checkbox can be (un)checked in both appearances. It sets a tooltip mode that displays the hidden name of any uiObject pointed to. Tooltip names are displayed double-quoted on a cyan background. Go to the Survey->Select/Setup window for a small demo. Click the '(X,Y) <=> I/C'-button and unveil the hidden names of buttons and input fields with your mouse pointer. OpendTect (plugin) developers might have a look in the file $WORK/src/uiIo/uiconvpos.cc to see how these names are annotated in the code.
Running the Command Driver requires selecting an input command file with the extension '.odcmd', or the obsolete '.cmd'. Filenames without a full path are defined relative to the 'Proc'-directory of the current survey. The 'Edit'-button launches a simple text editor to view and edit the input command file directly from OpendTect. One can also specify an output log file to have the Command Driver write its progress and error messages. The 'Examine'-button launches a scroll window to view this log file even while running. The Command Driver starts running when pushing the 'Go'-button, after which 'Pause' and 'Abort'-buttons allow a temporary stop or premature termination.
The Command Recorder is able to record a sequence of user interface actions performed by the user. One has to specify an output command file with the extension '.odcmd'. Filenames without a full path are defined relative to the 'Proc'-directory of the current survey. The 'Examine'-button launches a scroll window to view which commands are written during recording. Recording starts after pushing the 'Start'-button. Next, perform a sequence of user actions and push the 'Finish'-button. The Command Controller will automatically switch to run-mode afterwards, so that one can play back the recorded script in order to verify its correct operation. View the log file for any errors. Finally, the recorded script can be edited to avoid these errors, improve robustness and general applicability, and insert auxiliary commands.
Execution from the command line
One does not necessarily have to run a command script from OpendTect's user interface. One or more scripts can also run straight from the command line, so they can be called from shell scripts as well. An example of the command line syntax on Linux:
This line opens a new instance of OpendTect and starts running the given scripts
in there. OpendTect will not automatically close when all scripts are finished. If
necessary, this must be done from the last script using the 'Close All' command.
Note that command line execution is only possible if your OpendTect auto-loads
the Command Driver plugin at startup.
The user can also create command scripts named autoexec.odcmd . If available, these scripts will be run at start-up before any script specified on the command line. The first autoexec-script to be run is searched in the settings directory. This is the .od subdirectory of your personal home directory, unless the DTECT_SETTINGS or DTECT_WINSETTINGS environment variable specifies otherwise. The second autoexec-script to be run is searched in the Proc directory of the start-up survey. Next, the command line scripts will be run in order. Moreover, every time a new survey is selected in OpendTect, the Proc directory of this new survey will also be searched for an autoexec-script to be run at that moment.
Even when autoexec-scripts are available, their execution can always be disabled by adding --noautoexec , --nosettingsautoexec or --nosurveyautoexec as an extra argument on the command line. The command line argument --cmdlog /full_path/my_odcmdlog.txt will override the location of the output log file. Its default name is odcmdlog.txt and it is located in the Proc directory of the start-up survey, if writable, or in the personal user directory otherwise.
To run a command file other than autoexec.odcmd on Windows, one has to create a DOS batch file. That file must contain a command line similar to this:

Go to your OpendTect installation folder to ascertain the right name and location of the OpendTect executable (i.e. which disk, bin-path, win32 / win64, and '.exe'-file name). Executing your DOS batch file will open a new instance of OpendTect and run the given script at startup.
Window management
Commands always apply to the current window. This is OpendTect Main Window when starting the driver. Any modal window popping up will automatically become the current window. Modeless windows that pop up can be appointed current window by using the Window-command. In case a modal current window closes, its parent will become the current window. If a modeless current window closes, the latest former current window still existing will be restored as current window.
The Command Driver can also not manipulate the contents of windows popped up by batch programs that were launched from OpendTect. An example of this is the multi-machines processing window. There is no workaround yet, other than manual intervention.
The benefit of window assertions is in error recovery. If OpendTect does not pop up a window that was expected by the command file, or does pop up a window that was not expected by the command file, the Command Driver is able to proceed under guidance of these window assertions. This already solves a lot of cases in which the flow of OpendTect happens to be mildly data or environment dependent. For example, if an error message like "File already exists! Overwrite?" appears but is not handled in the command file (or the other way around), the Command Driver can continue as was most probably intended by the user: close the window and write the file. Another example is any command to switch on/off a button or menu item. If it happens to be switched on/off already, window assertions enable the Command Driver to skip all commands following from that switch as well.
Although not compulsory, it is advised to make systematic use of window assertions. It also improves the readability of your command file. See any file generated by the Command Recorder to get an idea.

When using window assertions in combination with control flow commands, the easiest and most effective method is having each individual command (If, ElseIf, Then, Fi, For, Rof, DoWhile, Od, Do, OdUntil) both preceded and succeeded by the window assertion in force. Wrongly omitting a window assertion around a control flow command will not result in an assertion violation error, but it does severely reduce the possibilities of the Command Driver to recover in case of other errors.
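As an illustration of this convention, a short sketch, assuming that a bracketed window name on its own line acts as the window assertion (the window and button names are hypothetical):

```
[Create Output]
If overwrite
[Create Output]
Button "Overwrite" On
[Create Output]
Fi
[Create Output]
Ok
```

Each control flow command (If, Fi) is both preceded and succeeded by the assertion in force, so the driver can recover if the window flow deviates.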
Search keys
Many commands supported by the Command Driver use search keys to address window names, names of buttons, input fields and other GUI-elements, item names of menus, lists, trees and tables, etc. The matching of these search keys is strict where necessary and accommodating where practical. The matching process honors the following principles:
The asterisk '*' is used as a wildcard character. It may appear any number of times
in the search key to match an arbitrary substring.
Any sequence of white space characters like spaces, tabs, newlines, etc. will be
removed at the beginning and at the end of both the match name and search key,
and compressed into one single space when found in between.
The ellipses found at the end of menu item names and button texts are ignored.
Any dot after the last non-white space character of the match name and search key
is removed.
Underscores denoting keyboard shortcuts in menu item names, tab item names and button texts are ignored. The ampersands denoting these shortcuts in Qt-interface programming code should not appear in your search key, unless you want to match an actual ampersand of course.
One example to demonstrate the use of escape characters: the table column header "* [All lines]" will be matched exactly by the search key "@* [All lines]". The asterisk must be escaped to prevent it from being interpreted as a wildcard for postfix matching, while the brackets only need escapes when occurring inside window assertions.
Any pair of bracket-like characters enclosing a name to match is ignored, unless the search key does explicitly specify them. Outer character pairs that may be stripped include "...", [...], <...>, and >...<. This matching protocol is recursive. If the match name and search key do not match, any outer brackets of the match name are removed, and the whole matching process is repeated.
The Case-command allows the user to determine whether upper-case and lower-
case alphabetical characters will match each other. The default is case-insensitive.
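To illustrate these matching rules, a few search keys that would all address the same button — here a hypothetical button labelled "Scan Input Files ...":

```
# All four commands address the same button:
Button "Scan Input Files ..."
Button "Scan Input Files"
Button "scan input files"
Button "Scan*Files"
```

The second key works because trailing ellipses are ignored, the third because matching is case-insensitive by default, and the fourth because '*' matches an arbitrary substring.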
Identifiers and expressions
Command scripts may contain identifiers to store data and (re)use their values. Every identifier consists of a letter followed by any sequence of letters, digits and underscores ('_'). Identifiers are treated case-insensitively. Identifiers can represent constants, script variables, environment variables, built-in function names, or user-defined procedure names.
The Command Driver has only one internal data type. All identifier values are represented as character strings. The string can be a boolean value, an integer value (even octal or hexadecimal), a fixed or floating point value, one single character or indeed a whole character string. The operator or built-in function using the identifier will define into which data type the string has to be converted, and will produce an error if the conversion fails. User-defined procedures should do the same. Boolean values are actually mapped onto numbers, where 0 is false and 1 is true, and any other number is also interpreted as true.
Expressions are assembled from identifiers, numbers, string constants ( "..." ), parentheses, built-in functions, and about twenty mathematical and logical operators (see Expressional specifications). The assign command ( <ident> = <expr> ) stores the evaluated result of an expression into an identifier.
In order to keep the parsing of command actions tractable, few commands will directly accept expressions as argument(s). Apart from the assign command, the only ones are control flow commands: If, ElseIf, For, DoWhile, OdUntil, Return, and any user-defined procedure call with value parameters. If you want to use an expression in an arbitrary command action, assign the expression result to an identifier and substitute its value.
Identifier substitution can also be used to simulate array variables (e.g. array[index]). The Command Driver does not explicitly support them, but index substitution can be applied for that purpose: array_$index$.
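A small sketch combining assignment, substitution and a simulated array; the identifier names are arbitrary, and the usual arithmetic operator symbols are assumed:

```
nrsteps = 3
stepsize = 50
# Simulated array elements array_1, array_2, array_3
For index = 1 To nrsteps
array_$index$ = index * stepsize
Rof
```

The substitution array_$index$ expands to array_1, array_2 and array_3 on successive loop iterations.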
The scope of any identifier used in the body of a command script is global. The scope of any identifier used inside a user-defined procedure is local by default, and may shadow a global identifier with the same name. Attach the scope operator '@' in front of an identifier to force access to its global namesake. That is all for (re)assigning an identifier or (re)defining a procedure. Two extra scope rules apply for reading an identifier or calling a procedure. If an unscoped identifier value is locally undefined, the search will continue at the global scope level directly. If an unscoped procedure call is locally undefined, the search will continue at the previous scope level, and recursively descend to the global level as long as no definition is found.
Command specifications
Any text editor can be used to produce or modify OpendTect scripts. Every command file has to start with the following four-line header. The correctness of version number, date and time is not vital for a successful execution of the command file.
dTect V4.0
OpendTect commands
Mon Apr 20 09:20:09 2009
The Command Driver expects one command per line and one line per command by default. Multiple commands on one line must be separated by a semicolon (';'). Long commands exceeding the width of one line may be broken by adding a backslash ('\') as last non-white character on the line. Never put a break before white space, since preceding white space on the next line will be considered indentation. Instead of using a backslash break, one can also refrain from pressing the Enter-key and simply let the line run through the right margin. Empty lines and (commentary) lines starting with a '#'-symbol are allowed and ignored.
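Putting these layout rules together, a minimal command file might look as follows; the window, button and field names are hypothetical:

```
dTect V4.0
OpendTect commands
Mon Apr 20 09:20:09 2009
# Comment lines and empty lines are ignored
Button "Apply" ; Button "Close"
Input "Output name" \
"My long output name"
```

Note that the white space separating the two arguments of the broken Input-command stands before the backslash, so the continuation line starts without indentation.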
One can make a quick start by having the first command file generated by the Command Recorder, and start editing from there.
    <searchstr> = (<wildcard>?<textstr>)*<wildcard>?
    <wildcard> = '*'
    <disambiguator> = '#'<selnr>

...the new current window. This command is needed to access modeless windows, since only modal windows that pop up will automatically become the current window.
    <sep> = '`'
    <itemname> = <searchstr><disambiguator>?

...may contain one or more wildcards to match arbitrary substrings.

If a (wildcarded) item name matches more than once in a (sub)menu or any other list, a number can be attached to disambiguate the search. A negative number will count the matching items in reverse. To give a few exotic examples: the menu path "Survey`*#-1" would select the last item of the Survey-menu, while the path "Survey`#1" would refer to its first empty-named item.
Button "<keystr>" <onofftag>?
    <keystr> = <objname>( <sep><objname> )*<disambiguator>?
    <objname> = <searchstr>

Press a (push/radio/check/tool)-button by providing one or more search keys. Multiple search keys are not allowed for QDialog windows. Checkable buttons are toggled, unless the optional On/Off-argument specifies their desired state. The best search key is of course the button name, but some objects do not have a (visible) name and lots of times names are not unique.
If the user is unable to specify a set of keys that leaves exactly one button, as a last resort one of the remaining buttons can be selected by attaching a disambiguation number after the last key. Beware that the order of those buttons will be fixed, but unlike menus not always in a left-right top-down fashion. The buttons will be counted in reverse in case the disambiguator is negative.
ButtonMenu "<keystr>" "<menupath>" <onofftag>?

Selects an item from the menu attached to a button. Some toolbuttons contain a left-clickable arrow part to make such a menu appear. Checkable menu items are toggled, unless the optional On/Off-argument specifies their desired state. Selection of the button is analogous to the button selection described above. Selection of the menu (sub)item is analogous to the Menu-command.
Input "<keystr>" <inpdata>? <entertag>?
    <inpdata> = "<inputstr>" | <number> | FilePath "<filepathstr>"
    <entertag> = Enter | Hold

Inputs a string or number into one of the fields of the current window. The field selection is analogous to the button selection described above. The Hold-option only triggers actions defined on changing the input field, while the Enter-option also triggers actions defined on pressing the enter/return-key afterwards.
...contents of the selected field will be entered. The FilePath-option forces the input string to be treated as a file-path concerning platform independence. This will happen automatically in case the input string contains any directory identifier substitute ( $...DIR$ ), or in case the input field comes together with a Select-button and perhaps an Examine-button into a graphical element called uiFileInput.
Spin "<keystr>" <spinsteps> <entertag>?
    <spinsteps> = <posint> | <negint>

Clicks a spinbox any number of steps upward (positive) or downward (negative). The spinbox selection is analogous to the button selection described above. The Hold-option only triggers actions defined on changing the spinbox value (after each step), while the Enter-option also triggers actions defined on the spinbox losing its focus afterwards.
...represented by the slider is not necessarily linear. It is optional to perform the shift in more than one step, but it may yield nice animations on screen.
...selection described above. The Input-command may be applied to edit the input field of the current item of an editable combobox.
ListClick "<keystr>" <itemsel> <mousetag>?
    <mousetag> = <ctrlclick>?<doubleclick>?<leftrightclick>?
    <ctrlclick> = Ctrl
    <doubleclick> = Double
    <leftrightclick> = Left | Right

Clicks and (de)selects precisely one item in a listbox. Selection possibilities for listbox and item are analogous to combobox selection. The optional mousetag defines whether the item is (de)selected by means of optional control- or double-clicking of either the left or right mouse button. Left is default in all cases. Beware that the mousetag is one united word and order counts.
Beware that mousetags with Double or Right are known not to have a (lasting) effect on the check-button.
ListMenu "<keystr>" <itemsel> <mousetag>? "<menupath>" <onofftag>?

Selects a (sub)item from the menu attached to a listbox item. Checkable menu items are toggled, unless the optional On/Off-argument specifies their desired state. The selection of both listbox and item is analogous to the ListClick-command. However, since OpendTect is normally hiding its popup menus under the right mouse button, the default for no mousetag at all is set to Right here. Selection of the menu (sub)item is analogous to the Menu-command.
ListSelect "<keystr>" <firstitemsel> <lastitemsel>? <onofftag>?
    <firstitemsel> = <lastitemsel> = <itemsel>

Selects any number of items in a multi-selection listbox. Selection possibilities are more comprehensive than those at the ListClick-command. Now all items matching a given (wildcarded) item name can be specified. One can also specify the whole range between a first and a last item at once. The list will be traversed cyclically in case the first item succeeds the last. Without the optional On/Off-argument, all specified items will be selected and all other items deselected. With the On/Off-argument set however, only specified items will be selected/deselected respectively, while unspecified items keep their current state.
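For example, assuming a multi-selection listbox named "Objects list" containing items "Seis-1" through "Seis-8" (both names hypothetical):

```
# Select all items matching the wildcarded name, deselect the rest
ListSelect "Objects list" "Seis-*"
# Additionally select the range from "Seis-2" up to "Seis-5"
ListSelect "Objects list" "Seis-2" "Seis-5" On
```

The first command replaces the current selection; the second, because of the On-argument, only adds the specified range without deselecting anything else.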
TableClick "<keystr>" <tableitemsel> <mousetag>?
    <tableitemsel> = <headitemsel> | <cellsel>
    <headitemsel> = <headtag> <itemsel>
    <headtag> = RowHead | ColHead
    <cellsel> = <rowsel> <colsel> | Cell <itemsel>
    <rowsel> = <colsel> = <itemsel>

Clicks either a row-head, col-head or cell in a table, and selects precisely the one row, column or cell attached to it. Selection possibilities for the RowHead-, ColHead- and Cell-options are analogous to the item selection for list- and comboboxes. The Cell-option puts all cells row-after-row in a virtual list for this purpose. Another way to address a single cell is by selecting both its row and its column. Also these row and column selections are analogous to the item selection just mentioned.
...analogous to the TableClick-command, except that table headers cannot be filled.
TableMenu "<keystr>" <cellsel> <mousetag>? "<menupath>" <onofftag>?

Selects a (sub)item from the menu attached to a table cell. Checkable menu items are toggled, unless the optional On/Off-argument specifies their desired state. The selection of both table and cell is analogous to the TableFill-command. Selection of the menu (sub)item is analogous to the Menu-command.
TableExec "<keystr>" <cellsel> <action>

Executes a local command driver action within one cell of a table (instead of within the current window). The selection of both table and cell is analogous to the TableFill-command. Only those commands accepting a keystring argument might be appropriate actions to execute within a cell. For example, if the top-left cell of a table contains a single combobox, its selection can be made as follows:
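A sketch of the general shape of such a command; the table name, the wildcard key for the (unnamed) combobox inside the cell, and the item name are all hypothetical:

```
# Select item "Stack" in the combobox inside the top-left cell
TableExec "Data table" 1 1 Combo "*" "Stack"
```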
    <headitemrangesel> = <headtag> <firstitemsel> <lastitemsel>?
    <cellrangesel> = <firstcellsel> <lastcellsel>?
    <firstcellsel> = <lastcellsel> = <cellsel>

...TableClick-command. Now all row-heads, col-heads or cells matching a given (wildcarded) item name can be specified. One can also specify the whole block between a first and a last row, column, or cell at once. The table will be traversed cyclically in case the first (cell) row or (cell) column succeeds the last. Without the optional On/Off-argument, all specified cells will be selected and all other cells deselected. With the On/Off-argument set however, only specified cells will be selected/deselected respectively, while unspecified cells keep their current state.
Tab "<keystr>"? "<tabname>"
    <tabname> = <itemname>

Puts a tab on top of the stack by name. Since windows with more than one tab-stack will be rare, its selection is optional. The selection is analogous to the button selection described above.
TreeClick "<keystr>"? <treenodesel> <mousetag>?
    <treenodesel> = "<treepath>" | PathCol "<treepath>" <colsel>
    <treepath> = <pathstr>

Clicks and selects precisely one node in a tree. The selection of a tree node is analogous to the selection of a menu (sub)item. In which column to click is optional, but it will be the first one by default. This column selection is almost analogous to the column selection for tables. If the selection is not made by number but by name, then the search for a match will start in the column header, and next proceed at the selected tree node if not successful.
Selection of the tree itself is analogous to the button selection described above. It is optional because the current window will often contain only one (data) tree. The data tree with the lowest scene number is guaranteed to be the default for OpendTect Main Window. The option to specify a left, right or double mouse-click will be Left by default. Any tree node might pop up a menu when right-clicking on one of its columns, in which case the TreeMenu-command has to be applied instead.
TreeExpand "<keystr>"? "<treepath>" <onofftag>?

(Un)expands the subtree of a node in a tree. The expander is toggled unless the optional On/Off-argument specifies its desired state. The selection of both tree and node is analogous to the TreeClick-command, except that column selection is not an issue here.
TreeButton "<keystr>"? "<treepath>" <mousetag>? <onofftag>?

Presses the button in front of a node in a tree. The button is toggled unless the optional On/Off-argument specifies its desired state. The selection of both tree and node is analogous to the TreeClick-command, except that column selection is not an issue here.
TreeMenu "<keystr>"? <treenodesel> <mousetag>? "<menupath>" <onofftag>?

Selects a (sub)item from the menu attached to a column of a tree node. Checkable menu items are toggled, unless the optional On/Off-argument specifies their desired state. The selection of tree, node and column is analogous to the TreeClick-command. Selection of the menu (sub)item is analogous to the Menu-command.
CanvasMenu "<keystr>" "<menupath>" <onofftag>?

Selects a (sub)item from the menu popping up at a canvas area. Checkable menu items are toggled, unless the optional On/Off-argument specifies their desired state. Selection of the canvas area is analogous to the button selection described above. Selection of the menu (sub)item is analogous to the Menu-command.
Ok
Cancel

These are special commands that 'Ok' or 'Cancel' a dialog. Usually, this has the same effect as pressing the Ok- or Cancel-button.
Close <closeoption>?
    <closeoption> = All | <subwinsel>
    <subwinsel> = "<keystr>"? "<winname>"

Clicks on the Close-button in the title bar of the current window. The All-option will close all OpendTect windows at once. This option is compulsory in case OpendTect Main Window is the current window, so that OpendTect cannot be killed by accident. The optional subwindow selection is available to close a window in the workspace of the current window by name. Since windows with more than one workspace will be rare, selection of the workspace is optional. This selection is analogous to the button selection described above.
Show <subwinsel>? <showtag>
    <showtag> = Minimized | Maximized | Normal

Clicks on the Minimized-, Maximized- and Restore-buttons in the title bar of the current window. The optional subwindow selection is available to resize a window in the workspace of the current window by name. Since windows with more than one workspace will be rare, selection of the workspace is optional. This selection is analogous to the button selection described above.
ColorOk <colorsel>
    <colorsel> = "<rgbtcolorstr>" | <color> <transparency>?
    <color> = "<rgbcolorstr>" | <R> <G> <B> | <colortag>
    <rgbtcolorstr> = <rgbstr><sep><transparency>
    <rgbcolorstr> = <R><sep><G><sep><B>
    <R> = <G> = <B> = <transparency> = <byte>

Specifies the desired color while closing a QColorDialog window. One may specify a color either by its RGB-values (0-255) or a color tag. In case the QColorDialog offers the possibility to specify transparency, the value of the optional t-channel (0-255) is passed as well. Its default value is 0 (non-transparent). The RGB-values and optional t-channel can also be specified in one composite RGB(t) color string.
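Following this grammar, the same color can be specified in several equivalent ways; the separator inside the composite string is the backquote <sep> character:

```
# Red, green, blue as separate values (0-255)
ColorOk 255 128 0
# The same color with a transparency of 64
ColorOk 255 128 0 64
# As one composite RGB(t) color string
ColorOk "255`128`0`64"
```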
The current directory of the QFileDialog will be taken as base directory in the latter case. Also a set of directory identifiers has been predefined for substitution ( $...DIR$ ) in file-paths. The FileOk-command does not provide the functionality of the QFileDialog-button "Create new folder". If a file-path specifies a non-existing file or directory, its parent directory must exist.
LogMode <logtag>
    <logtag> = Basic | Normal | All

Regulates the amount of warning and error messages in the log file. The All-option will show any message generated. The Normal-option is the default. It shows all parsing messages, but action messages are only shown if the action result is not assigned to an identifier. The Basic-option will also omit all warning messages.
Snapshot "<imagefilepathstr>" <frametag>?
    <frametag> = CurWin | ODMain | Desktop
    <imagefilepathstr> = <filepathstr><imageext>
    <imageext> = .bmp | .jpg | .jpeg | .png | .ppm | .xbm | .xpm

Writes a snapshot of the current window (and its environment) to file. The default grabbing area is bounded by the CurWin-frame, but can optionally be enlarged towards the ODMain-frame or the whole Desktop-frame. The snapshot filename must have one of the prescribed image extensions. The file-path will be interpreted platform independently. Both absolute and relative file-paths are accepted. Also one of the predefined directory identifiers may be substituted ( $...DIR$ ) in the file-path.
Sleep <seconds> <sleeptag>?
    <sleeptag> = Regular | Extra

Sleeps a period of time so that spectators can distinguish the consecutive steps from a command file on screen. The Regular-option will sleep until further notice between every two commands with a visual effect. The Extra-option is the default and will take an (additional) nap only once.
Wait <seconds> <sleeptag>?

Tells the next command to wait a period of time only if it is uncertain whether it has finished processing. This can happen to any command closing a modal dialog that was already open when the Command Driver started. The Regular-option will allow this waiting time to every command until further notice. The Extra-option is the default and will allow an (additional) wait only to the next command.
Pause "<textlines>"? Temporarily hold the execution of the command
script and have the Command Controller pop up a
<textlines> = <textstr>( message dialog with a 'Resume'-button so that the
<sep><textstr> )* user can decide when to continue. Specifying lines
of text is optional.
Guide "<textlines>" ( <guidetag> "<winname>" )?
  <guidetag> = Existent | Inexistent | Accessible | Inaccessible

Temporarily holds the execution of the command script and has the Command Controller pop up a dialog requesting the user to take action. The text lines describe which actions the user has to perform. The Command Driver will automatically resume if some window matching a given name is no longer (in)existent or (in)accessible. If this option is not specified, the user gets a 'Done'-button to have the Command Driver resume manually.
Comment "<textlines>"

Inserts comment lines into the log file. Command lines starting with a '#'-symbol contain comments that are not shown in log files.

<expr> = <ident> | <number> | "<textstr>" | '(' <expr> ')' | <functioncall> | <operatorexpr>
<functioncall> = <funcname>'(' <expressions>? ')'
<funcname> = <ident>
<expressions> = <expr> ( ',' <expr> )*
If <expr> <actions>
( ElseIf <expr> <actions> )*
( Else <actions> )?
Fi

Executes a number of command actions if a boolean expression evaluates to true. The ElseIf- and Else-branches are optional, but note that the terminating Fi-command is compulsory.
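As an illustration, a minimal If-construct might look as follows (a sketch; the identifier res and the compared constants are assumed to have been set by an earlier Try-command):

```
If res==SUCCESS
  Comment "Action succeeded"
ElseIf res==WARNING
  Comment "Action succeeded with a warning"
Else
  Comment "Action failed"
Fi
```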
For <ident> '=' <expr> ( To <expr> )? ( Step <expr> )?
<actions>
Rof

Initializes an identifier with the evaluated result of a numerical expression and repeats a number of command actions as long as the identifier value does not exceed the evaluated result of the optional To-expression. After every loop iteration, the identifier is incremented by the evaluated result of the optional Step-expression. The default step is 1, but even negative values are allowed. Only the Break-command can escape from a For-loop in the absence of a To-expression. Note that the terminating Rof-command is compulsory.
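A small sketch of a For-loop with a negative step (the identifier name is illustrative):

```
# Illustrative count-down from 10 towards 2 in steps of -2
For idx = 10 To 2 Step -2
  Comment "idx is now $idx$"
Rof
```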
DoWhile <expr>
<actions>
Od

Repeats the execution of a number of command actions as long as a boolean expression evaluates to true. Note that the terminating Od-command is compulsory.

Do
<actions>
OdUntil <expr>

Repeats the execution of a number of command actions until a boolean expression evaluates to true.
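Both loop forms can be sketched as follows (identifiers are illustrative):

```
# Head-controlled: the condition is tested before each iteration
i = 0
DoWhile i<3
  i = i+1
Od

# Tail-controlled: the body runs at least once
j = 0
Do
  j = j+1
OdUntil j>=3
```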
Break

Escapes immediately from the innermost For-, While-, or Until-loop.

Continue

Skips the remaining actions in the current iteration of the innermost For-, While-, or Until-loop.
Try <ident> <action>
<ident>? '~' <action>

Tries to execute a command action and assigns its result to an identifier. The command syntax is available in both procedural style and operator style. The possible outcomes are success (1), failure (0) and warning (-1), for which the identifier constants SUCCESS, FAILURE and WARNING have been predefined. The operator style syntax is also usable without an identifier for the side-effect, because error messages will be temporarily ignored.
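For example, both styles of the Try-command might be used like this (a sketch; the button name "Ok" is illustrative):

```
# Procedural style
Try res Button "Ok"

# Operator style, equivalent; without an identifier only the
# side-effect of suppressing error messages remains
res ~ Button "Ok"

If res==FAILURE ; Return ; Fi
```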
Questioncmd
  <questioncmd> <ident> <questionargs>
  <ident> '?' <questioncmd> <questionargs>

Stores the answer from a question command into an identifier. The command syntax is available in both procedural style and operator style. All question commands implemented so far are listed below in Tables IV to VII.
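As a sketch, the two styles for a question command such as NrComboItems (the key string "Attribute" is illustrative):

```
# Procedural style
NrComboItems nritems "Attribute"

# Operator style, equivalent
nritems ? NrComboItems "Attribute"
```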
Def ( <returnpar> '?' )? <procname>'(' <valpars>? ')' <varpar>*
<actions>
Fed

Specifies a user-defined procedure in which a number of command actions are executed. Note that the terminating Fed-command is compulsory. Nested procedure definitions are allowed. A definition may occur anywhere, as long as it precedes the first call to it. The course of a procedure depends on an optional number of value parameters (between the parentheses) and variable parameters (behind the parentheses). Variable parameters only accept identifiers, and these might be modified. Notice that a procedure call allows no space between the procedure name and its opening parenthesis.
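A hedged sketch of a procedure definition and call (all names are illustrative, and the return parameter is assumed to be set by assignment inside the body):

```
Def ret ? maxOf( a, b )
  ret = ( a > b ? a : b )
Fed

m ? maxOf( 3, 7 )
```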
End

Finishes the command stream immediately.
<formtag> = Text | Number

defines whether disabled items are counted as well. Selection of the menu (sub)item is analogous to the Menu-command.

IsButtonOn "<keystr>"

On (1) if the selected radio-, check-, or toolbar button is checked, off (0) if it is unchecked, and unswitchable (-1) in case of a push button or a non-checkable toolbar button. The identifier constants ON, OFF and UNSWITCHABLE have been predefined for convenience. The button selection is analogous to the Button-command.
GetButton "<keystr>" <buttonformtag>?
  <buttonformtag> = Text | Color

Returns the text or color of a (push/radio/check/tool)button. The form is optional and 'Text' by default. The RGBt color string format returned in the 'Color' case is defined at the ColorOk-command. If the button has no text or color, an empty string or transparent white is returned respectively. The button selection is analogous to the Button-command.

IsButtonMenuItemOn "<keystr>" "<menupath>"

On (1) if the selected menu (sub)item of a button is checked, off (0) if it is unchecked, and unswitchable (-1) if it is not checkable at all. The selection of button and menu (sub)item is analogous to the ButtonMenu-command.

NrButtonMenuItems "<keystr>" "<menupath>?"

Returns the number of (enabled) items in the selected (sub)menu of a button. The GreyOuts-command defines whether disabled items are counted as well. The selection of button and sub-menu is analogous to the ButtonMenu-command. The root menu is denoted by an empty menu-path (""). Zero is returned if the menu-path leads to a leaf menu item.
GetButtonMenuItem "<keystr>" "<menupath>" <formtag>?

Returns the text or number of the selected menu (sub)item of a button. The form is optional and 'Text' by default. In the 'Number' case, the GreyOuts-command defines whether disabled items are counted as well. Selection of button and menu (sub)item is analogous to the ButtonMenu-command.

GetInput "<keystr>" <inputformtag>?
  <inputformtag> = Text | FilePath

Returns the current content of a selected input field. The form is optional and 'Text' by default. Selection of the input field is analogous to the Input-command. In case the input field comes together with a Select-button and perhaps an Examine-button in a graphical element called uiFileInput, the 'FilePath' option forces the current filename to be preceded by the absolute file path to the current selection directory. Otherwise an empty string will be returned.
GetSpin "<keystr>" <spinformtag>?
  <spinformtag> = Text | Value | Minimum | Maximum | Step

Returns the text, value, minimum, maximum or step of the selected spinbox. The form is optional and textual by default. Selection of the spinbox is analogous to the Spin-command.

GetSlider "<keystr>" <sliderformtag>?
  <sliderformtag> = Text | Value | Minimum | Maximum | Percentage

Returns the text, value, minimum, maximum or percentage of the selected slider. The form is optional and textual by default. Beware that the returned percentage of the range displayed on screen will not necessarily have a linear relationship with the returned value. Selection of the slider is analogous to the Slider-command.

GetWheel "<keystr>" <wheelformtag>?
  <wheelformtag> = Text | Angle

Returns the text or angle (in degrees) of the selected thumbwheel. The form is optional and textual by default. Selection of the thumbwheel is analogous to the Wheel-command.
NrComboItems "<keystr>"

Returns the number of items in a selected combobox. The selection of the combobox is analogous to the Combo-command.

CurComboItem "<keystr>" <formtag>?

Returns the text or number of the current combobox item. The form is optional and textual by default. The selection of the combobox is analogous to the Combo-command.

IsComboItemOn "<keystr>" <itemsel>

On (1) if the specified combobox item is currently selected, and off (0) if it is currently deselected. Specification of the combobox and its item is analogous to the Combo-command.

GetComboItem "<keystr>" <itemsel> <formtag>?

Returns the text or number of a selected combobox item. The form is optional and textual by default. Selection of the combobox and its item is analogous to the Combo-command.
NrTabs "<keystr>"?

Returns the number of (enabled) tabs in a selected tab-stack. The GreyOuts-command defines whether disabled tabs are counted as well. The optional selection of the tab-stack is analogous to the Tab-command.

CurTab "<keystr>"? <formtag>?

Returns the text or number of the current tab. The form is optional and 'Text' by default. In the 'Number' case, the GreyOuts-command defines whether disabled tabs are counted as well. The optional selection of the tab-stack is analogous to the Tab-command.

IsTabOn "<keystr>"? "<tabname>"

On (1) if the selected tab is currently on top, and off (0) if it is currently underneath. Selection of tab-stack and tab-name is analogous to the Tab-command.

GetTab "<keystr>"? "<tabname>" <formtag>?

Returns the text or number of a selected tab. The form is optional and 'Text' by default. In the 'Number' case, the GreyOuts-command defines whether disabled tabs are counted as well. Selection of tab-stack and tab-name is analogous to the Tab-command.
IsCanvasMenuItemOn "<keystr>" "<menupath>"

On (1) if the selected menu (sub)item in the pop-up menu of a canvas area is checked, off (0) if it is unchecked, and unswitchable (-1) if it is not checkable at all. The selection of canvas area and menu (sub)item is analogous to the CanvasMenu-command.

NrCanvasMenuItems "<keystr>" "<menupath>?"

Returns the number of (enabled) items in the selected (sub)menu popping up at a canvas area. The GreyOuts-command defines whether disabled items are counted as well. The selection of canvas area and sub-menu is analogous to the CanvasMenu-command. The root menu is denoted by an empty menu-path (""). Zero is returned if the menu-path leads to a leaf menu item.

GetCanvasMenuItem "<keystr>" "<menupath>" <formtag>?

Returns the text or number of the selected menu (sub)item in the pop-up menu of a canvas area. The form is optional and 'Text' by default. In the 'Number' case, the GreyOuts-command defines whether disabled items are counted as well. The selection of canvas area and menu (sub)item is analogous to the CanvasMenu-command.
IsShown <subwinsel>? <showtag>

True (1) if a selected subwindow in the workspace of the current window is minimized, maximized, or normal size respectively, and false (0) otherwise. Specification of the size property and optional subwindow is analogous to the Show-command. Not selecting a subwindow will yield the size properties of the current window itself.
ListMenu-command.

NrListMenuItems "<keystr>" <itemsel> "<menupath>?"

Returns the number of (enabled) items in the selected (sub)menu of a listbox item. The GreyOuts-command defines whether disabled items are counted as well. Selection of the listbox, its item and a sub-menu is analogous to the ListMenu-command. The root menu is denoted by an empty menu-path (""). Zero is returned if the menu-path leads to a leaf menu item.

GetListMenuItem "<keystr>" <itemsel> "<menupath>" <formtag>?

Returns the text or number of the selected menu (sub)item of a listbox item. The form is optional and 'Text' by default. In the 'Number' case, the GreyOuts-command defines whether disabled items are counted as well. The selection of the listbox, its item and the menu (sub)item is analogous to the ListMenu-command.
row, an empty string, zero or transparent white is returned respectively. Like the CurListItem-command, the current table row is by default defined by the item that is 'Framed'. Optionally, if precisely one (entire row of) item(s) is 'Selected' (i.e. highlighted), the row at issue can be requested as current table row instead. The selection of the table is analogous to the TableClick-command.

CurTableCol "<keystr>" <curtag>? <tableformtag>?

Returns the column-header text, number or background color of the current table column. The form is optional and 'Text' by default. The RGBt color string format returned in the 'Color' case is defined at the ColorOk-command. If the table has no column header or no current column, an empty string, zero or transparent white is returned respectively. Like the CurListItem-command, the current table column is by default defined by the item that is 'Framed'. Optionally, if precisely one (entire column of) item(s) is 'Selected' (i.e. highlighted), the column at issue can be requested as current table column instead. The selection of the table is analogous to the TableClick-command.
IsTableItemOn "<keystr>" <tableitemsel>

On (1) if a specified item in a table has been selected (i.e. highlighted), off (0) if it has been deselected, and unswitchable (-1) if the table does not support item selection at all. The specification of table and item is analogous to the TableClick-command. A row-head or col-head item is considered selected only if all table cells in that row or column are selected.

GetTableItem "<keystr>" <cellsel> <tableformtag>?

Returns the text, number or background color of a selected table cell. The form is optional and 'Text' by default. In the 'Number' case, the table cells are counted row-by-row. The RGBt color string format returned in the 'Color' case is defined at the ColorOk-command. Selection of both table and cell is analogous to the TableFill-command.

GetTableRow "<keystr>" <tableitemsel> <tableformtag>?

Returns the row-header text, number or background color of a selected table item. The form is optional and 'Text' by default. The RGBt color string format returned in the 'Color' case is defined at the ColorOk-command. Selection of both table and item is analogous to the TableClick-command. If the selected item is not a row-head item itself, it refers to the row-head item straight above it.
GetTableCol "<keystr>" <tableitemsel> <tableformtag>?

Returns the column-header text, number or background color of a selected table item. The form is optional and 'Text' by default. The RGBt color string format returned in the 'Color' case is defined at the ColorOk-command. Selection of both table and item is analogous to the TableClick-command. If the selected item is not a col-head item itself, it refers to the col-head item left next to it.

IsTableMenuItemOn "<keystr>" <cellsel> "<menupath>"

On (1) if the selected menu (sub)item of a table cell is checked, off (0) if it is unchecked, and unswitchable (-1) if it is not checkable at all. The selection of the table, its cell and the menu (sub)item is analogous to the TableMenu-command.

NrTableMenuItems "<keystr>" <cellsel> "<menupath>?"

Returns the number of (enabled) items in the selected (sub)menu of a table cell. The GreyOuts-command defines whether disabled items are counted as well. Selection of the table, its cell and a sub-menu is analogous to the TableMenu-command. The root menu is denoted by an empty menu-path (""). Zero is returned if the menu-path leads to a leaf menu item.

GetTableMenuItem "<keystr>" <cellsel> "<menupath>" <formtag>?

Returns the text or number of the selected menu (sub)item of a table cell. The form is optional and 'Text' by default. In the 'Number' case, the GreyOuts-command defines whether disabled items are counted as well. The selection of the table, its cell and the menu (sub)item is analogous to the TableMenu-command.
NrTreeCols "<keystr>"?

Returns the number of columns in a tree. The optional selection of the tree is analogous to the TreeClick-command.

CurTreeItem "<keystr>"? <curtag>? <formtag>?

Returns the text or number of the current tree item. The form is optional and textual by default. In the 'Number' case, the GreyOuts-command defines whether disabled items are counted as well. If there is no current tree item, an empty string or zero is returned respectively. Like the CurListItem-command, the current tree item is by default the one that is 'Framed'. Optionally, if precisely one item is 'Selected' (i.e. highlighted), it can be requested as current tree item instead. The optional selection of the tree is analogous to the TreeClick-command.

CurTreePath "<keystr>"? <curtag>? <formtag>?

Returns the path to the current tree item. The form is optional and textual by default. The 'Number' case is especially useful if 'Text' would yield an ambiguous tree path. The GreyOuts-command defines whether disabled items are counted as well. If there is no current tree item, an empty path ("") is returned. Like the CurListItem-command, the current tree item is by default the one that is 'Framed'. Optionally, if precisely one item is 'Selected' (i.e. highlighted), it can be requested as current tree item instead. The optional selection of the tree is analogous to the TreeClick-command.

CurTreeCol "<keystr>"? <formtag>?

Returns the column text or number of the current tree item. The form is optional and textual by default. Note that the current tree item is merely column-specific in case of the default 'Framed' setting. The optional selection of the tree is analogous to the TreeClick-command.
IsTreeItemOn "<keystr>"? "<treepath>"

On (1) if a specified item in a tree has been selected (i.e. highlighted), off (0) if it has been deselected, and unswitchable (-1) if the tree does not support item selection at all. Specification of the tree and its node is analogous to the TreeClick-command, except that column selection is not an issue here.

IsTreeItemExpanded "<keystr>"? "<treepath>"

True (1) if a specified item in a tree has been expanded, false (0) if it has been collapsed, and unexpandable (-1) if it is a leaf node. Beware that an expanded tree node can have a collapsed ancestor. Specification of the tree and its node is analogous to the TreeClick-command, except that column selection is not an issue here.
GetTreeItem "<keystr>"? <treenodesel> <formtag>?

Returns the item number of a tree node or the text in one of its columns. The form is optional and textual by default. In the 'Number' case, the GreyOuts-command defines whether disabled items are counted as well. The selection of the tree, its node and column is analogous to the TreeClick-command.

GetTreePath "<keystr>"? "<treepath>" <formtag>?

Returns the path to a selected tree node. The form is optional and textual by default. In the 'Number' case, the GreyOuts-command defines whether disabled items are counted as well. It can be used for converting a path from one form to another. Specification of the tree and its node is analogous to the TreeClick-command, except that column selection is not an issue here.

GetTreeCol "<keystr>"? <treecolsel> <formtag>?
  <treecolsel> = <colsel> | PathCol "<treepath>" <colsel>

Returns the text or number of a selected tree column. The form is optional and textual by default. In the 'Number' case, the GreyOuts-command defines whether disabled items are counted as well. Selection of the tree, its column and an optional node is analogous to the TreeClick-command. If the column selection is made by name, the search for a match will start in the column header. If not successful, it proceeds at the specified tree node in the 'PathCol' case, or traverses all tree nodes breadth-first otherwise.
IsTreeButtonOn "<keystr>"? "<treepath>"

On (1) if the button in front of a tree node is checked, and off (0) if it is unchecked. Selection of both the tree and its node is analogous to the TreeButton-command.

IsTreeMenuItemOn "<keystr>"? <treenodesel> "<menupath>"

On (1) if the selected menu (sub)item attached to a column of a tree node is checked, off (0) if it is unchecked, and unswitchable (-1) if it is not checkable at all. The selection of the tree, its node and column, and the menu (sub)item is analogous to the TreeMenu-command.
NrTreeMenuItems "<keystr>"? <treenodesel> "<menupath>?"

Returns the number of (enabled) items in the selected (sub)menu attached to a column of a tree node. The GreyOuts-command defines whether disabled items are counted as well. Selection of the tree, its node and column, and a sub-menu is analogous to the TreeMenu-command. The root menu is denoted by an empty menu-path (""). Zero is returned if the menu-path leads to a leaf menu item.

GetTreeMenuItem "<keystr>"? <treenodesel> "<menupath>" <formtag>?

Returns the text or number of the selected menu (sub)item attached to a column of a tree node. The form is optional and 'Text' by default. In the 'Number' case, the GreyOuts-command defines whether disabled items are counted as well. The selection of the tree, its node and column, and the menu (sub)item is analogous to the TreeMenu-command.
Expressional specifications
   left-to-right   <expr> % <expr>             Modulo
4  left-to-right   <expr> + <expr>             Addition
   left-to-right   <expr> - <expr>             Subtraction
5  left-to-right   <expr> < <expr>             Less than
   left-to-right   <expr> <= <expr>            Less than or equal
   left-to-right   <expr> > <expr>             Greater than
   left-to-right   <expr> >= <expr>            Greater than or equal
6  left-to-right   <expr> == <expr>            Equality (numerical if possible, string otherwise)
   left-to-right   <expr> != <expr>            Inequality (numerical if possible, string otherwise)
7  left-to-right   <expr> && <expr>            Logical AND
8  left-to-right   <expr> || <expr>            Logical OR
9  right-to-left   <expr> ? <expr> : <expr>    Conditional operator (if-then-else)
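For instance, the conditional operator can clip a value at zero (a sketch; identifiers are illustrative):

```
value = -3
clipped = ( value > 0 ? value : 0 )
```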
ceil( <expr> )            Smallest integer not less than
cos( <expr> )             Cosine
exp( <expr> )             Exponent
floor( <expr> )           Largest integer not greater than
ln( <expr> )              Natural logarithm
log( <expr> )             Base-10 logarithm
rand( <max_expr>? )       Uniform random value between 0 and optional maximum, 1 by default
randG( <stddev_expr>? )   Gaussian random value with mean 0 and optional standard deviation, 1 by default
round( <expr> )           Round to nearest integer
sgn( <expr> )             Sign (1 if greater than zero, 0 if zero, -1 if less than zero)
sin( <expr> )             Sine
sqrt( <expr> )            Square root
tan( <expr> )             Tangent
trunc( <expr> )           Round to integer in direction of zero
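A small illustrative combination of these functions and the predefined constant PI (assuming the trigonometric functions take radians):

```
angle = 45 * PI / 180
height = round( 100 * sin(angle) )
```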
isAlpha( <expr> )         True (1) if all characters are letters, false (0) otherwise
isDigit( <expr> )         True (1) if all characters are digits, false (0) otherwise
isInteger( <expr> )       True (1) if representing an integer, false (0) otherwise
isLower( <expr> )         True (1) if all characters are lower-case letters, false (0) otherwise
isNumber( <expr> )        True (1) if representing a number, false (0) otherwise
isSpace( <expr> )         True (1) if all characters are white space, false (0) otherwise
isUpper( <expr> )         True (1) if all characters are upper-case letters, false (0) otherwise
strCat( <expressions> )   String concatenation
strLen( <expr> )          String length in characters
strSel( <str_expr>, <firstpos_expr>, <lastpos_expr>? )
                          Character selection. Last position is optional. Negative positions count in reverse.
sepStrCat( <expressions> )
                          Concatenation of separation-strings (menu- and tree-paths, RGB(t) color strings)
sepStrLen( <expr> )       Number of separated substrings
sepStrSel( <str_expr>, <firstpos_expr>, <lastpos_expr>? )
                          Selection of separated substring(s). Last position is optional. Negative positions count in reverse.
toLower( <expr> )         Converts all letters to lower-case
toUpper( <expr> )         Converts all letters to upper-case
wildcard( <selnr_expr>? ) Matching substring for a wildcard in the latest successful wildcarded command action. Default selection number is 1. A negative number counts the wildcards in reverse.
wildcardStr( <selnr_expr>? )
                          The whole matching string around a wildcard in the latest successful wildcarded command action. Wildcard selection is analogous to the wildcard()-function above.
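An illustrative use of the string functions (a sketch, assuming string positions are 1-based):

```
name = strCat( "line", "_", "01" )
first4 = strSel( name, 1, 4 )
```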
All predefined identifier constants and environment variables are listed below. The Command Driver will automatically change Unix-style file-paths into Windows-style file-paths on Windows platforms and vice versa. Redefining any predefined identifier will result in a warning, but is not forbidden. This allows a command script to overrule the values of the directory identifiers ( $...DIR$ ) as set by the system environment in which OpendTect is running, or to reset the "increment"-identifier FILEIDX to any start value you like. Beware that most of the predefined identifier constants are unfit for change: they are only there to make command scripts more readable, and the Command Driver will keep using the original values internally! For example, you can exchange the values of TRUE and FALSE, but the function call isAlpha("a") still returns 1. Therefore, an expression like isAlpha("a")==TRUE would suddenly get an opposite meaning.
FALSE          0
PI             3.14159265...            Trigonometric constant
UNDEF          1e30                     OpendTect's undefined value. Treated as such by all logical, mathematical, and statistical operators and functions defined above.
SUCCESS        1                        Possible results of the 'Try'-command
FAILURE        0
WARNING        -1
ON             1                        Possible results of any 'Is...On'-question command
OFF            0
UNSWITCHABLE   -1
BASEDIR                                 Base data directory ( setenv DTECT_[WIN]DATA )
DATADIR        $BASEDIR$/<cur_survey>   Survey directory
PROCDIR        $DATADIR$/Proc           Processing directory
APPLDIR                                 Installed software directory ( setenv DTECT_[WIN]APPL )
USERDIR                                 Personal home directory ( setenv DTECT_PERSONAL_DIR )
SCRIPTSDIR     $PROCDIR$                Overruled by setenv DTECT_SCRIPTS_DIR
SNAPSHOTSDIR   $DATADIR$/Snapshots      Overruled by setenv DTECT_SNAPSHOTS_DIR
IMPORTDIR      $DATADIR$/Import         Overruled by setenv DTECT_IMPORT_DIR
EXPORTDIR      $DATADIR$/Export         Overruled by setenv DTECT_EXPORT_DIR
FILEIDX        1000++                   Integer variable that is automatically incremented after every occurrence in the command stream. It may be substituted in file-paths to generate unique filenames.
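For example, FILEIDX can generate unique snapshot names (a sketch; each substitution increments the index):

```
# The two files get distinct names, e.g. grab1000.png and grab1001.png
Snapshot "$SNAPSHOTSDIR$/grab$FILEIDX$.png"
Snapshot "$SNAPSHOTSDIR$/grab$FILEIDX$.png"
```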
One of the recurring user questions about OpendTect is whether there is a quick way to do some kind of repetitive task: loading a huge number of wells, importing multiple 2D lines of SEG-Y data, etc. Such a service can only be offered by OpendTect itself if the task is very simple, common and straightforward. For example, one does have the possibility to select multiple horizons with the mouse in order to load them in one go. However, if the repetitive task is more complex, unique, or variable, the workflow can be automated by means of a command script.
Automating a repetitive task consists of three stages. Firstly, the Command Recorder is applied to record the mouse and keyboard actions needed to perform the task once. Secondly, a text editor is used to modify the recorded script: some of the recorded actions have to be generalized, and a few new commands have to be added to make the script iterative. These commands are listed in Table III. Thirdly, the Command Driver is applied to run the modified script, initially to debug it and finally to perform the repetitive task.
Listed below is the recorded command script that makes a snapshot of one attrib-
ute on one 2D line. The passages that need to be generalized have been printed
in red.
[Create snapshot]
Button "Screen" On
Input "Select filename" "/d43/jaap/surveys/Demo2D/Snapshots/dump.png" Hold
Button "Ok"
Listed below is the modified command script after it has been generalized and
made iterative. All changes with regard to the originally recorded script have been
printed in green.
"Display`2D Viewer - VD"
If res==FAILURE ; Return ; Fi
[Create snapshot]
Button "Screen" On
Input "Select filename" "$SNAPSHOTSDIR$/$pic_name$"
Hold
Button "Ok"
For setidx = 1
  For nameidx = 1
    For attridx = 1
      res ? dumpAttribute( setidx, nameidx, attridx )
      If !res ; Break ; Fi
    Rof
    If attridx==1 ; Break ; Fi
  Rof
  If nameidx==1 ; Break ; Fi
Rof
l Substitution ( $...$ ) of identifier values into command actions.
l Backslash ('\') to spread a long command over multiple lines.
l If-command to execute command actions conditionally.
l Use of predefined identifiers: constant FAILURE and environment variable SNAPSHOTSDIR.
l Semicolons (';') to separate multiple commands on one line.
l Return-command to terminate a procedure immediately.
l Use of assignment and built-in function calls ( pic_name = strCat(...,...) ).
l wildcard()-function to get matching strings from the latest successful wildcarded command action. Note that these strings must be secured before the next use of a wildcard ( i.e. in the window assertion [2D Viewer - Line: *] ).
l For-loops to make a script iterative.
l Call to a user-defined procedure ( res ? dumpAttribute(...,...) ).
l Break-command to escape from loops immediately.
Note that different operators are used to store the result of an expression (=), a com-
mand action (~), or either a user-defined procedure (?) or a question command (?)
into an identifier. This eases the parsing of commands by the Command Driver,
and it should make the user aware that the allowed complexity of (sub)-expres-
sions does not go beyond the built- in function calls. The results of command
actions, user- defined procedure calls, and question commands have to be
assigned to auxiliary identifiers first.
The introduction of question commands allows the body of the previous script to be
written in a different style. An identifier can take the answer to one of the many
questions (Nr..., Cur..., Is..., Get...) from Table IV to VII about some property of a
user-interface element. Instead of applying the Break-command to escape from
loops when the procedure call to dumpAttribute (...,...) fails, now the number of
items to iterate over can be asked and set before entering a loop ( nritems ?
NrTreeItems ... ). Note that a procedure defined with a return parameter can also be
called without. Listed below is the restyled body of the command script above.
dumpAttribute( setidx, nameidx, attridx )
Rof
Rof
Rof
In the 'doc' directory of the release, you can find a 'Scripts' subdirectory. It contains the standard test scripts for OpendTect. These test scripts all work on the survey 'F3_Demo', the demo data set for OpendTect.

The directory contains several scripts, many of which can be run stand-alone, but certainly not all. There is also an execute-all script: 'AllScripts.cmd'. Another composite script is 'AllAttributes.cmd', which will make snapshots in the Snapshots directory. This Snapshots directory is created automatically; its location is your_surveys/F3_Demo/Snapshots. The snapshots are created with an index followed by the file name.
Some scripts depend on plugins such as SSIS, VMB, etc. These scripts are located in dgb/doc/Scripts. To run these scripts you should make sure the related plugins are loaded.
The CmdDriver plugin also offers the possibility to record the user action history in the background from the moment OpendTect is started. It will be stored in the file userhistory.odcmd in the Proc directory of the starting survey. It is not guaranteed that the Command Driver can offer a full reproduction of the past by running this file, since not all possible user actions are covered yet. Mouse actions performed in 3D scenes or 2D viewers are neither recorded nor executed for the time being. Actions performed in the Command Controller window are recorded, but not executed, as long as the difficulty of calling the Command Driver recursively has not been solved. Nevertheless, the recorded history can be of great help in reproducing a bug or crash reported by a user.
In order to enable recording of the user history, the user setting dTect.User history
buffer must be set to a value other than zero. The magnitude of this value defines
the size (in characters) of the buffer in which the user actions are stored
temporarily. A positive size value means flushing the content of the buffer to file every
time it overflows. A negative size value means dropping the oldest content once
the buffer starts overflowing, and flushing only the newest actions to file when
OpendTect finally exits or crashes. The menu item Utilities->Settings->Advanced-
>Personal settings will pop up a dialog in which this user setting can be added or
adapted.
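For example (a sketch, assuming the plain key: value layout of OpendTect personal settings; the value is illustrative), an entry that keeps only the newest 100000 characters of user actions would read:

```
dTect.User history buffer: -100000
```

A positive value such as 100000 would instead flush the buffer to file every time it overflows.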
13 Appendix C - SEG-Y Checklist
This document contains examples of SEG-Y loading problems and proposes solu-
tions for the most often encountered problems.
In all cases, the SEG-Y import tool must be launched from the survey menu (Import
--> Seismics --> SEG-Y). Enter the settings needed for the first step and press OK.
In the next window (main import window) press scan and perform the scan. An
examine window will pop up. Display the first traces in the 2D viewer.
Once you have this on screen, check the most appropriate situation in the list below:
l SEG-Y checklist
l Excel utilities
The top of the examine window may look like either of these two windows, although
there is an enormous variety in textual headers:
The left picture is a very common empty (automatically filled) textual header. Sometimes
an operator fills the empty parts, but you can expect mistakes to occur. The
second picture is a textual header from an OpendTect-exported SEG-Y file. All the
fields were directly copied from the project database (additional edits are possible).
l Problem: If you do not see the above described structure you might not be reading a
SEG-Y file at all, but maybe a seismic file written in a different format (binary, SEG-D,
...).
l Solution: Translate your file if possible.
l Problem: Some characters in the textual header look garbled.
l Solution: They were badly translated from EBCDIC to ASCII (or the opposite), either
when writing the file or when reading it. Unfortunately there is no one-to-one translation
between EBCDIC and ASCII, therefore this issue cannot be solved. That is why
EBCDIC coding has been banned from the SEG-Y revision 1 norm.
The line header overview is provided in the top part of the examine window, below
the textual header. Use the right scroll bar to reach it. Only the non-zero values are
shown, unlike the other headers, where the entire content is reported. A
correct line header will look like this in the OpendTect examine window:
The title line, field names (first column), byte offsets (second column) and explan-
ations between brackets (fourth column) are provided by OpendTect. Only the num-
bers in the third column originate from the file. The byte numbers indicate where
those values were found.
Look at the field isrev1 (301): if the value is zero, your file is revision 0 compliant.
If the value is equal to 1, then your file is (normally) revision 1 compliant. This
enables you to answer the revision 1 question.
Problem: If none of the values are reasonable but they contain a lot of 256s and
16777216s, or in general if none of the numbers make sense, the file might be byte-
swapped (i.e. written using "little-endian" byte ordering instead of "big-endian").
See the example below:
l Solution 1: In that case, cancel the import window (that will bring you back to the first
step), and set the "Byte swapped" parameter to "all".
l Solution 2: If this is not enough, the textual and/or line headers might be corrupted.
Try to re-create your SEG-Y file if possible.
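Those tell-tale numbers are simply the value 1 read with the wrong byte order. A minimal illustration in plain Python (not part of OpendTect):

```python
import struct

# A 16-bit header value of 1, written big-endian but interpreted
# little-endian, reads as 256; a 32-bit value of 1 reads as 16777216.
# Seeing many such numbers in the line header strongly suggests a
# byte-swapped SEG-Y file.
big16 = struct.pack('>H', 1)
big32 = struct.pack('>I', 1)
print(struct.unpack('<H', big16)[0])  # 256
print(struct.unpack('<I', big32)[0])  # 16777216
```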
Problem: The line header is truncated. The line header should be 400 bytes long.
It will not be possible to deduce the start position of the first trace if any of the first
3600 (= 3200 + 400) bytes are missing. If the line header is truncated, you
should see a lot of non-standard entries in the examine window, like in the
example below:
l Solution: Use the Excel utilities to determine if your file may be missing some bytes. If
that is the case, you need to re-create your file, unless you know exactly how many
bytes are missing and how to lengthen your line header.
l Problem: The bytes are swapped.
l Solution: See above.
The first trace header is readable but no others - zero sample rate - last trace
incomplete
This happens when the trace size could not be computed successfully. The trace
size is a function of the sample size (format) and the number of samples. The problem
occurs when either of those variables was not correctly written in the headers.
Action 1: Check if the sample size (format) is correct in the line header. The
sample format is reported in the examine window, in the line header, in front of the
"format" field. It must report a value of 1, 2, 3, 5 or 8. If not, the following warning will
be shown when scanning or loading the file:
l A value of 1, 2, 5 represents a sample size of 4 bytes.
l A value of 3 represents a sample size of 2 bytes.
l A value of 8 represents a sample size of 1 byte.
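These sample sizes translate directly into the expected trace size (240-byte trace header plus the data samples). A minimal sketch in Python (the dictionary and function names are illustrative, not OpendTect API):

```python
# Bytes per sample for each SEG-Y format code (1, 2, 5 -> 4 bytes;
# 3 -> 2 bytes; 8 -> 1 byte).
SAMPLE_SIZE = {1: 4, 2: 4, 3: 2, 5: 4, 8: 1}

def trace_size_bytes(fmt, ns):
    """Expected size of one trace: 240-byte trace header + ns samples."""
    return 240 + ns * SAMPLE_SIZE[fmt]

print(trace_size_bytes(5, 1001))  # 4244
```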
Furthermore, the warning "Err: Warning: replacing zero sample rate with survey
default" is printed (but that is not the only cause) if the actual sample size is larger
than the sample size deduced from the headers or overruled.
Conversely, the warning "last trace incomplete" is printed (but that is not the
only cause) if the actual sample size is smaller than the sample size deduced from
the headers or overruled.
l Solution: Overrule the SEG-Y format to another format until the line header and all
trace headers are readable in the examine window. Scan your file and check the output
amplitudes, since three different formats are available for a sample size of 4 bytes.
A proper display of the data in the 2D viewer indicates success.
l Action 2: Check if the number of samples was correctly extracted from either the line
or trace headers: The number of samples is reported at multiple positions in the SEG-
Y file:
l In the examine window in the line header overview in front of the "hns" field.
l In each trace header at byte offset 115 (field "ns").
l OpendTect will only use the number of samples defined in the trace headers. Therefore
the line header field "hns" might be missing or in contradiction with the trace
headers, with no consequence for the data loading, as long as the trace header
"ns" fields are correctly written.
l Solution: Overrule the SEG-Y number of samples to another value until all trace
headers are readable. The excel utilities might be helpful to check the correct value
and to check if the file is not missing some bytes.
The coordinates may be found in each trace header at bytes 73 (CDP-X) and 77
(CDP-Y), or 181 and 185 respectively for SEG-Y revision 1 files. However, those
values will be scaled during scanning and loading by the coordinate scaler that should
always be found at bytes 71-72.
l Problem: The coordinates found in the trace headers do not make sense. Your file
might contain trace header coordinates encoded as floats instead of integers. This is
not allowed by any SEG-Y standard, although it is still encountered sometimes.
l Solution: Get a SEG-Y compliant file.
l Problem: The unscaled coordinates are reasonable, but after scanning/loading the scaled
coordinates are not correct.
l Solution: Overrule the coordinate scaler to the right value in the import window. You
need to specify a number that, when multiplied by the trace header coordinate, returns
the actual (scaled) coordinate. Therefore the trace header scalco "-10" would
have to be overruled by 0.1 in order to get the same scaling.
Please note that it is not possible to apply a static shift (easting/northing) to the
coordinates; however, this is not needed by OpendTect. You can load
the SEG-Y file first and apply your shift to the survey coordinates afterwards in the
survey definition window.
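The scalco convention can be summarized in a few lines (a sketch, following the standard SEG-Y rule that a positive scaler multiplies and a negative one divides; the function name is illustrative):

```python
def apply_scalco(raw_coord, scalco):
    # SEG-Y coordinate scaler (bytes 71-72): positive values multiply,
    # negative values divide, zero means no scaling. A scalco of -10
    # is therefore equivalent to an overruled scaler of 0.1.
    if scalco > 0:
        return raw_coord * scalco
    if scalco < 0:
        return raw_coord / -scalco
    return raw_coord

print(apply_scalco(6071234, -10))  # 607123.4
```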
This is linked to the sample format not being correctly set. The three formats 1, 2
and 5 encode the data in four bytes. You may need to switch and overrule between 1, 2
and 5.
Neither a SEG-Y file nor an OpendTect volume needs to be rectangular, i.e. they
do not need to contain all traces of a rectangular survey. This is normal, except if you
expect a rectangular volume. Please note that the default setting in OpendTect is
to dismiss null traces, i.e. traces where all samples have a zero value.
l Problem: The warning "during import 123450 traces were rejected" appears when loading
the file.
l Solution 1: Display the loaded file. The warning sometimes appears erroneously, in
which case your file may already be correctly loaded.
l Solution 2: Make sure that the survey area is large enough to accommodate your new
volume.
In general
Solution: Use the Excel utilities to compute the number of traces you can expect to
have in your SEG-Y file, and compare it with the actual number of loaded traces
reported in the scan report. OpendTect will be able to load all full traces until the
first missing byte is found in the input file or until the end of the file, even if the end
of the file is in the middle of a trace. In that case only the last, incomplete trace
will not be accessible.
The loaded volume is shifted with respect to the others - does not start at
zero
Sometimes the first sample does not correspond to time or depth 0. If that is the
case, the corresponding time or depth should be reported in each SEG-Y trace
header at bytes 109-110 (delrt) and/or 105-106 (laga), with opposite polarity, as
in this example:
"laga" is equal to -200, "delrt" is equal to 200: both mean that the first sample
corresponds to time 200 ms.
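Reading those two fields from a 240-byte trace header can be sketched as follows (plain Python; the function name is illustrative, and both fields are assumed to be signed 16-bit big-endian as in standard SEG-Y):

```python
import struct

def start_time_ms(trace_header):
    """Start time of the first sample from a 240-byte SEG-Y trace header.

    delrt lives at bytes 109-110 (0-based offset 108), laga at bytes
    105-106 (offset 104); they carry the start time with opposite polarity.
    """
    laga = struct.unpack_from('>h', trace_header, 104)[0]
    delrt = struct.unpack_from('>h', trace_header, 108)[0]
    return delrt if delrt != 0 else -laga

hdr = bytearray(240)
struct.pack_into('>h', hdr, 104, -200)  # laga = -200
struct.pack_into('>h', hdr, 108, 200)   # delrt = 200
print(start_time_ms(hdr))  # 200
```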
l Problem: The start time is not specified in the trace headers but is expected to be different
from zero.
l Solution: Overrule the start time parameter in the import window.
l Problem: An incorrect start time was applied during loading.
l Solution: Either re-import the file while overruling the start time, or use the reference shift
attribute to apply a static shift to your traces.
Please note that in all cases you must have a priori knowledge of the start time.
Excel utilities
l Compute the SEG-Y file size based on the sample format, number of traces and number
of samples. A match with the actual file size indicates a SEG-Y file without
missing bytes.
l Compute the number of traces present in the SEG-Y file based on its size, the sample
format and the number of samples per trace, assuming a constant trace length. If the
returned number is an integer, the SEG-Y file does not contain holes (except for a
very unlikely coincidence).
Go to the second tab in the Excel sheet to fill in the trace numbers.
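The two computations can also be sketched in a few lines of Python instead of the spreadsheet (assuming a constant trace length and the standard byte counts; the function names are illustrative):

```python
# Bytes per sample for each SEG-Y format code.
SAMPLE_SIZE = {1: 4, 2: 4, 3: 2, 5: 4, 8: 1}

def expected_file_size(n_traces, ns, fmt):
    # 3200-byte textual header + 400-byte line header + the traces,
    # each 240-byte trace header plus ns samples.
    return 3600 + n_traces * (240 + ns * SAMPLE_SIZE[fmt])

def trace_count(file_size, ns, fmt):
    # A non-zero remainder suggests missing bytes somewhere in the file.
    return divmod(file_size - 3600, 240 + ns * SAMPLE_SIZE[fmt])

size = expected_file_size(1000, 463, 1)
print(trace_count(size, 463, 1))  # (1000, 0)
```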
14 Appendix D - Wacom Digitizing Tablets
l Introduction
l Pen Device
l Basic Interaction
l Draw Polygons
l Create Bodies
l Interpret Horizons
l Manually Edit Horizons
l Interpret Faults
l Supported Platforms
Introduction
OpendTect combined with a Wacom tablet has become a key application for
hand-eye coordination, and has proven valuable for manual interpretation in
OpendTect. Many of the OpendTect interpretation workflows were modified in
version 4.2 for optional use of a tablet device. This documentation therefore provides
a brief introduction to seismic interpretation on a tablet, and assumes basic
familiarity with the OpendTect environment.
Before you start using the documentation, a general introduction is given on
the pen device and the features that are supported by OpendTect.
Pen Device
The following list describes how the pen device replaces traditional mouse fea-
tures:
l Mouse clicks are replaced with the clicks being made by the tip of the pen device
l The mouse drag function is replicated by dragging the tip of the pen on the tablet
device
l The right/left/double click buttons are supported via the DuoSwitch
The only thing required is to hold the pen device in your hand and begin interpreting
the seismic data. If you want to draw a seismic object, e.g. a horizon or polygon,
you simply drag the tip with light pressure over the tablet.
Light pressure here simply means that you touch the tablet screen using the tip of
the pen. The tablet is pressure sensitive and will automatically detect that the user
intends to draw on screen. It converts the screen coordinates back to the survey
coordinates and stores the object.
Furthermore, the Eraser can be used to remove (parts of) the interpretation (specifically,
the seeds). The eraser feature is accessed either by clicking on the node/seed
to remove it, or by rubbing it over the interpretation (drawn lines).
The DuoSwitch is used to launch pop-up menus or for double clicking on an element.
The single left mouse-button click is accessed through the tip of the pen (by
'tapping' on the screen).
Precautions:
l When the pen is not in use, please place it back in the pen stand
l Please avoid using the mouse while the pen device is in your hand. Use either
a pen supported by the Wacom tablet or the regular installed mouse
Basic Interaction
Menus/Icons:
The menus/icons are clicked using the tip of the pen device. The tip is tapped over
the (sub)menu/icon to launch the corresponding dialog/application.
Display an Element:
To display an element in a scene, you will need to use the Tree of OpendTect. For
instance, to display an Inline, simply place the tip of the pen at inline and select the
Add option.
Pop-Up Menus:
Pop-up menus are launched using the lower button of the DuoSwitch. The pop-up
menu options are selected by tapping the item with the tip of the pen.
Lines/Seeds/Points:
The lines, seeds or points (pointsets) in OpendTect are drawn by dragging the tip
of the pen over the displayed element in the 3D scene, similarly to drawing with a
pen on regular paper. The drawn line/point data can be removed interactively
using the Eraser at the end of the pen.
Draw Polygons
The first example in this tutorial is creating a simple polygon on a horizon using the
tablet device. Although the workflow is simple, several options and features are
introduced so that you familiarize yourself with other important features
simultaneously. Please follow the steps set out below:
In the following workflow, use the pen device instead of mouse. Tap/Press/Select
in this workflow refers to the tip of the pen device.
1. First, display a seismic horizon. Tap the tip of the pen on the Horizon element in
the tree; a pop-up sub-menu appears
2. Select Load
3. The Horizon Selection dialog is then launched, from where you can select one or
more saved horizons
4. Once a seismic horizon on which you want to draw a polygon is displayed, you
can continue to the next steps: drawing a polygon
5. The Polygon Creation dialog box is opened, type a name for the new polygon. For typ-
ing, you may use either the conventional keyboard attached to your computer or the
virtual keyboard supported in OpendTect. The virtual keyboard is launched using the
lower DuoSwitch button (equivalent to the right-click button on a conventional mouse)
whilst pointing the pen at the name field. Once done typing, simply close the virtual
keyboard: the new polygon name is automatically inserted in the field
6. Hit Ok in the parent Polygon Creation dialog
7. A blank polygon is added in the tree, with the given name (step 5)
8. Optional: before drawing on the horizon, change to map-view by selecting View
North-Z in the main toolbar to the left
9. Make sure the polygon element in the tree is active (tap it) and start drawing the polygon
on the horizon. Two methods are available: (1) drag and release, in which you
drag the pen over the area where you want to draw the polygon, or (2) pen tapping,
in which you tap on the horizon to insert seeds that are then connected
automatically
10. Whilst drawing the polygon, unwanted points can be removed with the eraser at the
end of the pen
11. Finally, close the polygon: Right click (lower button on the DuoSwitch) on the polygon
name in the tree and select Close Polygon option from the pop-up menu.
Create Bodies
Bodies are easily created after drawing a polygon. The workflow requires a com-
bination of a saved polygon and top and bottom horizons between which the body
will be created.
1. Display a stored polygon (Load) or draw a new one (New) by following the workflow above
2. From the polygon pop-up menu (use the lower DuoSwitch), select Create body
3. In this dialog, select top and bottom horizons
4. Hit the Ok button
5. The above step (4) creates the body with an empty name and displays it in the scene.
In the tree, the <New MCBody 1> sub-element appears under the Body menu. Save it
by launching the body's pop-up menu with the lower DuoSwitch
Interpret Horizons
Seismic horizon interpretation is fast and convenient on the Wacom tablet device,
mainly because of how naturally the pen handles. To help you get started
with interpreting seismic horizons with OpendTect on the tablet device, simply grab
your pen and follow the workflow set out below.
Please use the pen device instead of mouse in the following workflow.
Tap/Press/Select all refer to the tip of the pen device.
1. A flat (orthogonal) display has proven useful when interpreting horizons on seismic
data. This view enables you to view the inlines/crosslines/z-slices as '2D' planes. Sim-
ply switch the default perspective view to orthographic:
Use the pen device to tap the button to toggle the orthographic view on/off.
2. If you wish to interpret horizons on inlines, you may want to select the View Inline dis-
play option from the Graphical Toolbar.
3. Optional: You may also adjust the zoom of the display using the touch strips available
on the back side of the tablet.
4. Display seismic data in the scene by tapping either Inline or Crossline in the tree. By
default the data is displayed as a white colored 'empty' element in the centre of the
survey. Select data by tapping the <right-click> sub-element. Use the lower
DuoSwitch to access the sub-element and select the seismic data to be displayed (as
shown below).
Once the seismic data is loaded, you can proceed to the next step
5. Use the pen tip to add a new horizon in the tree. Tap the Horizon element in the tree
using the pen. A drop-down list appears; select New...
6. A new horizon is added in the tree, labeled <New Horizon 1> by default. A pop-up dia-
log also appears (i.e. the Tracking Setup).
7. The tracking setup window features four tabs, presented briefly below. Please refer to
the general help documentation of OpendTect for detailed descriptions.
Mode: Select tracking mode i.e. auto-tracking, tracking on a line, or tracking manu-
ally.
Event: Used if the mode is set to either auto-tracking or line tracking. Select seismic
data in the Input data field and provide the event type (Peak/Trough/Zero-crossing).
Please note that Max refers to peak, Min refers to trough, 0+/- refers to
positive-to-negative zero crossing, and 0-/+ refers to negative-to-positive zero crossing.
Use the default search window as a starting point. These settings can be
accessed and changed later. Also, use the default step-wise-tracking options, i.e.
Relative (amplitude) difference of 1, 2, 5, 10, 20 …%. Leave the other settings at
their defaults.
Similarity: Matches the seismic events based on the picked seeds and searches
for the corresponding signal in the immediate vicinity. Please leave this blank for
now. [Tip: For fast tracking in a good quality area, reduce the threshold (e.g. 0.5)
and the time gate (e.g. -16 and +16).]
Properties: Change horizon colour and the seed properties (e.g. size, colour and
shape).
8. By default the seed mode is toggled on in the OpendTect tracking toolbar (shown
below). Note that volume tracking is set on by default (i.e. tracking in a small sub
volume)
9. Next, start interpreting by picking a seismic event using the pen device. [Tip: Drag the
pen over a seismic event if the area is coherent and of good quality, otherwise use the
pen clicks on the event to drop seeds]
10. Move the inline position and continue interpreting over the entire survey.
11. In order to remove a seed, simply use the Eraser on the end of the pen. Optionally,
use the ExpressKeys on the tablet device (subject to availability).
l Pen tap on a plane = pick and local track
l Pen tap with Ctrl + Shift express keys pressed = drop/undrop a seed at the pen location
l Pen tap with Ctrl express key = remove the seed and track locally
l Pen tap with Shift express key = remove the seed and erase auto-tracking from that
seed until the next seed(s)
12. After interpreting a good part of the survey area, test the auto-track settings by spe-
cifying a small auto-tracking area. Select the show tracking area button in the track-
ing toolbar.
13. A 3D boundary box appears around the interpreted horizon. Use the green anchors at
the corners of the box to re-size or move it: place the pen tip at one of the anchors and
drag in the respective direction (diagonal anchors resize the entire box in 3D with
equal proportion, whilst the others stretch/squeeze the box horizontally/vertically).
[Tip: toggle view mode to perspective to see the whole box.]
14. Once the tracking area is defined, tap anywhere in an empty area in the scene to
read/load seismic data within the specified tracking volume. [Tip: Preferably, tap out-
side the survey area with a zoomed-out view.]
15. Press the auto-track button to start auto-tracking the horizon within the tracking
area. [Tip: It is recommended to save the raw interpretation as a separate file before
attempting to auto-track.]
Manual Tracking:
16. Manual tracking is accessed by changing the tracking setup to Line manual
17. With Line manual selected, you will just need to draw horizons using the pen [Tip:
Drag the pen over the event and release once done.]
18. Move about 5 or 10 inlines (or cross-lines) forward/backward and repeat manual inter-
pretation. [Tip: Instead of interpreting the data in 3D, you may interpret the data in a
2D viewer]
Save Horizon(s):
Click on the save button in the tracking toolbar to save the <New Horizon 1>;
alternatively, right-click the horizon in the tree menu and choose either
Save or Save as...
Save Session:
Save the session to continue interpreting the horizon at a later time if it is not
completed
Manually Edit Horizons
This is an important exercise where you will learn how to edit a horizon in
OpendTect with your tablet device.
1. In map view, draw a polygonal area using the polygon selection tool (from the
Graphical Toolbar to the left).
2. Click the horizon from the tree (to activate it) and press the trash icon to remove the
outlined portion of the horizon
3. Now switch to inline/cross-line view in the area of the removed polygon
4. Enable the tracking mode; use the DuoSwitch to launch the pop-up menu of
the horizon and select the Enable tracking option under the Tracking sub-menu. This
will enable the tracking controls, and you can now edit the horizon either in the 3D
scene or in a 2D Viewer
5. From the tracking controls available at the bottom of OpendTect by default, set setup
to Line manual
6. In this exercise, we will edit the horizon in a 2D viewer:
7. Display the inline in a 2D Viewer (use the lower DuoSwitch as shown above)
8. In the 2D viewer, you will have to display the same horizon
9. Toggle Edit mode ON (or as shown below)
10. Now, start editing the horizon by drawing over an event using the pen
11. Move the inline/cross-line forward or backward by 10 lines, and repeat the interpretation
12. Save the horizon after editing. This can be done directly from the tree available in the
2D Viewer
Interpret Faults
In this manual, you will learn both methods of interpreting faults in a 3D survey.
In the following workflow, use the pen device instead of the mouse. Tap/Press/Select
in this workflow refers to the tip of the pen device.
1. Display an inline (or a cross-line) in the scene. [Make sure that the seismic data has
already been displayed along the displayed inline]. Optional: Position the inline at the
location where you want to start the 3D fault interpretation. To position it, you may
use the slice position controls
2. Click on the Fault element to add a New Fault sub-element in the tree. Next, make
sure that it is selected / active
3. Now in the scene, start drawing the fault stick on the inline. [Drag the pen over the
inline at fault plane location]
4. To remove a seed of a fault plane, you may use the Eraser of the pen
5. To position a seed at a new location, you may move the seed by clicking and dragging
it in any direction
6. Move (step) the inline to the next position to interpret another stick of the fault at a new
location (5 or 10 inlines forward or backward, or smaller steps if the continuation is
unclear). [Tip: Display the fault plane at sections only. For this you may use the
lower button of the DuoSwitch to launch the pop-up menu. In the pop-up menu please
select the Display option]
7. Repeat the steps to interpret the fault on other inline/cross-lines
8. Use the lower button of DuoSwitch to Save the <New Fault 1>
1. Preload the seismic data on which you intend to interpret faults. [Tip: Use the
Survey menu i.e. Survey > Preload > Seismics]
2. Display an inline (or a crossline) in the scene. [Make sure that the seismic data has
already been displayed along the displayed inline]
3. Click on the Fault element to add a New Fault sub-element in the tree. This will add a
<New Fault 1> fault under Fault element. [Make sure that it is clicked / active]
4. Use the lower button of DuoSwitch on the Inline to launch the drop-down list and
select Position... In the Positioning dialog, please click on the Scroll button
5. In the scrolling dialog, set the scroll step (i.e. the number of inlines/cross-lines to move;
use a positive number for forward scrolling and a negative number for backward scrolling)
and the time to scroll the inline to the next position (use, for example, 5 seconds)
This workflow allows for interpreting fault sticks only, which can later be converted
to 3D fault planes. The benefit of this workflow is that you can interpret multiple
sticks on an inline/crossline.
1. Display an inline/crossline in the scene. [Tip: Click on the Inline element to add a
new inline]
2. Add a new FaultStickSet in the tree. [Click on the FaultStickSet, and select the Add
option in the pop-up menu.]
3. Start drawing multiple fault sticks in the scene. To split sticks, use lower DuoSwitch
4. If you want to move a node of a fault stick, place the tip of the pen over the node to be
modified. Click and drag the node in 3D and position it to a correct location
5. Save the FaultStickSet by launching the drop-down list [Tip: Use the lower
DuoSwitch button.]
6. Move the inline/crossline to the next position and continue the interpretation
7. While moving the inline/crossline, you may still observe the sticks from previously
interpreted sections. To hide them (and therefore avoid confusion), please display the
fault sticks at sections only. [Tip: Use the pop-up menu for the fault stick set.]
To convert fault sticks into fault planes, you will need to familiarize yourself with the
Fault Sticks Toolbar (shown below). By default, this toolbar appears at the bottom
of the OpendTect window.
1. Display a time slice of similarity attribute, so that you can identify the fault trends
2. Display the fault sticks in a 3D scene. [Tip: Use the pop-up menu for the fault stick
set.]
3. Optional: Position the time slice (step-a) where you can see the tops of the fault
sticks
4. Activate the select sticks button
5. Make sure that you are in interact mode
6. In the scene, draw a polygon to select the sticks that you want to convert into a fault
plane. [Use the pen device and draw a red colored polygon.]
7. Once the tip of the pen is lifted away from the tablet, observe that the sticks within the
polygon turn green. This means that the sticks have been selected (and can be con-
verted to a fault plane)
8. Next, copy or move the selection to a single new fault plane. This option is illustrated
in the above fault stick toolbar
9. Give a name in the text field of the toolbar
10. Hit the Go button to save and display the fault plane in the scene and tree
l The 'Copy selection to' option is used to copy the selected fault sticks to a fault plane
without removing the fault sticks from the original fault stick set. In contrast, the
'Move selection to' option removes the selected fault sticks from the original fault stick
set and moves them to a fault plane.
l The Fault/FaultStickSet option is used to convert the selected fault sticks to a fault
plane or to another fault stick set.
l There are different ways to name faults/FaultStickSets. This is done via the output
operations list box i.e. Create single new (to create a new single fault plane or a fault
stick set), Create new in series (automatically labels the faults with a numeric index),
Merge with existing (to merge the fault plane to an existing fault plane or fault stick
set), and Replace the existing (replace the selected fault plane with the newly selec-
ted sticks).
l The trash button in the toolbar is used to remove the selected fault sticks
Supported Platforms
Officially, the Wacom Cintiq 21UX and Wacom Cintiq 24HD are supported on
Windows and Mac only.
There is an Open Source group that writes and maintains drivers for the Wacom
tablets.
For more information please go to the Sourceforge Linux Wacom page.
15 Appendix E - Synthetic Data
Generation
l Ray Tracing
l Computation of the Zero Offset Reflection Coefficient
l Computation of the Reflection Coefficient at any non-zero offset
l Elastic Model
Here various types of synthetic data can be generated: Zero Offset Stack, Pre
Stack gathers, Angle Stack and AVO Gradient:
Ray tracing
The ray goes directly from the source down to the depth of the target layer, and up to
the receiver in the same way. This does not account for ray bending or velocity
inversions. Here the user has to specify the offset range and the step for creating
pre-stack gathers; they could in theory be the same as defined in the acquisition/processing
of the seismic data. It can model both downgoing and upgoing P-waves
and S-waves. The ray tracer and the Zoeppritz equations produce
angle-dependent or offset-dependent reflectivity traces, which can be convolved
with a user-defined wavelet to produce pre-stack gathers. It may be noted that in
SynthRock, the conversion from offset domain to angle domain and vice versa is done
using the Vp of the Elastic Model (which is essentially the upscaled and
time-converted Vp log of the pseudo-wells).
This works in a more sophisticated way than the simple ray tracer. It honours the
ray bending according to Snell's law, and thus velocity inversions as well. To
reduce the processing time, the Elastic Model layers may be blocked: consecutive
layers with similar Vp, Vs and Density values are concatenated together, as
defined by the threshold. For example, the default threshold is 1%, which means
that if there is less than 1% difference in the elastic model values of two layers, they
will be blocked. The ray is propagated in a straight line inside a concatenated layer.
It is also possible to compute internal multiples in the advanced ray tracer.
Furthermore, incorporation of spherical divergence is also possible, by defining the
spreading geometry as either "Distance" or "Distance*Vint".
Afterwards, NMO corrections can be applied to create NMO-corrected synthetic
gathers. Here, in the Advanced options, one can specify the % stretch mute typically
applicable at far offsets. If the length of a full seismic waveform increases by
more than the mute %, it will get muted. Moreover, the taper length of the muting
function can be defined under this advanced options menu of the NMO corrections:
Advanced RayTracer: Advanced corrections options
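The hyperbolic moveout and the stretch-mute test described above can be sketched as follows (Python, illustrative only; OpendTect's implementation details may differ):

```python
import math

def nmo_time(t0, offset, vnmo):
    """Two-way traveltime t(x) = sqrt(t0^2 + (x / v)^2)."""
    return math.sqrt(t0 ** 2 + (offset / vnmo) ** 2)

def stretch_percent(t0, offset, vnmo):
    """NMO stretch expressed as relative waveform lengthening in percent."""
    return (nmo_time(t0, offset, vnmo) / t0 - 1.0) * 100.0

def is_muted(t0, offset, vnmo, mute_percent):
    """True when the stretch at this sample exceeds the mute threshold."""
    return stretch_percent(t0, offset, vnmo) > mute_percent

# At t0 = 1 s, v = 2000 m/s and offset = 1000 m the stretch is ~11.8 %,
# so a 10 % stretch mute would mute this sample:
muted = is_muted(1.0, 1000.0, 2000.0, 10.0)
```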
Computation of the Zero Offset Reflection Coefficient
For the simplest Zero Offset Stack, the reflection coefficient at any interface is computed using the simple formula:
R = (Z0 - Z1) / (Z0 + Z1)
where Z1 and Z0 are the impedances of the top and bottom layers, respectively.
These layers are essentially upscaled and time-converted versions of the various logs (Rho, Vp and Vs) in the pseudo-well models, and as such comprise the Elastic Model for synthetic seismic generation. The upscaling is done using the Backus averaging algorithm in depth, but at a (variable) depth sampling rate which is equivalent to the seismic sample rate in time. Depth-to-time conversion of the pseudo-well logs is done using the velocity model of the pseudo-wells itself.
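The zero-offset reflection coefficient formula can be illustrated as follows (Python, with made-up layer values):

```python
def reflection_coefficient(z_upper, z_lower):
    """Normal-incidence reflection coefficient at an interface, from the
    acoustic impedances (density * Vp) of the layers above and below."""
    return (z_lower - z_upper) / (z_lower + z_upper)

# Upper layer: rho = 2200 kg/m3, Vp = 2000 m/s; lower layer: 2400, 3000
z_upper = 2200.0 * 2000.0
z_lower = 2400.0 * 3000.0
r = reflection_coefficient(z_upper, z_lower)   # ~0.24
```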
Backus upscaling is done only for the Vp, Vs and Density logs (and other logs based on them, e.g. AI, LambdaRho, MuRho etc.). All other logs, e.g. Phi, Sw etc., are upscaled using thickness-weighted averaging (i.e. the weights used for the averaging are the thicknesses of the pseudo-well layers) and are afterwards converted into time (using the velocity model of the pseudo-wells) at the survey sample rate. A Nyquist filter, as defined by the survey sample rate, is also applied to these time-converted rock property traces; e.g. if the seismic survey is sampled at 4 ms, the Nyquist filter allows a maximum frequency of 125 Hz. These traces are accessible to the user in real time in the Variable Density View:
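The two averaging schemes can be sketched as follows (Python; for Backus averaging only the normal-incidence P-wave case is shown, where the effective P-wave modulus is the thickness-weighted harmonic average of rho * Vp^2 — a simplification of the full anisotropic Backus result):

```python
def thickness_weighted_average(thicknesses, values):
    """Upscaling used for property logs such as Phi or Sw."""
    total = sum(thicknesses)
    return sum(t * v for t, v in zip(thicknesses, values)) / total

def backus_vp(thicknesses, vps, rhos):
    """Effective vertical Vp of a stack of isotropic layers:
    harmonic average of the P-wave modulus M = rho * Vp^2,
    arithmetic (thickness-weighted) average of the density."""
    total = sum(thicknesses)
    m_inv = sum(t / (rho * vp ** 2)
                for t, vp, rho in zip(thicknesses, vps, rhos)) / total
    rho_eff = thickness_weighted_average(thicknesses, rhos)
    return ((1.0 / m_inv) / rho_eff) ** 0.5

thk, vp, rho = [10.0, 30.0], [2000.0, 3000.0], [2200.0, 2400.0]
vp_backus = backus_vp(thk, vp, rho)                 # ~2596 m/s
vp_weighted = thickness_weighted_average(thk, vp)   # 2750 m/s
```

The Backus average is always at or below the thickness-weighted average, which is why the two schemes give different upscaled traces.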
This reflectivity trace is then convolved with a user-defined wavelet to create the Zero Offset Stack for all the pseudo-well models:
Computation of the Reflection Coefficient at Non-Zero Offset
Pre Stack data (i.e. offset gathers) can be generated in OpendTect using the full Zoeppritz equations and ray tracing (simple or advanced).
The full Zoeppritz equations are used to compute the angle-dependent reflectivity at the various interfaces from the elastic model (i.e. the upscaled and time-converted versions of the various logs (Rho, Vp and Vs) from the pseudo-wells).
(The Zoeppritz equation figures shown here are taken from Wikipedia.)
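The full Zoeppritz matrix solution is lengthy; as an indication of how angle-dependent reflectivity behaves, the sketch below implements the well-known Aki-Richards linearized approximation to the Zoeppritz P-P reflection coefficient (Python; this approximation is illustrative only — OpendTect applies the exact equations):

```python
import math

def aki_richards_pp(vp1, vs1, rho1, vp2, vs2, rho2, theta_deg):
    """Linearized P-P reflection coefficient R(theta) across an interface
    (Aki & Richards three-term approximation; layer 1 above, layer 2 below)."""
    th = math.radians(theta_deg)
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    k = (vs / vp) ** 2
    return (0.5 * (dvp / vp + drho / rho)
            + (0.5 * dvp / vp - 2 * k * (drho / rho + 2 * dvs / vs))
              * math.sin(th) ** 2
            + 0.5 * dvp / vp * (math.tan(th) ** 2 - math.sin(th) ** 2))

# At theta = 0 this reduces to the linearized zero-offset coefficient:
r0 = aki_richards_pp(2000, 1000, 2200, 2200, 1100, 2300, 0.0)   # ~0.070
```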
Elastic Model
The Elastic Model can be accessed by clicking the icon just left of 'Wavelet'. This model is required by OpendTect for generating synthetic seismic data (both the zero offset stack and pre stack gathers). The elastic model essentially tells the software which quantities to use for the reflection coefficient computation and ray tracing, in terms of Density, Vp and Vs:
If "Compute from: Defined quantity" is chosen, OpendTect uses the appropriate (upscaled and time-converted) quantities from the pseudo-wells. The user can also choose to compute missing quantities (not modeled in the pseudo-wells) using pre-filled rock physics relations, e.g. Vs from Vp using Castagna's equation:
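Castagna's mudrock line relates Vp and Vs in brine-saturated clastics as Vp = 1.16 Vs + 1360 (velocities in m/s); inverted for Vs it can be sketched as follows (Python; the exact relation applied by OpendTect's rock physics library may be parameterized differently):

```python
def castagna_vs_from_vp(vp):
    """Vs from Vp via the mudrock line Vp = 1.16 * Vs + 1360 (m/s)."""
    return (vp - 1360.0) / 1.16

vs = castagna_vs_from_vp(3000.0)   # ~1414 m/s
```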
16 Appendix F - MATLAB Link
Plugin for OpendTect v6.4
16.1 Background
The MATLAB Link plugin has been developed to make a connection between
OpendTect and the MATLAB toolbox. Any MATLAB program can now be directly
run on seismic attribute volumes available in an OpendTect project. There are two
ways to use the link.
The first method uses OpendTect’s GUI (OpendTect’s main menu > Analysis >
Volume Builder) to access programs written in MATLAB. This facilitates on-the-fly
testing of various parameters defined in the MATLAB program using different input
seismic attribute volumes. The results are visualized in real-time in OpendTect
along various sections. Once the parameterization is done, a full 3D volume can
be generated. You can also augment your MATLAB functions with numerous other
tools available in the volume builder, e.g. the Voxel Connectivity Filter.
This method requires you to first compile the MATLAB code into one C shared library using the MATLAB Compiler. In addition to your own function, two additional functions are needed: od_getparameters() and od_doprocess(). OpendTect calls these functions to get information about any input parameters and to start the execution of the function.
The advantage, however, is that these compiled libraries can be freely shared with
anyone. It makes it possible to run the MATLAB code in OpendTect on a system
that does not have the MATLAB software installed.
In case you do not want to use the OpendTect GUI for data exchange with
MATLAB, there is an alternative option using the binary MEX files. They can be
used to read and write OpendTect's native CBVS files from MATLAB itself. This
option obviously requires both MATLAB and OpendTect to be installed on the
same system.
The GUI based option to access MATLAB through the Volume Builder
(OpendTect’s main menu > Analysis > Volume Builder) works on 3D data only.
16.2 MATLAB Versions/Platforms
The plugin is available in the Linux64 and Windows64 versions of OpendTect.
MATLAB R2013a has been used to build the libraries and is therefore the recom-
mended version to build the shared libraries. Versions of MATLAB older than
R2013a may not work with this plugin.
16.3 Option 1: Usage Through
OpendTect's GUI
In order to access MATLAB programs through OpendTect's GUI, certain environment variables need to be defined first. They are discussed below.
16.3.1 Setting Up MATLAB Link
Linux* only
Set the MATLAB_DIR environment variable to the MATLAB installation folder, and set the LD_LIBRARY_PATH variable to the location of the MATLAB binaries and the compiler binaries, by exporting them in a Linux terminal.
*The script odinit.matlab (provided in the root of the software installation) automatically tries to set up both environment variables (MATLAB_DIR and LD_LIBRARY_PATH). However, if the automatic application of the script fails, the user should manually set up the two environment variables in Linux as mentioned above.
The user may also set the variable MATLAB_BUILDDIR to the preferred location of the shared libraries. This path is used as the default folder when browsing for shared libraries from OpendTect. It defaults to MATLAB_DIR.
Windows only
First environment variable (MATLAB_DIR): Control Panel > System > Advanced system settings > Advanced (tab) > Environment Variables
Second environment variable: edit the Path variable and add the following entry: C:\Program Files\MATLAB\R2013a\bin\win64
16.3.2 Example 1: MATLAB Function to Multiply
In this first simple example, a MATLAB function is created that multiplies the values in the input cube by a user-defined factor. How to use it in OpendTect is explained below using the three functions od_getparameters(), od_doprocess() and multiply(). After creation, these three functions should be saved in three separate .m files, each named after the function it contains.
16.3.2.1 OD Get Parameters
function pars = od_getparameters()
pars.nrinputs = 1;
pars.factor = 10;
end
This function returns one argument in the form of a structure array. A structure is a data type that groups related data using data containers called fields. The field nrinputs sets the number of input volumes.
OpendTect only recognizes the field name 'nrinputs'; this name can therefore not be changed.
The field factor is in this example used as the parameter for the volume mul-
tiplication. This field name will also be visible as a parameter name in the volume
builder GUI of OpendTect. The value 10 is a default value, which is changeable in
OpendTect’s GUI.
16.3.2.2 OD Do Process
function out = od_doprocess(pars,in)
f = pars.factor;
out = multiply(cell2mat(in(1)),f);
end
This function has two input arguments, pars and in. The array in is essentially a one-dimensional cell array that contains one or more data volumes. They can be retrieved individually by using the cell2mat function.
16.3.2.3 Multiply
function out = multiply(in1,f)
out = in1*f;
end
16.3.3 Example 2: MATLAB Function to Subtract
In this second simple example, a MATLAB function is created that subtracts two volumes. The three required functions are again listed and explained below, as in the first example.
16.3.3.1 OD Get Parameters
function pars = od_getparameters()
pars.nrinputs = 2;
end
As explained in the first example, this function returns one argument in the form of
a structure array. As we want to subtract two volumes, nrinputs is 2. No addi-
tional parameters are needed.
16.3.3.2 OD Do Process
function out = od_doprocess(pars,in)
out = subtract(cell2mat(in(1)),cell2mat(in(2)));
end
No parameters are needed for the subtraction of two volumes, so no fields are read from the pars structure. The array in is again a cell array, from which the two input arrays are retrieved using cell2mat. The function returns one array out, which is a 3D array with the same dimensions as the arrays in in. In this example, out is the result of the function subtract, the third and last function to be defined.
16.3.3.3 Subtract
function out = subtract(in1,in2)
out = in1-in2;
end
16.3.4 Compilation of MATLAB Functions
All three functions (saved in three separate .m files) have to be compiled into one C shared library before they can be used by OpendTect. This can be done from the MATLAB GUI or from the command line.
• Open MATLAB
• Click on the APPS tab
• Click on MATLAB Compiler
• Create a Deployment Project
• Give it a name, for example libmultiplyexample.prj
• Click OK
Back in the main MATLAB window, you should see a new dockable window called C Shared Library. Inside this window, click on 'Add files' under 'Exported Functions' and select the three .m files. Finally, click on the Build button. If you run this for the first time, it might be necessary to run 'mbuild -setup' to locate the external C++ compiler. Windows users might need to install a Windows SDK. For more information see System Requirements and Platform Availability.
The MATLAB Runtime is a standalone set of shared libraries that enables the exe-
cution of compiled MATLAB applications or components on computers that do not
have MATLAB installed. When used together, MATLAB, MATLAB Compiler, and
the MATLAB Runtime enable you to create and distribute numerical applications
or software components quickly and securely.
1. Click the version and platform that corresponds to the application or component you
are using. Note: you can find this information in the readme.txt file that accompanies
the application or component.
2. Save the MATLAB Runtime installer file on the computer on which you plan to run the
application or component.
3. Double click the installer and follow the instructions in the installation wizard.
16.3.5 Accessing Compiled MATLAB Functions in OpendTect's GUI
Once the MATLAB functions are compiled into a C shared library, they can be accessed in OpendTect's GUI. The compiled C shared library (e.g. libname.so) is accessed through the Volume Builder, in the two steps illustrated below.
16.4 Option 2: Usage Through the MEX
Files in MATLAB Itself
In case you do not want to use the OpendTect GUI for data exchange with MATLAB, there is an alternative option. OpendTect provides functions that can be used in MATLAB to read/write arrays from/to an OpendTect datastore. The following setup is required:
• Linux: add $DTECT_APPL/bin/lux64/Release to your LD_LIBRARY_PATH
• Windows: add $DTECT_APPL\bin\win64\Release to your system path
• Start MATLAB. Call the functions readcbvs() and writecbvs() in your MATLAB routine to read/write seismic attribute volumes (CBVS files) from/to an OpendTect project. Further, writeSEG-Yindex() can be called to write the output as a SEG-Y file.
Examples:
Glossary
A
Absolute Impedance
Full-bandwidth impedance inversion response in which the "missing" low-frequency part of the spectrum has been added by the inversion method. For example, in model-driven inversions the low-frequency model is typically created by interpolating impedance well logs guided by mapped seismic horizons.
Accommodation Space
The available space for sediments to fill (measured from seafloor to base-level).
AI
Acoustic Impedance: the product of seismic velocity and density.
Attribute
An attribute is a quantity derived from a seismic input set. Attributes in OpendTect are defined by a name, a value, and a position in 3D space (inline, crossline and Z (TWT or depth)). Attributes can be calculated from single-trace, multi-trace, and multi-volume inputs. They can be steered and/or chained. Steered attributes are multi-trace attributes in which the trace segments are found by following a (pre-)calculated dip-azimuth. Chained attributes are attributes derived from other attributes. For example, Similarity and Energy are separate attributes that can be chained to calculate the Similarity of the Energy using the "Position" attribute.
Attribute Set
An attribute set is an entity consisting of a group of attributes. Usually the attributes in a set have something in common: for example, all attributes in the set have the potential to highlight an object type of interest, or the set computes a combined attribute (the desired output) using all other attributes as intermediate results.
Base level
The surface at which sediment supply, relative sea level changes and wave
energy are in balance. This is the surface at which the accommodation space
equals zero: there is neither deposition, nor erosion.
Body
A body is an element that defines an arbitrary three-dimensional geological shape (or geo-body). A body can be created manually or by using polygons.
ChimneyCube
A volume that highlights vertical disturbances in seismic data. The cube is used
in fluid migration path studies, in prospect ranking and for fault seal analysis. A
ChimneyCube is generated by a neural network that was trained on picked
examples (chimneys and non-chimneys). It gives at every sample location the
"chimney probability" i.e. the likelihood of belonging to the class of identified seis-
mic chimneys.
Chrono-stratigraphy
A set of relative geologic time lines as stored in a HorizonCube.
CLAS
A plugin for petrophysical analysis. CLAS stands for Computer Log Analysis Sys-
tem.
Closed Source
Software that is released in binary form only. The commercial plugins to
OpendTect are released as closed source extensions. Such extensions are only
permitted if OpendTect is run under a commercial (or academic) license agree-
ment.
Color Blending
Combined display of three (or four) attributes in the Red, Green and Blue color channels. Optionally, a fourth channel (alpha) displays transparency. Color blending is also known as RGB (RGBA) blending.
Crossline Dip
Dip in the direction of the Crossline axis, or in the direction of increasing cross-
lines.
Dip-Steering
The process of auto-tracking seismic data by following the pre-calculated, local
dip and azimuth of the seismic. Dip-steering is used for: a) extracting seismic
- 1083 -
trace segments along seismic reflectors as input to multi-trace attribute cal-
culations, b) computing special attributes such as polar dip, azimuth, and volume
curvature attributes, c) filtering seismic data (known as dip-steered filtering, aka
structurally oriented filtering), and d) auto-tracking chrono-stratigraphic horizons
in the creation of a HorizonCube.
Dip-Steering Cube
A volume computed from seismic data with at every sample position information
about the local dip and azimuth of the seismic data. In a 3D Steering Cube this
information is stored in two attributes per sample: inline dip and cross-line dip.
On 2D seismic only one value is stored: the line-dip. Dips in a Steering Cube are measured in the line direction and expressed in µs/m or mm/m, for time and depth data, respectively.
EEI
Extended Elastic Impedance. Scaled and rotated impedance response at a particular angle. The rotation is typically optimized to predict a certain well log property of interest.
EI
Elastic Impedance. Impedance response at a particular angle of incidence.
Element
An element is a subdivision of the various items (of the tree) that are displayed in a 3D scene: inlines, crosslines, time slices, horizons, wells etc. Each element can be subdivided into sub-elements. For instance, an inline element can have sub-elements, e.g. inline # 120, that can contain up to eight different attributes.
Eustatic sea-level
Sea level relative to the center of the earth.
Explicit Representation
A representation of a 3D object in OpendTect in the form of a triangulated sur-
face.
F
Fault Stickset
Faults are interpreted on a section as sticks, and all sticks that belong to one fault are grouped into one stickset. A fault stickset therefore contains an unordered collection of the interpreted sticks.
Forced regression
Deposition characterized by progradation and incision. Base level is falling, decreasing accommodation space and forcing the system to prograde. Forced regression occurs during the falling stage systems tract.
GMT
An open source mapping package developed and maintained by the University
of Hawaii (http://gmt.soest.hawaii.edu/). GMT stands for Generic Mapping Tools.
GPL License
Gnu General Public License (http://www.gnu.org/licenses/gpl.html) is an open source license under which OpendTect can be run. The license allows redistribution of (modified) source code under the same licensing conditions (copyleft principle). It is not allowed to combine the open source part with closed source plugins, which is why OpendTect is also licensed under a commercial license agreement and under an Academic license agreement.
Horizon Data
Horizon data refers to an attribute grid stored in a horizon. An attribute is calculated on-the-fly or in a batch process. An attribute calculated on-the-fly needs to be stored by right-clicking on it and selecting the Save attribute... option. The saved attribute can also be managed in the Manage Horizons window. Note that a horizon can contain an unlimited number of stored attributes (horizon data).
HorizonCube
A dense set of auto-tracked (or modeled) seismic horizons that is indexed and
sorted according to relative geologic time (= chrono-stratigraphy).
I
Implicit Representation
A representation of a 3D object in OpendTect in the form of an iso-surface
through a cube of values.
Incision
A feature caused by erosion.
Inline Dip
Dip in the direction of the Inline axis, or in the direction of increasing inline num-
bers.
Madagascar
An open source seismic processing package. See: http://en.wikipedia.org/wiki/Madagascar_(software)
Meta Attribute
A meta-attribute is an attribute created from multiple input attributes. In OpendTect, a meta-attribute is created either through neural networks, or through mathematical manipulations and/or logical operations. For example, the ChimneyCube and the FaultCube are meta-attributes. See the Ridge enhancement filter attribute set among the Default attribute sets for an example of a meta-attribute created through math and logic; the meta-attribute in this set is the last attribute in the list.
MPSI
A plugin for stochastic acoustic impedance inversion. MPSI stands for Multi-Point
Stochastic Inversion.
N
Normal regression
Deposition characterized by aggradation and progradation. The base level is
rising but the consumption of accommodation space by sedimentation exceeds
the creation of accommodation space by the base level rise. Normal regression
occurs during high stand and low stand systems tracts.
Open Source
Software that is released with its source code. OpendTect is released as an open source product that can be extended with closed source plugins. Such extensions are only permitted if OpendTect is run under a commercial (or academic) license agreement.
PDF
PDF stands for Probability Density Function. In OpendTect PDFs are created in the cross-plot tool by selecting a desired area in the cross-plot domain. The density of the points in the selected area is a measure of the probability of the desired target variable, which can then be predicted by applying the derived PDF to (scaled) input volumes in a Bayesian classification scheme.
pointset
A pointset is a collection of picked locations, i.e. inline-crossline-Z information. Pointsets are part of a pointset Group. For example, a pointset Group containing points at fault locations may consist of different fault pointsets, to differentiate between large and small faults, or to reflect points on different inlines.
pointset Group
A pointset group is a collection of different pointsets. Usually pointsets are grouped because they refer to the same object, e.g. Chimney_yes or Chimney_no.
Regression
Seaward shoreline and facies shift. Regression can be Normal (base level rises)
or Forced (base level falls).
Relative Impedance
Band-limited impedance inversion response computed by methods such as
colored inversion.
Relative sea-level
The net effect of eustatic sea level changes and local tectonic fluctuations.
Retrogradation
Depositional trend characterized by sediments building landwards, aka backstepping.
SEG-Y
A file format for exchanging seismic or seismic-like data. It is used for both 2D and 3D pre- or poststack data. A file being SEG-Y compliant does not mean that it can be loaded into OpendTect. There are several possible problems. One of these is missing trace identification and/or positioning. Another is lack of true compliance (->SEG-Y Rev 0, ->SEG-Y Rev 1). The different types of SEG-Y are:
• SEG-Y Rev 0: The initial SEG-Y specification of 1975. It is very precise in some areas but totally unspecified in other, crucial areas. This has led to an almost uncountable number of variants. Some are sort-of SEG-Y standard, others blatantly non-compliant.
• SEG-Y Rev 1: In 2002 the Revision 1 document made an end to the most obvious shortcomings of ->SEG-Y Rev 0, especially in the area of ->trace positioning and ->trace identification. Still, many SEG-Y files, or files claimed to be SEG-Y, are Rev 0 or badly (i.e. not) compliant with Rev 1. This is why OpendTect has numerous options for the SEG-Y reading process.
• SEG-Y Textual header: The first 3200 bytes of a SEG-Y file must be filled with textual comment on the contents of the SEG-Y file. Older textual headers are encoded in EBCDIC rather than ASCII, which makes them impossible to read in a standard text editor.
• SEG-Y EBCDIC header: ->SEG-Y Textual header.
• SEG-Y Tape Header: The part of a SEG-Y file that gives information about all traces in the file. This information is in the ->SEG-Y Textual header and ->SEG-Y Binary header.
• SEG-Y Binary header: The second part of the SEG-Y Tape header; contains binary information about, amongst others, the number of samples per trace, byte encoding, sample interval, and SEG-Y Revision.
• Trace identification: Every trace in OpendTect needs to have an identification in the form of a trace number (2D data) or inline/crossline (3D data). For prestack data the offset forms an extra trace identification.
• Trace positioning: In OpendTect, every seismic trace needs to be located in 3D space. For 3D data, the position can be derived from the ->Trace identification (inline and crossline numbers). Traces in 2D lines have their own, separate X and Y coordinates. For prestack data there must also be an offset available.
SSIS
A plugin to perform sequence stratigraphic analyses (systems tracts, Wheeler transforms) on seismic data using HorizonCube input. SSIS stands for Sequence Stratigraphic Interpretation System.
Stratal Slicing
The process of cutting through a seismic volume along surfaces that are com-
puted proportionally between mapped top and bottom horizons, aka proportional
slicing.
Systems Tracts
Subdivisions of sequences that consist of discrete depositional units that differ in
geometry from other systems tracts and have distinct boundaries on seismic data.
Different systems tracts are considered to represent different phases of baselevel
changes.
Trace Identification
Every trace in OpendTect needs to have an identification in the form of a trace number (2D data) or inline/crossline (3D data). For prestack data the offset forms an extra trace identification.
Trace Positioning
In OpendTect, every seismic trace needs to be located in 3D space. For 3D data, the position can be derived from the ->Trace identification (inline and crossline numbers). Traces in 2D lines have their own, separate X and Y coordinates. For prestack data there must also be an offset available.
Transgression
Landward shoreline and facies shift characterized by aggradation and retrogradation. Base level is rising and more accommodation space is created than is consumed by sedimentation.
Tree
The tree is a docking window, which is detachable and movable. It is used to display data in a scene. The tree is attached to a scene and is labeled e.g. Tree Scene 1, where '1' is the scene number. Each tree has its own elements, which are displayed in the corresponding scene.
VMB
A plugin for picking velocities from semblance gathers in a surface-consistent manner. VMB stands for Velocity Model Building.
Waveform Segmentation
Process of clustering seismic trace segments with a UVQ network along a
mapped horizon into a user-defined number of clusters.
WCP
A plugin to pick and QC well log markers with the help of seismic data and
(optionally) the HorizonCube. WCP stands for Well Correlation Panel.
Wheeler Transform
The process of flattening seismic data (or attributes) according to the chrono-stratigraphic horizons in a HorizonCube. In a Wheeler scene the vertical axis represents relative geologic time.