063 Hycom Users Guide
(HYCOM)
Version 2.1
Users Guide
A. Wallcraft
Naval Research Laboratory
S.N. Carroll, K.A. Kelly, K.V. Rushing
Planning Systems Incorporated
Contents

1  Introduction
4  Operating Guidelines
6  Building a Bathymetry
   6.1  Bathymetry File Naming Convention
   6.2  Example IASb0.50: Creating a New Bathymetry
        6.2.1  Steps for Generating New Bathymetry
   6.3  Using MICOM Bathymetry Files
18 Sampling Transport
   18.1 Sampling Transport
        18.1.1 BARO_VEL
        18.1.2 TRANSPORT
        18.1.3 TRANSP_MN
        18.1.4 TRANSP_MN_2P0
        18.1.5 MERGETSPT
        18.1.6 MEANTSPT
19 Nesting in HYCOM
   19.1 Nesting at Different Horizontal Grid Resolutions
        19.1.1 Input Files
        19.1.2 Setting the Resolution of the Nested Domain
        19.1.3 Creating Sub-region Bathymetry
        19.1.4 Generating IASb0.50 Archive Files
   19.2 Nesting at the Same Horizontal Resolution
20 Parallel Processing
   20.1 Configuring the Run Script
   20.2 patch.input
   20.3 Generating Equal-Ocean Tile Partitions
   20.4 Comparing Runs
        20.4.1 Pipe.f
23 TECHNICAL REFERENCES
   23.1 HYCOM Software Documentation
   23.2 General Technical Documentation
24 Acronyms
25 APPENDIX A
   25.1 HYCOM Utility Commands
26 APPENDIX B
   26.1 blkdat.input Model Input Parameters
27 Appendix C
   27.1 Sample Input File for Plotting - 990_cs2.IN
List of Figures

1  HYCOM mesh.
2  Grid specification parameters in regional.grid.com.
3  Contents of the file regional.grid.[ab].
4  Flow chart of atmospheric forcing interpolation to HYCOM grid scheme.
5  Example of COADS wind file summary provided by wind_stat.
6  Example COADS output file tauewd.a.
7  Example of vertical structure parameters in blkdat.input.
8  Flowchart of climatology interpolation to HYCOM grid.
9  Make suffix rules for creating object files.

List of Tables
1 Introduction
The Hybrid Coordinate Ocean Model (HYCOM; Halliwell et al., 1998, 2000; Bleck, 2001)
was developed to address known shortcomings in the vertical coordinate scheme of the Miami
Isopycnic-Coordinate Ocean Model (MICOM) developed by Rainer Bleck and colleagues. HYCOM is a primitive equation, general circulation model with vertical coordinates that remain
isopycnic in the open, stratified ocean. However, the isopycnal vertical coordinates smoothly
transition to z-coordinates in the weakly stratified upper-ocean mixed layer, to terrain-following
sigma coordinates in shallow water regions, and back to z-level coordinates in very shallow water.
The latter transition prevents layers from becoming too thin where the water is very shallow.
The HYCOM user has control over setting up the model domain, generating the forcing
fields, and ingesting either the climatology or output fields from other model simulations to use
for boundary and interior relaxation. The model is fully parallelized and designed to be portable
among all UNIX-based systems.
An important goal in developing HYCOM was to provide the capability of selecting from
several vertical mixing schemes for the surface mixed layer and comparatively weak interior
diapycnal mixing. The K-Profile Parameterization (KPP, Large et al., 1994; 1997) algorithm
was included as the first non-slab mixed layer model because it provides mixing throughout the
water column with a transition between the vigorous mixing in the surface boundary layer and
the weaker diapycnal mixing in the ocean interior. KPP works on a relatively coarse and unevenly
spaced vertical grid, and it parameterizes the influence of a suite of physical processes larger than
other commonly used mixing schemes. In the ocean interior, the contribution of background
internal wave breaking, shear instability mixing, and double diffusion (both salt fingering and
diffusive instability) are parameterized. In the surface boundary layer, the influences of wind-driven mixing, surface buoyancy fluxes, and convective instability are parameterized. The KPP
algorithm also parameterizes the influence of nonlocal mixing of temperature (T) and salinity
(S), which permits the development of counter gradient fluxes.
Three additional mixed layer models have been incorporated into the HYCOM version 2.1
code: 1) the dynamical instability model of Price et al. (1986), 2) the Mellor-Yamada level
2.5 turbulence closure scheme used in the Princeton Ocean Model (POM; Mellor and Yamada,
1982; Mellor, 1998), and 3) the Kraus-Turner (KT) slab model. Other mixed layer models will
be included in the near future, such as a turbulence model developed recently by Canuto (2000).
HYCOM versions 1.0 and 2.0 have previously been released as a result of collaborative
efforts between the University of Miami, the Los Alamos National Laboratory, and the Naval
Research Laboratory (NRL). Ongoing HYCOM research has been funded under the National
Oceanographic Partnership Program (NOPP) and the Office of Naval Research (ONR).
2 Description of HYCOM Usage

This manual describes in detail the procedures for running the Hybrid Coordinate Ocean Model Version 2.1 (HYCOM). A separate technical manual, the HYCOM Users Manual, complements this document and contains the mathematical formulation, solution procedure, and code of the model, as well as flow charts and descriptions of the programs and sub-programs (Wallcraft et al., 2002).

2.1 Running Environment

HYCOM is set up to be domain independent, and all components except the model code will be compiled only once. The model script is configured to allow data files (input and output) to be resident on a different machine (e.g., an archive system). The actual run is from a scratch directory, and files are copied from scratch to permanent storage (possibly on another machine) using whatever commands are associated with the pget and pput environment variables. If everything is on a single machine (or if the archive directory is NFS mounted on the run machine), pget and pput can both be cp (e.g., setenv pget /usr/bin/cp). Otherwise, the ALL/bin directory contains several examples of appropriate pput and pget commands.

2.2 Directory Structure

HYCOM 2.1 has been designed so that all of the pre- and post-processing programs are compiled only once. The directory hycom_ALL contains all of the domain-independent pre- and post-processing programs. A second directory contains the data files and scripts needed to run a simulation for a specific domain. For example, everything needed to process and run HYCOM on the Atlantic 2.00 degree domain is found in the hycom_ATLb2.00 directory. These files are used in conjunction with the programs found in the ALL directory to run a simulation. A description of the subdirectories and their contents is listed in Tables 1 and 2 on the following pages.
2.3 I/O File Formats in HYCOM
Almost all HYCOM model input and output uses the standard HYCOM .[ab] format. The
.a file contains idm*jdm 32-bit IEEE real values for each array, in standard Fortran element
order, followed by padding to a multiple of 4096 32-bit words (16K bytes), but otherwise with
no control bytes/words, and input values of 2.0**100 indicating a data void. Each record is
padded to 16K bytes to potentially improve I/O performance on some machines by aligning
record boundaries on disk block boundaries.
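The record layout described above can be sketched in a few lines. This is an illustrative reader, not part of the HYCOM distribution, and it assumes the common big-endian byte order; the ZAIO routines in mod_za.F remain the supported interface:

```python
import numpy as np

def read_a_field(path, idm, jdm, field_index=0, void=2.0**100):
    """Read one 2-D field from a HYCOM .a file.

    Each record holds idm*jdm 32-bit IEEE reals in standard Fortran
    element order, padded to a multiple of 4096 32-bit words (16K bytes).
    Values of 2.0**100 mark data voids.  Big-endian storage is assumed.
    """
    words = idm * jdm
    rec_words = ((words + 4095) // 4096) * 4096   # padded record length
    with open(path, "rb") as f:
        f.seek(field_index * rec_words * 4)       # skip earlier records
        data = np.frombuffer(f.read(words * 4), dtype=">f4")
    field = data.astype(np.float64).reshape(jdm, idm)  # field[j, i], i fastest
    return np.where(field > 0.5 * void, np.nan, field)
```

On input it is also worth cross-checking the minimum and maximum of the returned field against the values recorded in the matching .b file.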
Table 1: Subdirectories of the hycom_ALL directory.

Subdirectory  Contents
archive       Source code for modifying HYCOM archive files, and converting them to other file formats.
bin           Utilities; this directory should be in the user's path.
config        Machine- and parallelization-specific part of makefiles.
force         Source code for interpolation of NRL format wind/flux files on native grid to the HYCOM model grid.
libsrc        Common source files.
meanstd       Source code for forming the mean, mean-squared, or standard deviation of a sequence of archive files.
plot          Source code for plotting HYCOM archive files and 2-D fields from any HYCOM .a file, using National Center for Atmospheric Research (NCAR) graphics.
plot_ncl
relax
restart
sample
subregion
topo
README.ALL

Table 2: Subdirectories of the hycom_ATLb2.00 directory.

Subdirectory        Contents
archive             Scripts for reading HYCOM archive files and writing archive/data/netCDF/restart files.
config              Machine- and parallelization-specific part of makefiles.
doc                 Documentation.
expt_01.0           Old example simulation, hybrid vertical coordinate.
  data              Data files for example simulation expt_01.0.
expt_01.5           Latest example simulation, hybrid vertical coordinate.
  data              Data files for example simulation expt_01.5.
examples
force               Atmospheric forcing data files.
  coads             Generate the COADS forcing files.
  offset            Wind offset files and scripts.
  plot              Plot the forcing files.
meanstd             Mean and standard deviation of HYCOM surface archived fields.
  plot              Example of plotting and scripts.
relax               Relaxation data files and scripts.
  010               Isopycnal climatology for a HYCOM simulation.
  999               Zonal isopycnal depth climatology for HYCOM subroutine POFLAT.
  levitus           LEVITUS climatology on HYCOM horizontal grid.
  plot              Plot climatology.
sample              Transport across sections from archive files.
src_2.0.01_22_one   Source code, HYCOM version 2.0.01.
src_2.1.00_22_one   Source code, HYCOM version 2.1 for 16 layers and Message Passing Interface (MPI).
  test              Programs to test that individual communication routines are working.
subregion           Interpolate bathymetry/archive to new grid/sub-domain.
topo                Grid and bathymetry data files and scripts.
  partit            Partition text file for domain decomposition into tiles.
topo_IASd0.50
The associated small .b file is plain text, typically containing a 5-line header followed by one line for each 2-D field in the .a file. The format of the per-array line varies, but it typically uses = to separate an array description from trailing numeric values. It is good practice to confirm that the minimum and maximum values from the .a and .b files agree on input. The best way to read and write .a files is to use the standard set of ZAIO routines provided in mod_za.F (see Section 15). The ocean model has separate versions of these routines for distributed memory and shared memory machines, but pre- and post-processing programs use the (shared memory) version in the subdirectory libsrc in the ALL directory. There are many utilities that act on .a files in the bin subdirectory (see Appendix A for a list and description), and almost any program in the ALL subdirectories can act as an example of how to use the ZAIO routines.
The only other HYCOM model input is via *.input plain-text files. The format of these varies, but a common format for a single line of text is a data value, followed by the six-character name of the variable in quotes, followed by a comment (which is ignored on input). Such lines can be read by the blkin[ilr] routines (for integer, logical, and real input, respectively). These subroutine names come from blkdat.input, which uses this format.
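As an illustration of that layout (not code from the distribution), a line such as `  22    'kdm   ' = number of layers` can be parsed by a routine in the spirit of the Fortran blkini; the value 22 here is a hypothetical example:

```python
def blkini(line, expected_name):
    """Parse a blkdat.input-style line: an integer value, the six-character
    variable name in single quotes, then a comment that is ignored."""
    value_part, rest = line.split("'", 1)
    name = rest.split("'", 1)[0].strip()
    if name != expected_name:
        raise ValueError(f"expected variable {expected_name!r}, found {name!r}")
    return int(value_part.strip())
```

Requesting a variable by name, as the Fortran routines do, catches an out-of-order or missing line in the input file immediately.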
The primary HYCOM model output is archive files, which are in .[ab] format and contain
all prognostic variables for one time step (or just the surface, depth averaged and layer 1 fields).
These can be converted to other file formats, including netCDF, as a post-processing step (see
Section 17).
HYCOM uses the C grid that was originally used in MICOM, although HYCOM's horizontal mesh differs from the one used in MICOM. In the MICOM mesh, the positive x direction is southward and the positive y direction is eastward. The HYCOM mesh was converted to standard Cartesian coordinates, with the x-axis pointing eastward and the y-axis pointing northward. The HYCOM mesh is illustrated below (Figure 1) for pressure (p), velocity component (u and v), and vorticity (Q) grid points. This case is for 7 x 7 pressure grid points. The grid meshes for the other variables have 8 x 8 grid points. All fields in this case would therefore be dimensioned 8 x 8, with the eighth row and column unused for variables on pressure grid points.
4 Operating Guidelines
There are several key aspects of setting up HYCOM for a new domain and running a simulation.
The basic steps are:
1. Choose a domain and resolution.
2. Build a bathymetry.
3. Interpolate atmospheric forcing to the domain.
4. Choose vertical (isopycnal) structure (in expt ##.#/blkdat.input).
5. Interpolate T/S climatology to the model domain and the chosen vertical structure.
6. Configure and compile the model code.
7. Complete configuration of expt ##.#.
8. Run the simulation.
9. Plot and analyze results.
A new simulation on the same domain typically repeats steps 7-9; if the vertical structure changes, steps 4, 5, and 7 are also repeated. The model code will only need reconfiguring if the number of layers changes.
The standard HYCOM version 2.1 model has been set up to run for the 2.00 degree Atlantic
Ocean domain (ATLb2.00). The file regional.grid.[ab], which specifies the parameters for
the model grid domain, has been provided in the ATLb2.00/topo subdirectory. This file is read
at run time by all of the pre- and post-processing programs, so if running HYCOM for a new
region, this file will need to be generated. In addition, the file dimensions.h will need to be
modified. If running HYCOM on the ATLb2.00 domain, then no changes need to be made to
these files.
To set up HYCOM for a new stand-alone region, the first step is to create a new directory, analogous to the ATLb2.00 directory and subdirectories. To do this, the user must pick a region name in the format XXXaN.NN (e.g., IASb0.50, ATLb2.00, ATLd0.32), where XXX is an uppercase three-letter primary region name, a is a lowercase-letter secondary region name, and N.NN is a three-digit grid resolution description. Once the new region directory and subdirectories have been created, the next step is to create the regional.grid.[ab] files in the XXXaN.NN/topo subdirectory to describe the location of the new region and grid.
5.1 File regional.grid.[ab]
All HYCOM pre- and post-processing programs read regional.grid.b at run time to get the longitudinal array size (idm) and the latitudinal array size (jdm) for the particular region being processed, so the script regional.grid.com must be run first to generate regional.grid.[ab]. The source code called by regional.grid.com is domain independent and located in the topo subdirectory under the ALL directory. The script itself is located in the topo subdirectory under the ATLb2.00 directory.

In the case of ATLb2.00, the grid is Mercator: the longitudinal grid spacing in degrees is constant, while the latitudinal grid spacing in degrees varies with cos(latitude) to give square grid cells in meters. The method of specifying the grid location is the same as that used previously by MICOM and earlier versions of HYCOM. The user must set the grid specification parameters in regional.grid.com (see Figure 2). When run, regional.grid.com calls the program GRID_MERCATOR, which creates the grid definition file.
mapflg = 0
pntlon = 1.0
reflon = 263.0
grdlon = 2.0
pntlat = 11.0
reflat = 0.0
grdlat = 2.0

Figure 2: Grid specification parameters in regional.grid.com.
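The Mercator latitude placement implied by these parameters can be sketched as follows. This is an illustrative reconstruction with reflat = 0, not the GRID_MERCATOR source, and the exact placement in HYCOM may differ slightly (e.g., from grid staggering):

```python
import math

def mercator_lat(j, pntlat=11.0, grdlat=2.0):
    """Approximate latitude of grid row j for a Mercator grid with reflat = 0.

    Rows are uniformly spaced in the Mercator coordinate y, so the
    latitudinal spacing in degrees shrinks like cos(latitude), giving
    square grid cells in meters.
    """
    y = math.radians(grdlat) * (j - pntlat)       # uniform Mercator coordinate
    return math.degrees(2.0 * math.atan(math.exp(y)) - math.pi / 2.0)
```

With pntlat = 11.0 the eleventh row sits on the equator, and the spacing near the equator is close to grdlon = 2.0 degrees, consistent with a square cell.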
  57     'idm   ' = longitudinal array size
  52     'jdm   ' = latitudinal array size
   0     'mapflg' = map flag (-1=unknown,0=mercator,2=uniform,4=f-plane)
plon:  min,max =      263.00000      375.00000
plat:  min,max =      -19.60579       63.11375
qlon:  min,max =      262.00000      374.00000
qlat:  min,max =      -20.54502       62.65800
ulon:  min,max =      262.00000      374.00000
ulat:  min,max =      -19.60579       63.11375
vlon:  min,max =      263.00000      375.00000
vlat:  min,max =      -20.54502       62.65800
pang:  min,max =        0.00000        0.00000
pscx:  min,max =   100565.21875   222389.87500
pscy:  min,max =   100571.95312   222378.59375
qscx:  min,max =   102139.75781   222356.01562
qscy:  min,max =   102146.85938   222344.73438
uscx:  min,max =   100565.21875   222389.87500
uscy:  min,max =   100571.95312   222378.59375
vscx:  min,max =   102139.75781   222356.01562
vscy:  min,max =   102146.85938   222344.73438
cori:  min,max =  -0.0000511824   0.0001295491
pasp:  min,max =        0.99993        1.00005

Figure 3: Contents of the file regional.grid.[ab].
Three header lines in regional.grid.b identify the domain size and the map projection (which is ignored for most purposes). This is followed by one line for each field in regional.grid.a: (i) the longitude and latitude of all four grids, (ii) the angle of the p grid with respect to a standard latitude and longitude grid, (iii) the grid spacing in meters of all four grids, (iv) the Coriolis parameter (q-grid), and (v) the aspect ratio of the p grid (pscx/pscy).
5.2 File dimensions.h
When choosing a new domain or changing the number of layers, the user must alter the source code file dimensions.h or select one of the dimensions.h files already made available in HYCOM in the ATLb2.00/src_* subdirectories. There are several example versions for different regions available in HYCOM from which the user can choose (see Table 3). Typically, the omp (OpenMP) version of dimensions.h is appropriate for a single processor, and the ompi (OpenMP+MPI) version is used for any distributed memory configuration (MPI only, SHMEM only, or MPI+OpenMP). To use one of these files, the user must copy the appropriate version to dimensions.h. Alternatively, the user can create a version for a new region by altering the parameters in dimensions.h. These user-tunable parameters and their descriptions are listed in Table 4.
Table 3: Example dimensions.h files.

File                         Region
dimensions_ATLa2.00_omp.h    2.00 degree
dimensions_ATLa2.00_ompi.h   2.00 degree
dimensions_ATLd0.32_omp.h    0.32 degree
dimensions_ATLd0.32_ompi.h   0.32 degree
dimensions_JESa0.18_omp.h    0.18 degree
dimensions_JESa0.18_ompi.h   0.18 degree

5.2.1 Grid dimensions
In order to change the region size or the number of layers, the user can change the parameters itdm, jtdm, or kdm in dimensions.h. The default values for these parameters have been set to a total grid dimension of 57 by 52, and 16 vertical layers, for the Atlantic 2.00 degree domain. The user must create a new source code directory and executable every time the values of these parameters are changed. In addition, the user must update the regional.grid.b file that defines the region to the setup programs so that it is consistent with dimensions.h.

If memory is plentiful, then kkwall, kknest, and kkmy25 can all be set to kdm. However, if memory is in short supply, then kkwall and/or kknest can be set to 1 if wall or nest relaxation is not being used. The parameter kkmy25 can be set to -1 if the Mellor-Yamada mixed layer is not being used.
5.2.2 mxthrd
The parameter mxthrd in dimensions.h is only important when using OpenMP (TYPE = omp or ompi). OpenMP divides each outer (i or j) loop into mxthrd pieces. Set mxthrd to an integer multiple of the number of threads (omp_num_threads, i.e., NOMP) used at run time, typically chosen so that jblk = (jdm+2*nbdy+mxthrd-1)/mxthrd is in the range 5-10. For example, mxthrd = 16 could be used with 2, 4, 8, or 16 threads. Other good choices are 12, 24, 32, etc. Large values of mxthrd are only optimal for large idm and jdm. For TYPE = omp, use the command bin/hycom_mxthrd to aid in selecting the optimal mxthrd; it prints out the stripe size (jblk) and load-balance efficiency of all sensible mxthrd values. Setting mxthrd larger than omp_num_threads gives better land/sea load balance between threads. The directives have not yet been extensively tuned for optimal performance on a wide range of machines, so please report cases where one or more routines scale poorly to Alan Wallcraft, and also send in any improvements to the OpenMP directives.
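The stripe-size formula can be checked directly. The default halo width nbdy = 6 used below is an assumption here, since its actual value is set in dimensions.h:

```python
def jblk(jdm, mxthrd, nbdy=6):
    """OpenMP stripe size: the outer loop of length jdm+2*nbdy is divided
    into mxthrd pieces of at most this many rows each."""
    return (jdm + 2 * nbdy + mxthrd - 1) // mxthrd
```

For example, with jdm = 52 (the default ATLb2.00 grid) and this nbdy, mxthrd = 12 gives a stripe of 6 rows, inside the suggested 5-10 range, while mxthrd = 16 gives 4.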
A separate source code directory and executable are required for each parallelization strategy,
or TYPE, chosen by TYPE = one, omp, ompi, mpi, or shmem. The TYPE also affects how dimensions.h
is configured.
5.2.3 Dimensioning tiles
When running on a shared memory machine (TYPE = one or omp), set the parameters iqr =
jqr = 1, idm = itdm, and jdm = jtdm. Note that the same OpenMP executable (TYPE = omp)
may be used for a range of processor counts, provided mxthrd is chosen appropriately. When
running on a distributed memory machine (TYPE = mpi or ompi or shmem) set iqr and jqr to
the maximum number of processors used in each dimension, and idm and jdm to the maximum
(worst case) dimensions for any single tile on any targeted number of processors. Note that the
same executable may be used for a range of processor counts, provided iqr, jqr, idm, and jdm
are large enough for each case.
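For a uniform decomposition, the worst-case tile dimensions can be estimated as below. This is a sketch under the assumption of equal-sized tiles; equal-ocean partitions (see Section 20) can produce different tile shapes, so treat the result as a starting point rather than the exact idm and jdm to use:

```python
def max_tile_dims(itdm, jtdm, iqr, jqr):
    """Worst-case per-tile (idm, jdm) for an iqr x jqr uniform tiling of
    an itdm x jtdm domain (ceiling division in each dimension)."""
    return (itdm + iqr - 1) // iqr, (jtdm + jqr - 1) // jqr
```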
5.3 poflat.f
If the user changes regions, the file poflat.f must also be modified for the particular region. The file poflat.f defines the zonal climatology for the initial state when iniflg = 1
(blkdat.input parameter; Appendix B). Example versions already available in HYCOM for
different regions are listed in Table 5.
Note that this routine does not depend on grid resolution. Copy the appropriate version
to poflat.f, or create a version for a new region. There are examples of how to create a new
version in the directory relax. All input bathymetry, forcing and boundary relaxation files are
also region specific and are selected at run time from blkdat.input.
Table 5: Example poflat.f files.

File            Region
poflat_ATLa.f   Atlantic to 65N.
poflat_ATLd.f   Atlantic to 70N, including the Mediterranean.
poflat_JESa.f   Japan/East Sea.
poflat_F1Da.f   1-D test case (same profile at all latitudes).
poflat_F2Da.f   2-D upwelling case (same profile at all latitudes).
poflat_SYMa.f   3-D symmetry case (same profile at all latitudes).
6 Building a Bathymetry
HYCOM 2.1 provides the bathymetry files for the Atlantic 2.00 degree domain and the scripts that were used to generate these files in the ATLb2.00/topo subdirectory (see Table 6). If the user wants to generate bathymetry files for other regions, the bathymetry on the HYCOM grid can be generated in one of three ways:
1. Data sets from the Earth Topography 2 (ETOPO2) Global Earth Topography from the National Geophysical Data Center (NGDC) can be obtained and used to generate the bathymetry. This data can be accessed at http://dss.ucar.edu/datasets/ds759.1/.
2. A bathymetry file for the Atlantic 2.00 degree region can be copied. Since filenames include the region name, the script new_topo.com is provided to copy scripts from one region to another.
3. A new bathymetry file can be generated on the HYCOM grid by using the program TOPINT and the 5-minute TerrainBase data set. The TOPINT program is located in the file bathy_05min.f in the subdirectory topo. The TerrainBase data set is available at ftp://obelix.rsmas.miami.edu/awall/hycom/tbase_for_hycom.tar.gz.

If the bathymetry is being created for a new region, the newly generated bathymetry files and landsea masks should be placed in the XXXaN.NN/topo subdirectory that was created for the new region.
Table 6: Bathymetry files provided in ATLb2.00/topo.

File
depth.51x56
depth_ATLa2.00_01.[ab]
depth_ATLa2.00_01.com
depth_ATLa2.00_01.log

6.1 Bathymetry File Naming Convention
Depth files include the region name (i.e., ATLa2.00) so that files from several regions may be collected in one directory. The ending 01 indicates version 01 of the bathymetry; this convention allows for up to 99 distinct bathymetries for the same region and resolution. For example, the Atlantic 2.00 degree bathymetry files depth_ATLa2.00_02 and depth_ATLa2.00_03 have the same land or sea boundary as depth_ATLa2.00_01, but 02 applies a 9-point smoother to the 01 depths and 03 has a flat bottom at 5000 m. Additional smoothing, as in depth_ATLa2.00_02, may not be necessary at two-degree resolution, but one or two smoothing passes may be appropriate when using higher resolution.
6.2 Example IASb0.50: Creating a New Bathymetry
An example dataset and scripts have been provided in HYCOM to demonstrate how to create a new bathymetry (Table 7). The dataset is a subset of the Intra-Americas 0.50 degree domain based on the 5-minute global TerrainBase data set. The version used here has been extended by 5 degrees across the N and S poles to simplify interpolation near the poles. In this example, depth_IASb0.50_01 is a raw bathymetry that is not used in the model run, but it becomes depth_IASb0.50_02 after editing. Additional smoothing, as was used in depth_ATLb2.00_02, may not be necessary at 2 degree resolution, but one or two smoothing passes might be appropriate when using higher resolution. The script new_topo.com is provided to copy scripts from one region to another, since filenames include the region name.
6.2.1 Steps for Generating New Bathymetry
To generate the new bathymetry on the HYCOM grid, the user must follow these steps:
1. Obtain a bathymetry dataset (example: depth_IASb0.50_01.[ab]).
2. Run the following scripts:
   a) Run regional.grid.com. All programs read regional.grid.b at run-time to get idm and jdm for the particular region being processed, so regional.grid.com must be run first to generate regional.grid.[ab].
   b) Run depth_IASb0.50_01.com to interpolate the 5-minute bathymetry to the HYCOM bathymetry.
   c) Run landsea_IASb0.50.com to interpolate the 5-minute bathymetry to the HYCOM land or sea mask.
   d) Run depth_IASb0.50_01_map.com (choose landmask for 02).
   e) Run depth_IASb0.50_02_landmask.com. A landmask is optional, but is used to distinguish between the model land or sea boundary (e.g., the 20 m isobath) and the actual coastline (at least to the limits of the grid used) on plots. It isn't necessary unless you are using the NCAR graphics-based HYCOMPROC and FIELDPROC.
   f) Run depth_IASb0.50_02_map.com to map the HYCOM bathymetry (choose landsea modifications).
   g) Run landsea_IASb0.50_modify.com to modify the HYCOM land or sea mask.
Some steps may need to be iterated to get them right, and plots may also help this process. The source code is domain-independent and therefore located in the ALL/topo subdirectory.
6.3 Using MICOM Bathymetry Files
HYCOM allows for the conversion of MICOM bathymetry files to a corresponding HYCOM bathymetry file using the program TOPO_M2H in topo_m2h.f. Since the MICOM and HYCOM bathymetry files leave out the last row and column (which are always outside the region), the conversion from MICOM bathymetry to HYCOM is simple (using the HYCOM idm,jdm):
      do j= 1,jdm-1
        do i= 1,idm-1
          dh(i,j) = dm(jdm-j,i)
        enddo
      enddo
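For reference, the same index mapping can be written in NumPy; this is an illustrative translation, assuming dm is a 0-based MICOM depth array with the southward index first:

```python
import numpy as np

def micom_to_hycom(dm, idm, jdm):
    """dh(i,j) = dm(jdm-j, i): flip MICOM's southward first index and
    transpose so that x points eastward and y northward."""
    dh = np.zeros((idm, jdm))                      # last row/column left unused
    dh[: idm - 1, : jdm - 1] = dm[jdm - 2 :: -1, : idm - 1].T
    return dh
```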
The file topo_m2h.f defines the domain-specific header of the bathymetry file and must be customized for each domain. The suggested resolution for values in the header is at least five significant digits. Some MICOM bathymetry files use an alternative PAKK encoding. If the HYCOM bathymetry from TOPO_M2H does not look correct, try TOPO_MALT2H, which links in pakk_micom.f instead of pakk_hycom.f. The only differences between these files are six lines in the subroutine UNPAKK (lines 107-112).
After establishing a bathymetry for the domain, the user must interpolate wind data to the HYCOM grid for the chosen region so that the data can be input to the model. To obtain input files for a HYCOM run, the user must complete the following steps:
1. Obtain wind or flux data for the region being run in HYCOM,
2. Create a wind offset file, and
3. Run the scripts coads_mon_wind.com or coads_mon_flux.com, which call the programs WNDINT or FLXINT to interpolate wind or flux data to the HYCOM grid.
The script new_force.com has been provided in the ATLb2.00/force subdirectory. This script can be used to edit the forcing scripts for a new region. These scripts can then be run in the XXXaN.NN/force subdirectory to interpolate atmospheric forcing fields to the new region.
7.1
The user can obtain wind data produced by the Comprehensive Ocean-Atmosphere Data Set (COADS) project from the HYCOM ftp site (ftp://obelix.rsmas.miami.edu/awall/hycom/coads_for_hycom.tar.gz). The COADS wind files are in Naval Research Laboratory (NRL) format, a Fortran unformatted sequential file with a single header record identifying the wind dates, followed by the wind or flux data (coads_mon_taqaqrqppc.d). This format is used at NRL to avoid dealing with multiple wind file formats; all wind sets are converted to it so that the interpolation programs can use a single input routine for any wind set. There can be up to 5,999 sample times in one file. The first record of the file contains the array size, the array latitude and longitude, the array grid size, the number of samples, and a 6,000-element array listing the wind day of each sample time (and of the next time in sequence beyond the last record). Here "wind day" is days since 00Z December 31, 1900. To use the interpolation programs, either convert your atmospheric forcing data to this format or modify the native grid reading subroutines to input fields in their existing format. Many programs are already available to convert wind sets to the required format. To see if the one needed is available, send e-mail to metzger@hermes.nrlssc.navy.mil.
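As a sketch of the wind-day convention (assuming the epoch is exactly 00Z on 31 December 1900), the wind day of a calendar time can be computed as:

```python
# Convert a calendar date/time to a HYCOM "wind day": days since 00Z on
# 31 December 1900, the time coordinate used in NRL-format wind files.
from datetime import datetime

EPOCH = datetime(1900, 12, 31)  # 00Z 31 Dec 1900

def wind_day(when):
    delta = when - EPOCH
    return delta.days + delta.seconds / 86400.0

# 16 Jan 1904 gives 1111.00, matching the first wind day in the
# COADS example discussed below.
print(wind_day(datetime(1904, 1, 16)))  # 1111.0
```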
The wind_stat command, located in the bin subdirectory, summarizes the contents of the native wind or flux data file. Figure 5 gives an example of the COADS data file summary. Note that Qr is based on the COADS constrained net flux, and Qp=Qr+Qlw. The wind days (1111.00 to 1477.00) and dates (16.00/1904 - 16.00/1905) are ignored by the interpolation programs; all that matters is that the file contains 12 monthly sets of fields. The COADS wind grid specification parameters are outlined in Table 8.
[Table 8: COADS wind grid specification parameters: iwi, jwi, xfin, yfin, dxin, dyin, wmks, hmks, rmks, pmks.]

7.2 Wind offset
Before interpolating COADS winds to the HYCOM grid, a wind offset input file must be read in. The offset file allows the annual mean wind to come from a different wind data set and is usually set to zero. Therefore, the first step for wind generation is to create the file tau[en]wd_zero.[ab] using the script tauXwd_zero.com in the offset subdirectory for the specified model region. The offset can also be a different field for each sample time, which allows a climatology and an anomaly field to be combined.
7.3
The programs WNDINT and FLXINT interpolate wind or flux files from their native grid to the HYCOM model grid, using either piecewise bilinear or cubic spline interpolation. WNDINT and FLXINT are used as part of the standard HYCOM run script to produce the 6- or 12-hourly winds or fluxes for actual calendar days from the operational center wind or flux files. This is done just in time (i.e., the files for the next model segment are generated while the current model segment is running).

The only difference in the program WNDINT between the files wi_100_co.f and wi_1125_ec.f (for the one-degree COADS winds and 1.125-degree ECMWF winds, respectively) is the inclusion of different WIND*.h include files that define the native wind and/or flux geometry and units for each file. The most common allowable grid types for global atmospheric data sets are uniform latitude/longitude or uniform in longitude with Gaussian global latitude. Use the script WIND_update.com to create source files for other known native grids from the *_100_co.f version. The script should be updated when another native grid is added.
[Table 9: atmospheric forcing files. Input files: COADS native data and tau[en]wd_zero.[ab]. Output files: airtmp.[ab], precip.[ab], radflx.[ab], shwflx.[ab], tauewd.[ab], taunwd.[ab], vapmix.[ab], wndspd.[ab].]
7.4
Output
The output data sets consist of atmospheric forcing in MKS units, with heat flux positive into the ocean. The output files consist of the COADS fields interpolated to the HYCOM grid in HYCOM 2.1 array (.a) and header (.b) format. The current HYCOM atmospheric forcing output and input files are listed in Table 9, and an example of wind stress output is shown in Figure 6.

The output files also include any bias or minimum wind speed constraints. For example, in the subdirectory ATLb2.00/force/coads, compare coads_mon_flux.com (zero bias) to coads_mon_flux+070w.com (70w bias). An all-zero precipitation file input to HYCOM indicates that there should be no evaporation-precipitation surface salinity forcing (see precip_zero*). By default the output wind stress components are on the native u and v grids, but setting the NAMELIST variable IGRID=2 will output the wind stress components on the pressure grid (which is always used for all other atmospheric forcing fields). When running HYCOM, use wndflg=1 for u/v winds and wndflg=2 for winds on the pressure grid (wndflg is set in blkdat.input). Note that in all cases the HYCOM wind stresses are oriented with respect to the local HYCOM grid (i.e., taunwd (tau_y) is only actually northward when the HYCOM grid is E-W/N-S). The actual surface forcing fields and their units are found in Table 9.
i/jdm,iref,reflon,equat,gridsz/la = 57 52 1 -97.000
tau_ewd: month,range = 01 -1.3818002E-01 2.1997201E-01
tau_ewd: month,range = 02 -1.3790721E-01 1.8771732E-01
tau_ewd: month,range = 03 -1.3121982E-01 1.4483750E-01
tau_ewd: month,range = 04 -1.1269118E-01 8.3349183E-02
tau_ewd: month,range = 05 -1.0441105E-01 7.3009044E-02
tau_ewd: month,range = 06 -1.3582233E-01 6.2626213E-02
tau_ewd: month,range = 07 -1.4753306E-01 5.9464872E-02
tau_ewd: month,range = 08 -1.0999266E-01 6.8312079E-02
tau_ewd: month,range = 09 -1.0449981E-01 1.0520227E-01
tau_ewd: month,range = 10 -9.0205058E-02 1.4911072E-01
tau_ewd: month,range = 11 -8.7021284E-02 1.7461090E-01
tau_ewd: month,range = 12 -1.2721729E-01 1.9768250E-01
ajax 97> hycom_range tauewd.a 57 52
min, max = -0.13818002 0.21997201
min, max = -0.1379072 0.18771732
min, max = -0.13121982 0.1448375
min, max = -0.11269118 0.08334918
min, max = -0.10441105 0.073009043
min, max = -0.13582233 0.06262621
min, max = -0.14753306 0.059464871
min, max = -0.10999266 0.06831208
min, max = -0.10449981 0.10520227
min, max = -0.09020506 0.14911072
min, max = -0.087021283 0.1746109
min, max = -0.1272173 0.1976825
12 FIELDS PROCESSED
Vertical structure parameters are selected when editing blkdat.input for each model simulation
(see Figure 7). The blkdat.input file is located in the ATLb2.00/expt subdirectory. If setting
up for a new domain, blkdat.input will need to be edited for the vertical structure chosen by
the user.
1. Begin by specifying the fixed vertical grid near the surface through the number of sigma-levels (nsigma, which is 0 for all z-levels).
2. Next, choose the minimum sigma thickness (parameter dp00s) and the minimum z-thickness (dp00). The kth layer's minimum z-thickness is dp00f**(k-1)*dp00, but if k is less than nsigma the minimum thickness is the smaller of this value and the larger of dp00s and depth/nsigma. This approach gives z-levels in deep water, optionally going to sigma-levels in coastal water and back to z-levels in very shallow water.
3. Finally, select a z-level stretching factor (parameter dp00f, which is 1 for uniform z-levels).
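The minimum-thickness rule described above can be sketched as follows. Parameter names follow blkdat.input, but treat this as an illustration of the stated formula, not HYCOM source code:

```python
# Sketch of the minimum layer-thickness rule: z-levels stretched by dp00f,
# optionally limited in the near-surface sigma-coordinate range.

def min_thickness(k, depth, nsigma, dp00, dp00s, dp00f):
    """Minimum thickness of layer k (1-based) at a point of given depth."""
    dz = dp00f ** (k - 1) * dp00          # stretched z-level minimum
    if k < nsigma:                        # sigma-coordinate range near surface
        dz = min(dz, max(dp00s, depth / nsigma))
    return dz

# With nsigma=0 every layer is a z-level: uniform dp00 when dp00f=1,
# geometrically stretched otherwise.
print(min_thickness(1, 5000.0, 0, 3.0, 1.0, 1.125))  # 3.0
print(min_thickness(3, 5000.0, 0, 3.0, 1.0, 1.125))  # 3.0 * 1.125**2
```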
In blkdat.input these parameters appear as follows:

   0     nsigma = number of sigma levels
   3.0   dp00   = deep z-level minimum thickness
  12.0   dp00x  = deep z-level maximum thickness
   1.125 dp00f  = deep z-level stretching factor
   3.0   ds00   = shallow z-level minimum thickness
  12.0   ds00x  = shallow z-level maximum thickness
   1.125 ds00f  = shallow z-level stretching factor
The next step in setting up HYCOM is to interpolate the temperature and salinity climatology to
the model domain and the vertical structure. There are three steps for setting up the climatology
for model initialization:
1. Obtain temperature and salinity data files,
2. Interpolate climatology to the HYCOM grid using the program TSZINT, and
3. Convert climatology from z-levels to isopycnals using the program RELAX.
9.1
The climatology data sets are input on their native grid from standard Naval Research Laboratory (NRL) LEVITUS climatology files. LEVITUS files in the required format are available from the HYCOM ftp site: ftp://obelix.rsmas.miami.edu/awall/hycom/levitus_for_hycom.tar.gz. The user can use the clim_stat command, located in the bin subdirectory, to list the contents of a native climatology file. All fields in the native file must be defined at all grid points, including over land and below the ocean floor. In addition, the potential density vertical profile must be stable at all locations. Table 10 gives the grid specification parameters for the climatology files.
[Table 10: climatology grid specification parameters: iwi, jwi, xfin, yfin, dxin, dyin.]
9.2
The process of interpolating LEVITUS climatology files has been split into two phases. This saves time because the z-level climatology does not depend upon the isopycnals chosen by a particular HYCOM simulation. The first phase generates a formatted model-grid climatology file suitable for input to the HYCOM isopycnal climatology generator. The climatology is first interpolated to the HYCOM horizontal grid at its native fixed z-levels by the program TSZINT (in z_levitus.f). The user must run the script z_levitus_sig[02].com, which calls TSZINT; this script is located in the ATLb2.00/relax/levitus subdirectory. The interpolation is performed using either piecewise bilinear or cubic spline. The interpolated climatology is defined at all grid points (again, including land and below the ocean floor), and its potential density vertical profile is stable at all locations. The native LEVITUS climatology is defined using sigma0, but the interpolated climatology may be set as sigma0, sigma2, or sigma4.
9.3
The second phase is vertical mapping from z-levels to isopycnals, which is based on Rainer Bleck's re-step procedure for converting one stair-step (i.e., piecewise constant) set of profiles
(in this case between z-levels) into another with prescribed density steps. (Table 11 lists the files for this phase. Input: dens_sig0_m01.[ab], the monthly LEVITUS potential density interpolated to the HYCOM horizontal grid by TSZINT, together with the corresponding salinity and temperature files interpolated the same way, plus a blkdat.input created from a previous blkdat.input file using the script blkdat.com. Output: relax.0000_###_00.[ab], relax_int.[ab], relax_sal.[ab], relax_tem.[ab].) The user must run the
script relax.com, which calls the program RELAX to perform the conversion of the interpolated climatology to the isopycnal climatology required for a particular HYCOM simulation. The region- and simulation-specific environment variables needed by program RELAX are located in EXPT.src; if the experiment or region is changed, this file must be modified. Program RELAX does not depend on the climatology being used, provided that all the native climatologies use the same number of z-levels in the vertical. Use relax_sig0.f for sigma0 and relax_sig2.f for sigma2 HYCOM density coordinates. The required input file, blkdat.input, is located in the relax/010 subdirectory and can be created from a previous experiment's version of blkdat.input using the script blkdat.com.
9.4
Output
The output fields generated from this procedure include climatological interface depth, potential temperature, and salinity for the specified set of isopycnal layers and minimum near-surface layer thicknesses (Table 11). The fields are output in array (.a) and header (.b) file format. Programs TSZINT and RELAX each handle a single set of climatology fields. Since HYCOM expects six bi-monthly or twelve monthly sets of fields in a single file, the individual output climatology files must be concatenated before use. The output fields can be plotted using the standard HYCOM archive file plot program HYCOMPROC. Note that relax_zon[02].f, a special case of relax_sig[02].f, writes out zonal interface depth averages. It can be used to calculate region-specific values for zonal initialization via HYCOM's poflat.f.
The HYCOM climatology can be used for initializing the model (iniflg=2), for surface
relaxation to augment surface atmospheric forcing (trelax=1 and/or srelax=1), and for lateral
boundary nudging (relax=1). In the latter case, a relaxation mask is required to specify where
and how much relaxation to apply. It can be generated by rmu.f.
9.5
A 2-D relaxation mask is required for any HYCOM simulation that uses lateral boundary nudging. It is typically zero everywhere except in the boundary regions where relaxation is to be applied; there, the mask is set to the relaxation scale factor (1/second). Use program RLXMSK (rmu.f) to specify the boundary relaxation zones (see relax_rmu.com). Input is up to 99 individual patches and the associated e-folding time in days (converted internally to 1/e-folding-time in seconds for assignment to the rmu mask).
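The conversion from an e-folding time in days to the mask value in 1/seconds is simply:

```python
# The relaxation mask holds 1/e-folding-time in 1/seconds; the RLXMSK
# input is an e-folding time in days (illustrative sketch).

def rmu_value(efold_days):
    """Relaxation scale factor (1/s) for an e-folding time in days."""
    return 1.0 / (efold_days * 86400.0)

# e.g. a 20-day e-folding time at an open boundary
print(rmu_value(20.0))  # about 5.787e-07 1/s
```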
9.6
The relax/999 subdirectory contains scripts for generating zonal statistics that can be used to customize the pdat array in HYCOM subroutine poflat. The poflat subroutine represents the depth of potential densities 21.0, 21.5, ..., 28.0 in a specified latitude range. Note that this is only used for initialization, and only when iniflg=1.

The script blkdat.com must be run first to generate a blkdat.input file. The next step is to run the script relax_zon.com, the primary zonal climatology script that generates the statistics. The script sig_lat.com extracts the values needed for poflat.
The following sequence of commands are used to generate the statistics:
>csh blkdat.com
>csh relax_zon.com >& relax_zon.log
>./sig
>csh sig_lat.com >& sig_lat.log
The ./sig command generates the zonal tables and plots using the gnuplot script sig.gnu. The awk script sig_lat.awk has been provided for interpolating to a given latitude. The last step is to use the following command:

>cut -c 13-18 sig_lat.log | paste -s -d ",,,,,,,,,,,,,,,\n" - | sed -e s/ *//g -e s/^....../ +/ >! sig_lat.tbl

to extract sig_lat.tbl from sig_lat.log. This only needs minor editing for use in poflat (see sig_lat.data).
10 Configuring and Compiling the Model Code
The makefiles for the HYCOM setup programs and for the model simulation source machine-specific configuration files that contain the architecture type and/or the parallelization strategy type. The diagnostics in ALL are completely separate from the model code. The first step the user must take is to compile the setup programs in the ALL directory, which is domain independent:
1. Make sure a setup program configuration file ($(ARCH)_setup) exists for the particular machine architecture being used. If it does not exist, the user will have to create one.
2. Edit Make_all.src.
3. Run Make_all.com from the ALL root directory.
Then, for each model domain, the user will need to do the following:
1. If setting up HYCOM for a new stand-alone region, create a new source code directory for the region.
2. Check the available machine-specific configuration files ($(ARCH)_$(TYPE)) to be sure a file exists for the particular machine architecture and type of system the model is being run on. If it does not exist, the user will have to create one.
3. Edit the script Make.com and the dimensions.h file.
4. Run Make.com from the source code directory (e.g., ATLb2.00/src_*).
The following subsections of this chapter detail how to configure the files for the setup programs and for the model domain, and how to edit and run the compilations for each step. Further explanation of compiling the code for the Atlantic 2.00-degree domain or a new domain is also presented.
10.1 Setup Programs

10.1.1 Configuration Files for Setup Programs
HYCOM Version 2.1 has several setup program configuration files available for the makefiles to source. These files are named $(ARCH)_setup, where ARCH defines exactly what machine architecture to target. They are located in the setup program directory ALL under the config subdirectory (see Table 12). These files contain the environment variables needed to run HYCOM setup programs for a specific machine architecture. For machines not listed in Table 12, new $(ARCH)_setup files must be created by the user. See Table 13 for a list of the environment variables that must be defined in the $(ARCH)_setup file, with their descriptions.

The file $(ARCH)_setup is typically identical to HYCOM's standard configuration file $(ARCH)_one4, if real is real*4. If real is real*8 but real*4 is available, $(ARCH)_setup is like $(ARCH)_one except that the macro REAL4 is set (in addition to REAL8).
Table 12: Setup program configuration files.

File            Description
alpha_setup     Compaq Alpha.
alphaL_setup    Compaq Alpha Linux.
intel_setup     Intel Linux/pgf90.
intelIFC_setup  Intel Linux/ifc (little-endian).
o2k_setup       SGI Origin 2800.
sp3_setup       IBM SMP Power3.
sun_setup       Sun (32-bit).
sun64_setup     Sun (64-bit).
t3e_setup       Cray T3E.

Table 13: Environment variables in configuration files.

Variable    Description
FC          Fortran 90 compiler.
FCFFLAGS    Fortran 90 compilation flags.
CC          C compiler.
CCFLAGS     C compilation flags.
CPP         CPP preprocessor (may be implied by FC).
CPPFLAGS    CPP -D macro flags (see README.macros).
LD          Loader.
LDFLAGS     Loader flags.
EXTRALIBS   Extra local libraries (if any).
Some IBM SP filesystems (e.g., GPFS) cannot be used to compile Fortran modules. If the src directory is on such a filesystem, use TYPE=sp3GPFS instead of TYPE=sp3 (i.e., the configuration file is sp3GPFS_setup instead of sp3_setup). This version does the compile on a non-GPFS filesystem, currently set to /scratch/$(USER)/NOT_GPFS. Since all compiles use this directory, only perform one make at a time.
In addition, make suffix rules are required for creating object (.o) files from the program
files (.c, .f, and .F) (see Figure 9). Note that the rule command lines start with a tab character.
#
# rules.
#
.c.o:
	$(CC) $(CPPFLAGS) $(CCFLAGS) -c $*.c
.f.o:
	$(FC) $(FCFFLAGS) -c $*.f
.F.o:
	$(FC) $(CPPFLAGS) $(FCFFLAGS) -c $*.F
10.2
10.2.1
The first step in compiling the setup programs is to edit the file Make_all.src for the correct machine architecture (ARCH). All copies of Make_all.src are hard linked together, so the file needs to be edited only once. The */src/Makefiles are configured to key on ../../config/$(ARCH)_setup for machine-dependent definitions; for example, when running on a Linux PC, ARCH is intel.

Once Make_all.src has been edited, run Make_all.com from the ALL root directory using the following command:
csh Make_all.com

Running this command (make all) builds all executables from scratch. The script Make_clean.com removes all machine-specific executables but should only be required when updating to a new compiler version. Issue the csh Make_clean.com command in the ALL root directory to run Make_clean.com in each */src directory.
On a new machine type, Make_all.com should be run to recompile all the *.[Ffc] source codes to create executables ending in machinetype, where machinetype is typically the output of uname; each is soft linked to the standard executable name. The c-shell scripts clim_stat, wind_stat, and hycom_sigma invoke *_machinetype using a hardwired path. The path, and possibly the machinetype definition, may need modifying for the user's particular setup. The gnuplot plotting package is also used by hycom_sigma, and its location must be specified. Invoking the command csh Make_all.com will generate a warning if the c-shell scripts need modifying.

Make_all.com in ALL/bin does not use Make_all.src, but it should only need editing if you are running Solaris and would prefer 64-bit to 32-bit executables. Running Make_all.com in the ALL root directory invokes ALL/bin/Make_all.com.
10.3 Model Code

10.3.1 Configuration Files for Model Run
The configuration files for the model run are found in the ATLb2.00 directory under the subdirectory config. They are named $(ARCH)_$(TYPE), where ARCH defines the machine architecture and TYPE is the parallelization strategy and precision (ONE, ONE4, SETUP, OMP, MPI, MPISR, or SHMEM). Table 14 provides a list of available machine-specific configuration files.
10.3.2
The example source directory (src_2.1.03_22_one) and scripts (expt_01.5/*.com) are currently configured for a single processor. To compile HYCOM, simply run Make.com from the src_* directory. The executable is then created by invoking the following command:

./Make.com >& Make.log

If HYCOM is being run on a different system configuration, the script Make.com in the src_* directory will have to be edited to define $ARCH appropriately for the machine, and dimensions.h will need to be modified for different shared memory types (one, omp) and distributed memory types (mpi, ompi, shmem) (see Section 5.2). There is no need to create an executable for every parallelization technique, just for the one that you plan to actually use.
[Table 14: model configuration files ($(ARCH)_$(TYPE)): alpha_one{4}, alpha_omp, alphaL_one{4}, intel_one{4}, intel_omp, o2k_one{4}, o2k_omp, o2k_mpi, o2k_shmem, sp3_one{4}, sp3_omp, sp3_q64omp, sp3_ompi, sp3_mpi, sun64_one{4}, sun64_omp, sun64_ompi, sun64_mpi, sun_one{4}, sun_omp, sun_ompi, sun_mpi, t3e_one, t3e_mpi, t3e_shmem.]
32
Macro
ALPHA
AIX
ARCTIC
DEBUG ALL
DEBUG TIMER
ENDIAN IO
IA32
MPI
MPISR
NOMPIR8
REAL4
REAL8
RINGB
SERIAL IO
SHMEM
SGI
SSEND
SUN
TIMER
T3E
YMP
Macro
REAL4
REAL8
ALPHA
AIX
ENDIAN IO
HPUX
IA32
SGI
SUN
T3E
YMP
10
10.3
Model Code
Macro
BARRIER
HEADER
MPI ISEND
MPI SEND
MTYPED
MTYPEI
MTYPER
SHMEM GETD
SHMEM GETI
SHMEM GETR
SHMEM MYPE
SHMEM NPES
33
11
9. Change the model region and/or domain size. dimensions.h is the only source code file that should need changing for a new region or a different number of layers. Refer back to Section 4.2 for more details.

10. To change to a new model region and/or domain size, see Sections 5 and 22.
12 Running HYCOM
Before beginning a HYCOM model run, the bin directory must be present in the user's primary path. The bin directory contains HYCOM commands and aliases that may be used throughout the run. Appendix A provides a complete list of HYCOM utility commands and their definitions. For commands without manual pages, the header of the script or the source code contains usage information; invoking the command with no arguments will print a single-line usage message.
The process of running a simulation is optimized for batch systems, but will also work interactively. The basic procedure is that each invocation of the ocean model results in a run for a fixed length of (model) time (e.g., one month, three months, one year, or five years). Each invocation has an associated run script that identifies which year or part year is involved (e.g., 015y005.com or 015y001a.com, where 015 is the experiment number). The year is indicated by y followed by three digits, and if there are multiple parts per year this is indicated by a letter following the year digits. All of the scripts mentioned in the following sections can be found in the ATLb2.00/expt subdirectory. The msub source codes are located in the ALL/bin subdirectory.
12.1
Each actual model script is created from a template script using an awk command. For example,
015.awk modifies the template script 015.com. The number of years per run can be changed
by editing ny in 015.awk and ymx in 015.com. The 015.awk and 015.com files are presently
configured for one year runs, as described by # and C comment lines therein. Actual scripts for
single model jobs for the first three years, for example, could be generated manually using the
following:
awk -f 015.awk y01=1 015.com > 015y001.com
awk -f 015.awk y01=2 015.com > 015y002.com
awk -f 015.awk y01=3 015.com > 015y003.com
If 015.awk were configured for six month runs (by setting np=2 in 015.awk; where np is
the number of parts that the year is divided into), the two scripts for the first year could be
generated manually using:
awk -f 015.awk y01=1 ab=a 015.com > 015y001a.com
awk -f 015.awk y01=1 ab=b 015.com > 015y001b.com
12.2
Manual generation of scripts is rarely necessary. The process has been automated for batch
runs. There are several command choices for the user to perform a batch run, depending on the
type of queuing system used:
msub_codine (for CODINE batch)
msub_grd (for GRD batch)
msub_ll (for LoadLeveler batch)
msub_lsf (for LSF batch)
msub_nqs (for NQS/NQE batch)
msub_pbs (for PBS batch).
Any of these also work with the default msub command, msub_csh, for interactive background runs.

These scripts read the first line of the LIST file generated by mlist (see below). The scripts either generate a new segment script (if the line is of the form year segment, such as "001 a", or of the form year, such as "001"), or they use the indicated existing script (e.g., 015y001.com). The new script is run, and upon its completion the first line is removed from LIST, and the job either exits (if LIST is empty), cycles again (based on the number of segments it is configured to run), or is resubmitted for the next segment. The number of segments per job should be chosen based on batch run time limits, and is specified by a foreach loop in the script; this is currently five in 015lsf.com:
C
C --- Number of segments specified by ( SEG1 SEG2 ... ) in foreach.
C --- Use ( SEG1 ) for a single segment per run.
C
foreach seg (SEG1 SEG2 SEG3 SEG4 SEG5)
Therefore, the first step to running in batch mode is to generate a LIST file for a sequence
of years. This is done by invoking the command mlist. For example,
mlist 1 30 1
generates a list of model years 1 to 30 in steps of 1 year. Note that mlist will only be invocable
by name if the hycom/ALL/bin directory is in your environment variable $PATH. The advantage
of separating out the run list, in LIST, from the batch cycling, via msub, is that this gives much
finer control of the run process.
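A minimal stand-in for mlist can be sketched as follows. The line formats ("001", or "001 a" for per-segment runs) follow the examples in the text; the real mlist script may differ in detail:

```python
# Generate the lines of a LIST file naming model years first..last in the
# given step, optionally with one line per segment letter.

def make_list(first, last, step, segments=None):
    lines = []
    for year in range(first, last + 1, step):
        if segments:
            lines += ["%03d %s" % (year, s) for s in segments]
        else:
            lines.append("%03d" % year)
    return lines

print(make_list(1, 3, 1))        # ['001', '002', '003']
print(make_list(1, 1, 1, "ab"))  # ['001 a', '001 b']
```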
The command msub (msub_csh, msub_codine, msub_grd, msub_ll, msub_lsf, msub_nqs, or msub_pbs) then runs the script. For example, type

msub_nqs 015nqs.com 01

to run the script under an NQS/NQE queuing system. In this command line, the final two digits identify the job number.
When running in batch (in addition to setting the number of segments per job), the batch
script will need configuring to request the appropriate resources. This is batch system specific,
but it usually involves editing the header of the corresponding batch script. For example, under
LoadLeveler, lines that start with #@ are interpreted by the batch system and 015rll.com is
configured to run for two wall hours on a single (4-processor) node with four MPI (or OpenMP)
tasks:
#!/bin/csh
#
#@ job_name         = XXXrll
#@ output           = $(job_name).log
#@ error            = $(job_name).log
#@ restart          = yes
#@ job_type         = parallel
#@ network.MPI      = css0,not_shared,US
#@ environment      = MP_EUILIB=us
#@ node             = 1
#@ total_tasks      = 4
#@ node_usage       = not_shared
#@ wall_clock_limit = 2:00:00
#@ account_no       = NRLSS018
#@ class            = batch
#@ queue
#
Some of the above lines will definitely need editing for your local setup. Note that msub_ll inserts the correct name on the job_name line, so XXXrll is fine here for any experiment number.
12.3
Dummy*.com
The dummy*.com scripts do nothing at all, and can be inserted into LIST if you want a particular alignment of runs within a .log file. For example, if there are four segments per year and 051lsf.com runs four segments, then normally one year will be in each .log file. However, if the simulation starts in the summer, the entire first year can still be in the first .log file by using the following LIST configuration: 051y001C.com, 001 d, dummyA.com, dummyB.com, 002 a, 002 b, 002 c, 002 d. In this case the first script, 051y001C.com, would differ from the standard 051y001c.com script (automatically generated by 001 c in LIST) by replacing LIMITS with LIMITI to generate an initial limits file. The dummy scripts can similarly be used to maintain alignment of runs in the .log file after restarting from a model crash.

Note that if the batch system crashes, there will be RUNNING* files left in this directory that must be deleted before resubmitting the job.
12.4
The file blkdat.input contains the input parameters that must be set prior to running HYCOM.
Generate this file from a previously written blkdat.input file using the blkdat.com script. Edit
the blkdat.input file for region-specific and experiment-specific parameter changes before the
model is run. Appendix B contains definitions for model input parameters in blkdat.input.
13
13.1
HYCOM contains two programs, HYCOM_MEAN and HYCOM_STD, that can be used to calculate the mean, mean-squared, and standard deviation of a sequence of archive files. HYCOM_STD calculates the standard deviation of archive files using the mean and mean-squared files generated by the program HYCOM_MEAN. The reason for using mean and mean-squared files to generate the standard deviation (rather than generating it from the mean and the original archives) is that this approach allows incremental calculations. For example, you can generate the annual mean and mean-square from each year of the run as it becomes available and later merge them together to form five-year mean and mean-square files (and then a five-year standard deviation file). This is done with the same HYCOM_MEAN program, which can be used incrementally (i.e., a previously calculated mean or mean-squared file can be part of the input to form extended means). For example, in eddy-resolving cases it would be typical to produce and plot monthly or seasonal means and then combine them into annual and multi-year means and standard deviations.
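The incremental approach works because the standard deviation depends only on the mean and mean-square (variance = mean of x squared minus square of the mean). A scalar sketch of merging per-period statistics, illustrative rather than the HYCOM_MEAN algorithm itself:

```python
# Combine per-period means and mean-squares (weighted by sample counts)
# into an overall mean and standard deviation, without revisiting the
# original samples.
import math

def merge(means, mean_sqs, counts):
    n = sum(counts)
    mean = sum(m * c for m, c in zip(means, counts)) / n
    mean_sq = sum(s * c for s, c in zip(mean_sqs, counts)) / n
    return mean, math.sqrt(max(mean_sq - mean * mean, 0.0))

# Two "years" of data, [1, 2] and [3, 5], merged from their statistics:
m, sd = merge([1.5, 4.0], [2.5, 17.0], [2, 2])
print(m)  # 2.75, the mean of [1, 2, 3, 5]
```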
The layered means are weighted by the layer thickness, but the mixed layer means and all non-layered means are simple means. Weighting by layer thickness is clearly the right approach for isopycnal layers (since the means are then in density space) and is equivalent to a simple mean for any constant-thickness layers near the surface. However, layers that are sometimes isopycnal and sometimes near-surface (constant thickness) can be difficult to interpret. Seasonal means may help keep the two layer modes separate.
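A thickness-weighted mean can be sketched as follows (an illustration of the weighting, not HYCOM code):

```python
# Each sample's contribution is proportional to the layer thickness at
# that time; for constant thicknesses this reduces to a simple mean.

def thickness_weighted_mean(values, thicknesses):
    total = sum(thicknesses)
    return sum(v * h for v, h in zip(values, thicknesses)) / total

# A layer 100 m thick in summer and 10 m thick in winter: the weighted
# mean density stays close to the summer (isopycnal) value.
print(thickness_weighted_mean([27.0, 26.0], [100.0, 10.0]))
```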
To run HYCOM_MEAN, the script 010_mn+sq_0020.com has been provided in the subdirectory ATLb2.00/meanstd. The script 010_std_0020.com is run to form standard deviations.
13.2
The next section on Plotting Results (Section 14) details how to plot the resulting mean and standard deviation files. Example input files, 010y020MN.* and 010y020SD.*, have been provided in the ATLb2.00/plot subdirectory.

Note: Be careful when interpreting vertical sections that use depth (rather than layer number) as the vertical axis. These associate mean layer quantities with the mean location of the layer. If the layer's location or thickness is highly variable, then its mean location and thickness may not be as good a representation of the layer as you expect. For example, suppose a layer were 100 m thick (isopycnal) in the summer but inside the mixed layer in the winter, where it might only be a 10 m z-layer. Then its mean, layer-thickness-weighted density would be close to its summer (isopycnal) value, but its mean thickness would be 55 m, which isn't close to either its winter or summer value. It could also be in a location that it almost never occupies in the water column. Using seasonal means should help reduce the impact of such hybrid coordinate variability.
14 Plotting Results
There are several plotting options available in HYCOM. The user can generate plots from model output archive files and also from any HYCOM .a file produced during the configuration process. The directory ALL/plot contains the source code for plotting HYCOM archive files and 2-D fields using NCAR graphics. Alternatively, fields can be output in several common data formats by programs in ALL/archive and then plotted with the user's preferred graphics package. See Section 17 for extracting 2-D and 3-D diagnostic fields into data files (in several formats) suitable for plotting by other graphics packages.
HYCOM has two standard plotting packages available, HYCOMPROC and FIELDPROC. The plot program HYCOMPROC can plot x-y surface fields, x-y layers, x-z slices, and y-z slices. It can also plot a surface-field-only archive file by setting kk=1 in the source file hycomproc.f. In addition, surface mean or standard deviation files can be plotted (see 010srf*.IN and ../meanstd/README.meanstd). The directory ALL/relax/plot contains example plots produced from the dummy archive version of each monthly climatology using HYCOMPROC.

The program FIELDPROC, which is based on HYCOMPROC and has similar input, will plot any 2-D horizontal scalar field from a HYCOM .a data file. It will also plot fixed z-depth x-y plots after first running the program ARCHV2DATA3Z to interpolate hybrid layers from the archive file to a fixed depth. Once the interpolation is performed, the program FIELDPROC can be run (see Section 17.6).
Both of these plotting packages are completely region independent and can display the full
domain or a sub-region. The number of plots per frame, spacing of latitude/longitude labels
and grid lines, and the location of the contour label and color bar are all specified at run-time.
All plots use NCAR graphics and are in logical space (i.e., every grid cell is the same size on the
plot).
There are three main steps in using the HYCOM plot programs. These steps are outlined
below:
1. Compile the plot programs,
2. Create a plot input file (*.IN),
3. Run the plot program.
In the following sections, these steps will be explained in detail.
14.1
Compiling the Plot Programs
Typically, all executables are created just once by editing the script Make_all.src for the correct ARCH and then issuing the command csh Make_all.com as described in Section 10, Configuring and Compiling the Model Code. However, the user can perform individual makes if needed. The plotting makefile is configured to key on the ../../config/$(ARCH) setup for machine-dependent definitions (see Section 10.1). After ascertaining that the correct setup configuration file is present, the user can issue an individual make command for the desired program. For example, when running on a Linux PC, ARCH is intel.
14.2
Creating a Plot Input File
Before running either of the plotting packages, an input file must be generated by the user. An
example input file is provided in Appendix C. The input file tells the plotting package what data
to plot and how it should be plotted. The first line of the input text file identifies the dummy
archive file to plot. For example, the only difference between the summer and winter input files
010y020s.IN and 010y021w.IN is the first line of each file:
../expt_01.0/data/archv.0020_196_00.b
versus
../expt_01.0/data/archv.0021_016_00.b
The first line identifies the archive file to plot and so must be different in every case. HYCOM
1.0 archive files, which are signaled by their filename (without an .a or .b), can also be plotted
using HYCOMPROC and/or FIELDPROC in this way. Several input files and their output have
been generated for the example experiment 1.0 run on the Atlantic 2.00 domain (See directories
ATLb2.00/relax/plot and ATLb2.00/force/plot). These files can be used as further examples of
*.IN files input to the plot programs.
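The examples above suggest the archive naming convention: archv.0021_016_00.b is model year 0021, ordinal day 016, hour 00 (day 196 of the summer file falls in mid-July, day 016 of the winter file in January). A quick shell sketch of this reading of the convention, which can help when scripting over many archives:

```shell
f=archv.0021_016_00.b
stem=${f#archv.}          # drop the archv. prefix
stem=${stem%.b}           # drop the .b extension -> 0021_016_00
IFS=_ read year day hour <<EOF
$stem
EOF
echo "year=$year day=$day hour=$hour"
```

The same pattern applies to the first line of any *.IN file, so a script can generate input files for a whole sequence of archive days.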
The input file also allows very fine control over exactly what is plotted. For example, not all layers need to be included in the list of layer-by-layer plots. In fact, the same layer can appear more than once (giving fine control over the plot order). A negative number for the parameter kf indicates that the layer number should be displayed on the plot; otherwise, the layer's nominal isopycnal density is displayed. Note that the parameters noisec and nojsec (the numbers of z-sections to plot) must be followed by exactly the specified number of isec and jsec lines, respectively.
[Table 18: color options, kpalet = 0 through 6]
If the color palette is multi-color (kpalet>1) and a positive contour interval is specified,
then the next input value is center to identify the central value of the color bar range. The
actual range then depends on the number of distinct colors in the palette (either 64 or 100).
The color options available in HYCOM are listed in Table 18.
14.3
Aliases
The aliases in alias.src (source alias.src) can be used to simplify the generation of plots. The usage of these aliases is straightforward. For example, the command fp2ps coads_airtmp.IN will generate the files coads_airtmp.log and coads_airtmp.ps. Any of the aliases can be used to generate plots by following this formulation: type Xp2yy plot.IN to create plot.log and plot.yy, or Xp2x plot.IN to create plot.log and an X11 window. The plotting aliases and their functions are listed in Table 19.
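The alias names appear to follow a simple pattern (inferred from the fp2ps example and the Table 19 names hp2ps, hp2gv, hp2x, mp2ps, mp2gv, mp2x, fp2ps, fp2gv, fp2x): the first letter selects the program (h = HYCOMPROC, m = MICOMPROC, f = FIELDPROC) and the suffix selects the destination (ps = PostScript file, gv = ghostview, x = X11 window). A sketch of the output-name derivation the aliases perform:

```shell
in=coads_airtmp.IN
base=${in%.IN}        # strip the .IN extension
log=${base}.log       # run-time log written by the plot program
ps=${base}.ps         # PostScript output
echo "$log $ps"
```

This is an illustration of the naming convention only, not the contents of alias.src itself.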
14.4
Troubleshooting HYCOMPROC
Note that the contouring subroutine CONREC has a fixed-size workspace buffer. If the run-time error AREA-MAP ARRAY OVERFLOW appears, try increasing the size of the parameter
lgthmp in the file conrec.f. Similarly, MCS TOO SMALL refers to the parameter lgthwk in
conrec.f. These parameters are currently set relatively large by default.
14.5
Plotting a Sub-region
In order to plot a subregion, the location of the sub-region is set at run time by specifying the
location (iorign,jorign) on the full grid of (1,1) on the subregion grid and its size (idmp,jdmp).
For the full region, iorign=jorign=1 and idmp=jdmp=0. All other input parameters can be
the same for the full region and a subregion, but note that isec and jsec are with respect to
the subregion rather than the full region.
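Because (iorign,jorign) on the full grid maps to (1,1) on the subregion, a section index known on the full grid converts to its subregion value by subtracting the origin. A minimal arithmetic sketch (the index values below are hypothetical, not taken from any supplied *.IN file):

```shell
iorign=55            # hypothetical sub-region origin on the full grid
i_full=70            # a section location known on the full grid
i_sub=$(( i_full - iorign + 1 ))
echo "use isec=$i_sub in the sub-region input file"
```

The same conversion applies to jsec with jorign.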
14.6
MICOMPROC
The plotting program MICOMPROC is used to plot MICOM archive files. The source file micomproc.f is identical to hycomproc.f except that lhycom is false. The parameter iexpt must be explicitly specified in the input file for MICOMPROC (since it is not in the archive file); otherwise the input file is identical to the HYCOMPROC input file. Since MICOM is in CGS and uses a N-S then W-E grid orientation, the input is immediately rotated (to W-E then S-N) and converted to MKS. Note that the bottom topography and all of the input parameters are always from HYCOM.
The advantage of using essentially the same program to plot both models is that the display
layout is identical and only one plot program needs to be maintained. Note also that HYCOM
now has a MICOM-like isopycnal mode. This produces HYCOM archive files that are plotted
using HYCOMPROC. However, the existence of MICOMPROC makes it very easy to compare
isopycnal HYCOM simulations with any corresponding actual MICOM cases.
Another option is to convert MICOM (and/or HYCOM 1.0) pakked archive files to HYCOM 2.0 .[ab] format using the program HYCOMARCHV (see Section 17.1). The resulting files can then be plotted with HYCOMPROC.
15
Table 20 contains one line descriptions of all HYCOM I/O routines. All of these routines are
assumed to be called with identical argument lists by all processors when using SPMD message
passing. This is not difficult to arrange, since by default all routines are called in this manner
in a SPMD run. The ZAIO routines are used to process HYCOM .a files, which contain array
data only. The ZAGETC routine is used to process HYCOM .b plain text files. These are
only opened on the first processor, so under MPI ZAGETC reads a line on the first processor
and then broadcasts it to all other processors.
Two versions of each subroutine are provided, mod_za_mp.F for message passing and mod_za_sm.F for a single processor (and OpenMP). The appropriate version is included in mod_za.F under control of cpp macros. The routines are configured as a module, and all HYCOM routines should start with use mod_za to allow HYCOM communication routines to be invoked when required.
A special version of each subroutine is also in ALL/libsrc/mod_za.F. This implements the identical set of subroutines, but for pre- or post-processing programs only. These are all single processor programs, and ALL/libsrc/mod_za.F is therefore similar to mod_za_sm.F except that the array size idm, jdm is set at run time. A related API with za replaced by zb is in ALL/libsrc/mod_zb.F. It is only used when reading the full domain (via ZAIORD) but writing out a sub-domain (via ZBIOWR).
[Table 20: ZAGETC, ZAIOST, ZAIOPN, ZAIOPE, ZAIOPF, ZAIOPI, ZAIOCL, ZAIOFL, ZAIOIQ, ZAIORW, ZAIORD3, ZAIORD, ZAIOSK, ZAIOWR3, ZAIOWR]
16
Table 21 contains one line descriptions of all HYCOM communication routines. With the exception of XCHALT, all of these routines are assumed to be called with identical argument lists by
all processors when using SPMD message passing. This is not difficult to arrange, since by default all routines are called in this manner in a SPMD run. Most communication routines act as
implicit barriers that synchronize processor state (i.e., when a processor exits a communication
routine, all processors that must communicate with it have entered the same subroutine). In
addition, the subroutine XCSYNC has been provided for cases where all processors must enter
a critical section of code before the first processor exits.
Two versions of each subroutine are provided, mod_xc_mp.F for message passing and mod_xc_sm.F for a single processor. The appropriate version is included in mod_xc.F under control of cpp macros. The routines are configured as a module, and all HYCOM routines should start with use mod_xc to allow HYCOM communication routines to be invoked when required.
The programs in the ATLb2.00/src/TEST subdirectory confirm that individual communication
routines are working.
[Table 21: XCAGET, XCAPUT, XCEGET, XCEPUT, XCHALT, XCLGET, XCLPUT, XCMAXR, XCMINR, XCSPMD, XCSTOP, XCSUM, XCSUMJ, XCSYNC, XCTBAR, XCTILR, XCTMRI, XCTMR0, XCTMR1, XCTMRN, XCTMRP]
17
HYCOM contains many programs that can be used to modify HYCOM archive files or convert them to other file formats. These programs are located in the ALL/archive directory. Several of these source routines are identical to the source files found in the ALL/plot directory, because both sets of programs perform similar archive processing. They are not hard-linked together, so any modification of a program in one directory must be manually propagated to the other by the user.
Typically, all (non-netCDF) executables are created just once by editing Make_all.src for the correct ARCH and then issuing the command csh Make_all.com (see Section 10, Configuring and Compiling the Model Code). Executables that use the netCDF library (version 3.5) are created just once by editing Make_ncdf.com for the correct root directory in NCDF and then issuing the command csh Make_ncdf.com. The netCDF library is available at http://www.unidata.ucar.edu/packages/netcdf/. These executables are optional; ignore Make_ncdf.com if you don't want to use NetCDF.
The following sections list the archive programs and describe their usage in HYCOM.
17.1
HYCOMARCHV
The program HYCOMARCHV converts a MICOM or HYCOM 1.0 archive file to HYCOM 2.0. It can also be used to generate a new .b file corresponding to an .a HYCOM 2.0 archive file (e.g., if the original .b file is corrupted). This is illustrated by the script archv_010_0021_016.com, which creates a new archive file as *.[AB]:
../expt_01.0/data/archv.0021_016_00.A
../expt_01.0/data/archv.0021_016_00.B
../expt_01.0/data/archv.0021_016_00.a
../expt_01.0/data/archv.0021_016_00.b
In this case the original .b file is correct, so the .B file is identical (except that diafx fields are missing). If the original .b file is missing, then a dummy version must be provided that has the correct time step and model day but can have the wrong minimum and maximum values (e.g., create one by editing any existing .b file with the same number of layers).
17.2
TRIM_ARCHV
The program TRIM_ARCHV will modify the number of layers in a HYCOM 2.0 archive file. It is primarily used in the process of generating sub-region archive files for nested boundary conditions (e.g., when the nested (sub-region) domain has a subset of the layers used by the enclosing domain). Layers can only be added at the top of the water column (e.g., for converting isopycnal cases to a hybrid vertical coordinate) or removed at the bottom of the water column (e.g., to remove dense layers that don't exist in the sub-region). See Section 19.1.4.
17.3
MRGL_ARCHV
The program MRGL_ARCHV also modifies the number of layers in a HYCOM 2.0 archive file by combining several layers into one. It is primarily used diagnostically (e.g., to plot a water mass that consists of several layers).
17.4
ARCHV2RESTART
The program ARCHV2RESTART creates a HYCOM 2.X restart file from an archive file. Since the archive file contains only one time level, it is duplicated to provide the two time levels needed for restart. In addition, an example restart file is input to obtain the few fields that are not in the archive file.
17.5
17.6
ARCHV2DATA3Z and ARCHV2NCDF3Z
The programs ARCHV2DATA3Z and ARCHV2NCDF3Z extract diagnostic fields at fixed z-level depths in several file data formats. They are identical except that ARCHV2NCDF3Z includes the option of outputting NetCDF files and requires the NetCDF version 3.5 library. If you don't need NetCDF, then use ARCHV2DATA3Z.
The program ARCHV2DATA3Z interpolates hybrid layers from an archive file to fixed
depths. Like HYCOMPROC, the output can be for a sub-region. Output can be formatted, unformatted (BINARY), or .[ab] (HYCOM). The .ab files can be plotted directly using
FIELDPROC. Many plot packages can read fully raw binary files, but may not handle padded
arrays. Note that HYCOM .a files can be converted to raw files with an arbitrary data void
value using the HYCOM2RAW program in the bin subdirectory.
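The padding matters when computing record offsets in a .a file: each 2-D field occupies idm*jdm 32-bit values, padded up to a multiple of 4096 values (this record layout is our understanding of the .a format; verify against your own files before relying on it). For the 2-degree Atlantic grid dimensions that appear later in this guide:

```shell
idm=57; jdm=52                            # grid size (values from the partition table in Section 20)
n=$(( idm * jdm ))                        # values in one 2-D field
nrec=$(( ( n + 4095 ) / 4096 * 4096 ))    # values per record after padding
echo "field=$n values, record=$nrec values ($(( nrec * 4 )) bytes)"
```

A package that reads the file as a flat stream of idm*jdm values per field will drift out of alignment by nrec-n values per record unless it accounts for this.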
The data*_010_0021_016.com scripts are currently configured to write HYCOM .[ab] files and to convert the .a file to a raw .A file.
17.7
ARCHM* Programs
The ARCHV* programs can read all HYCOM 1.0 and HYCOM 2.X archive files. There are also corresponding ARCHM* programs that read MICOM files. For example, ARCHM2NCDF2D and ARCHM2NCDF2Z provide a NetCDF capability for MICOM archive files. Since MICOM is in CGS and uses a N-S then W-E grid orientation, the input is immediately rotated (to HYCOM's W-E then S-N grid) and converted to MKS. Note that the bottom topography and all the input parameters are always from HYCOM.
17.8
NetCDF Files
NetCDF files are self-describing and can be plotted and otherwise processed by a wide array of packages. This is our recommended format for diagnostic files. The files conform to the NetCDF Climate and Forecast (CF) metadata conventions, which were chosen because they allow for curvilinear grids. This is a new convention that is not yet widely supported, but it is an extension of the popular COARDS conventions, which means that many existing NetCDF packages provide at least partial support.
18
Sampling Transport
18.1
Sampling Transport
The ALL/sample subdirectory contains source code for sampling transport across specified sections from HYCOM archive files or mean archive files. The following scripts are located in the ATLb2.00/sample subdirectory:
010M020-020.com
010M020-020mn.com
010y020-020.com
010y020-020mn.com
link.com
The link.com script is typically edited for each new region and run just once to define softlinks to topography for that region.
18.1.1
BARO VEL
The program BARO VEL will extract barotropic (depth averaged) velocity at every point along
a list of sections from a sequence of archive files. The resulting plain text barotropic velocity
profiles can be plotted by many graphics packages, including gnuplot.
18.1.2
TRANSPORT
The program TRANSPORT is used in the script 010y020-020.com to sample the transport across a list of sections from a sequence of archive files. The sections are specified by end points in p-grid array space: if, il, jf, jl with either if=il or jf=jl. Sections are therefore either zonal or meridional. Transports are positive for a net current from right to left when standing at
(if,jf) facing towards (il,jl). Note that max(if,il) can be greater than idm in periodic
(global) cases and max(jf,jl) can be greater than jdm in arctic dipole patch (global) cases. It
is possible to add several consecutive sections together by using special names for the sections:
@0 means skip this section, @+ means add to the next section, and @- means subtract from the next section. The @+ and @- transports are carried over to the first following section that
does not have a name starting with @. These names are processed at the statistics production
phase, by MEANTSPT, but are defined and placed into the output sample transport file by
the transport program. This output file is plain text, and can be edited if statistics are desired from a different combination of sections than originally specified. For example, @0 would not normally be in the original sample file but can replace the names of sections that are not
desired in a particular set of statistics.
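The carry-over rules can be illustrated with a toy sample file (the section names, values, and two-column layout below are made up for illustration; the real transport sample file format differs):

```shell
# toy transport sample: section name, transport
cat > tspt.txt <<'EOF'
@+ 10.0
@- 2.5
Florida_Strait 30.0
@0 99.0
Yucatan 28.0
EOF
# accumulate @+ / @- into the next named section, skip @0 sections
result=$(awk '$1=="@+" {carry+=$2; next}
              $1=="@-" {carry-=$2; next}
              $1=="@0" {next}
              {print $1, $2+carry; carry=0}' tspt.txt)
echo "$result"
```

Here the first named section receives 30.0 + 10.0 - 2.5 = 37.5, the @0 section is dropped, and Yucatan is unchanged.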
18.1.3
TRANSP_MN
The program TRANSP_MN is used in the script 010M020-020.com to sample the transport across a list of sections from a mean archive file. Only one mean archive file is input, but the output is a transport sample file covering all the model days that made up the mean, each with
identical transport. This file has identical form to those produced by the transport program, but
because all times have identical transports the statistics (from MEANTSPT) will have accurate
means but zero variability.
18.1.4
TRANSP_MN_2P0
The program TRANSP_MN_2P0 will sample the transport across a list of sections from a single mean archive file generated by the HYCOM 2.0 MEANSTD program. Only the barotropic transport (i.e., total transport across all layers) is sampled.
18.1.5
MERGETSPT
The program MERGETSPT will merge transport sample files that contain identical sections
but for different time periods.
18.1.6
MEANTSPT
The program MEANTSPT is used in 010[yM]020-020mn.com to produce mean and variability statistics (zero variability from mean archives) from a transport section data file generated by TRANSPORT, TRANSP_MN (zero variability), or TRANSP_MN_2P0 (mean total transport only).
As illustrated in the example scripts, it is possible to combine a set of consecutive layers in the
statistics (e.g., to start with 22 layers but write statistics for only five multi-layer combinations).
19
Nesting in HYCOM
The directory ALL/subregion contains domain-independent source code for the extraction of a subregion from an archive file. The target can have the same grid resolution as the original, or be finer than the original by an integer multiplier. The general case (non-integer grid refinement and/or different grid orientation) is not yet supported. The following section details nesting at a finer resolution using the example files for subregion IASb0.50 that are provided in HYCOM. The files and scripts mentioned are located in directories ALL/subregion/src, ALL/topo/src, ATLb2.00/subregion, and IASb0.50/topo. Section 19.2 explains nesting at the same horizontal resolution, and example files for subregion IASd0.32 are provided to the user.
19.1
IASb0.50 is a sub-region of the Atlantic 2.00 degree domain (ATLb2.00) at four times higher
resolution, and illustrates how to nest a subregion in a larger HYCOM model region with different
horizontal resolution. This is off-line one-way nesting, using boundary conditions similar (but
not identical) to those already used for this purpose by MICOM.
19.1.1
Input Files
The IASb0.50 HYCOM model does not know about the ATLb2.00 domain. It expects a
sequence of IASb0.50 input archive files to supply the data needed for the boundary conditions.
In fact, there are two distinct sets of boundary conditions: relaxation to T/S/p in a buffer zone,
and application of depth averaged flow at the open boundary. Both are input from archive
files, however T/S/p is only available from full 3-D archive files and depth averaged flow is also
available from surface archive files. So archive input for depth averaged flow could be more
frequent than for T/S/p. Nesting in MICOM was similar, except that relaxation to velocity was
also used in the buffer zone and the relaxation e-folding time on p/vel was much shorter than
on T/S.
19.1.2
The nested domain must be finer than the original by an integer multiplier (ijgrd). In addition, subregion p(1,1) and p(idm_out,jdm_out) must be on the original p-grid (i.e., idm_out-1 and jdm_out-1 must be integer multiples of ijgrd). The general cases of non-integer grid refinement or different grid orientation are not yet supported. Don't forget to allow for the fact that the buffer zone (typically at least 10 fine grid points) should probably be outside the region of high interest (i.e., make idm_out and jdm_out larger to allow for this).
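The co-location constraint is a simple divisibility check that is worth scripting before any grid files are generated (the dimensions below are hypothetical, chosen only to satisfy the rule):

```shell
ijgrd=4                      # refinement factor relative to the enclosing grid
idm_out=101; jdm_out=97      # candidate sub-region dimensions (hypothetical)
if [ $(( (idm_out - 1) % ijgrd )) -eq 0 ] && [ $(( (jdm_out - 1) % ijgrd )) -eq 0 ]
then
    ok=yes
else
    ok=no
fi
echo "corner points on original p-grid: $ok"
```

If the check fails, grow idm_out or jdm_out to the next value for which (size-1) is a multiple of ijgrd.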
Typically the subregion's regional.grid.com script (located in the topo directory) will be similar to that of the enclosing region. The program HYCOM_IJ2LONLAT can be used to find co-located points on the two grids. Since subregion p(1,1) must be on the original grid, this is usually the point to reference. For example:
>hycom_ij2lonlat 1 1 ~/hycom/IASb0.50/topo/regional.grid.a
  97.000W    3.997N
>hycom_ij2lonlat 1 13 ~/hycom/ATLb2.00/topo/regional.grid.a
  97.000W    3.997N
Obviously, HYCOM_IJ2LONLAT can't be used on the subregion until its regional.grid.[ab] has been produced. But it can be used to identify the location on the enclosing grid that will become the subregion p(1,1), and this can then be used as a guide to configure regional.grid.com for the subregion.
19.1.3
It is advisable to make the sub-region bathymetry and coastline exactly consistent with the
coarser enclosing region, both on the open boundary and in the relaxation buffer zone. Everywhere else the bathymetry and coastline can be optimized for the higher resolution. For
example, to create the IASb0.50 bathymetry do the following:
1. Generate the grid bathymetry and coastline at the finest resolution possible for the entire region. In this example, the bathymetry that is generated is depth_IASb0.50_02.[ab].
2. Interpolate the coarse enclosing bathymetry to the nested region using the program ISUB_TOPOG. This program is called by the script depth_IASb0.50_99.com, which produces depth_IASb0.50_99.[ab]; this is further edited into depth_IASb0.50_98.[ab].
3. Merge the two bathymetries (02, 98) using the program TOPO_MERGE, which selects the coarse depths and coastline in the buffer zone, a combination near the buffer zone, and the fine depths and coastline everywhere else. This program is called by the script depth_IASb0.50_03_merge.com, which produces the final bathymetry: depth_IASb0.50_03.[ab].
19.1.4
The nesting-related model parameters are set in blkdat.input, for example:
 1.0   'bnstfq' =
 6.0   'nestfq' =
 2     'lbflag' =
 3     'iniflg' =
Here, iniflg is also set to 3 since the initial restart will be from an archive file (see below). The archive-to-restart conversion is done off-line, so subroutine INICON is never called and iniflg is not used. The advantage of setting iniflg to 3 (versus the usual 2) is that this may remove the need for relaxation file input, or limit this input to surface fields only.
In addition, the location of barotropic boundaries must be specified in the file ports.input.
For example (from IASb0.50/expt_01.0/ports.input):
  2    'nports' =
  1    'kdport' =
 39    'ifport' =
 92    'ilport' =
 65    'jfport' =
 65    'jlport' =
  3    'kdport' =
 93    'ifport' =
 93    'ilport' =
  3    'jfport' =
 64    'jlport' =
This is for two open boundaries on the northern and eastern edges of the region rectangle.
Note that northern and southern boundaries are specified on the v-grid (since v-velocity is
normal to these boundaries), and eastern and western boundaries are specified on the u-grid.
Each boundary location is a grid point just outside the model region. Correctly positioned open boundaries appear as asterisks on the iu and iv maps printed in the model run .log file. If the boundary locations are not specified correctly, the model will stop and the iu and/or iv maps will contain 9s instead of asterisks at the locations that are in error. Some errors (e.g., boundaries that are too short) can't be detected by HYCOM, so always check the iu and iv maps when initially configuring a nested domain.
Note that the nesting buffer zone relaxation is completely independent of climatology buffer
zone relaxation. Both could be active in the same model run. Nesting barotropic boundary
conditions cannot be used in combination with port (lbflag=1), specified inflow/outflow forcing.
The very first run with the nested boundaries needs a restart file consistent with the enclosing region. This is obtained from a sub-region archive file using the program RESTART_ARCHV located in directory ALL/restart. An existing restart file is required by this process. If none is available, generate a climatology over the sub-region (just as for a closed domain) and run for 1 day with closed boundaries and iniflg=2. Outside the nested buffer zone, the fine and coarse bathymetry and coastline may be significantly different. This might cause problems on
restart. One option to ISUBREGION that might help in this case (but is not needed normally) is to smooth the layer thicknesses (smooth=1). Using a smaller time step for the very first run might also allow it to accept a sub-optimal interpolated restart file. In general, it is a good idea to make the fine and coarse coastlines as compatible as possible, which is most easily done by always generating a fine reference coastline/bathymetry for the enclosing region and then subsampling it to the desired resolution. This won't be possible when dealing with existing bathymetries, but it is the recommended way to produce new bathymetries.
19.2
IASd0.32 is a sub-region of ATLd0.32, and illustrates how to nest a subregion in a larger HYCOM
model region with the same horizontal resolution. Nesting at the same resolution, as here, is not
very interesting, but is a special case of nesting inside a coarser resolution region. Most of the
process is the same for any enclosing region. This is off-line one-way nesting, using boundary
conditions similar to those already used for this purpose by MICOM.
As in IASb0.50, the IASd0.32 HYCOM model does not know about the ATLd0.32 domain.
It expects a sequence of IASd0.32 input archive files to supply the data needed for the boundary
conditions. These conditions are the same as IASb0.50, explained in Section 19.1.1.
To generate IASd0.32 archive files from ATLd0.32, do the following:
1. Use the program SUBREGION to extract a subregion from a full region HYCOM 2.0 archive file, but note that the result for 3-D archives will have 26 layers. For example, see ATLd0.32/subregion/081y010.com.
2. Use TRIM_ARCHV to reduce from 26 to 22 layers. For example, see IASd0.32/plot/ATLd0.32_081y010.com.
Once the archive files have been generated, the procedure is the same as for nesting HYCOM
at a finer horizontal resolution (Section 19.1.4). The archive files are used as boundary conditions, and the location of the barotropic boundaries must be specified in the file ports.input.
The first run of the model uses a restart file, which is obtained from an archive file using the program RESTART_ARCHV.
20
Parallel Processing
20.1
To configure the run script, there are two environment variables, NOMP and NMPI, that must be
set depending on the type of processing being used. NOMP is the number of OpenMP threads, and
NMPI is the number of MPI tasks. NOMP should be set to 0 for no OpenMP, or 1 for interactive
OpenMP. NMPI should be set to 0 for no MPI. These variables are explicitly set near the top
of the script, but note that this explicit setting might be modified based on batch limits. The
script is currently configured to do this for LSF, Codine, and GRD batch systems. Explicit 0
values are preserved, and when both NOMP and NMPI are non-zero the NOMP value is preserved
(i.e., NMPI is modified to conform to the batch limit).
When running on a single processor (TYPE=one), set NOMP and NMPI to 0. When using
OpenMP alone (TYPE=omp), set NMPI to 0 and NOMP to the number of OpenMP threads (i.e.
number of shared memory processors used). When using SHMEM, set NOMP to 0 and NMPI to
the number of SHMEM tasks. When using MPI alone, set NOMP to 0 and NMPI to the number
of MPI tasks. When using MPI and OpenMP together, set both NOMP and NMPI above 1; the total number of processors is then $NOMP times $NMPI. Also, be careful to ensure that if a node in the cluster runs (say) N MPI tasks, it has at least N times $NOMP processors. For example, an IBM SP WinterHawk II has 4 processors per node, so it can run up to 4 MPI tasks per node without OpenMP (NOMP=0 or 1), up to 2 MPI tasks per node with NOMP=2, or 1 MPI task per node with NOMP=3 or 4.
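The bookkeeping above reduces to two products; a sketch with hypothetical values:

```shell
NOMP=2; NMPI=8
total=$(( NOMP * NMPI ))            # processors the whole job will use
tasks_per_node=2                    # MPI tasks placed on one node (hypothetical)
need=$(( tasks_per_node * NOMP ))   # processors that node must provide
echo "total=$total need_per_node=$need"
```

On the 4-processor WinterHawk II example, need must not exceed 4, which reproduces the task-per-node limits listed above.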
The model run script assumes that all data files are on a globally accessible shared file system.
Some low-cost MPI-based systems (e.g., Beowulf Clusters) do not have a shared file system that
is accessible by all nodes. In such cases the script must be modified to use a local file system
on each node. When doing this, note that .a and .b files are both read and written on the
very first MPI task only. The script will typically run on the same node as the first MPI task,
which means that all .a and .b files are probably already handled correctly for local disks.
However, the .input files may need to be broadcast to all nodes (e.g., by rcp) as part of the
script.
20.2
patch.input
When using MPI or SHMEM, an additional file, patch.input, is required to control the domain decomposition into tiles. This is assumed to be located at ../topo/partit/depth*.xxx, where xxx is $NMPI as a 3-digit number.
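The 3-digit suffix can be generated with printf; the partition file basename below follows the depth_ATLa2.00_01 example used in the next subsection:

```shell
NMPI=16
suffix=$(printf '%3.3d' "$NMPI")
echo "../topo/partit/depth_ATLa2.00_01.$suffix"
```

A run script can build the expected patch.input path this way and fail early if the partition file for the requested task count does not exist.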
20.3
When generating equal-ocean tile partitions for use with MPI parallelization, all scripts will
require renaming and/or editing for a different region or bathymetry. The following steps show
how to generate the tile partitions:
1. Edit depth_ATLa2.00_01_2d.com to contain the desired numbers of processors.
2. Run it using the command:
csh depth_ATLa2.00_01_2d.com >& depth_ATLa2.00_01_2d.log
This will create the partition text files depth_ATLa2.00_01.xxx.
3. To view the partitions, generate .ppm bitmaps with csh ppm.com and display them using an X-Windows viewer such as XV (or another bitmap viewer). Note that xbathy.pal must be present for this to work.
4. Generate a list of partition statistics with csh size.com. This produces size.lis, for
example:
npes  npe  mpe  idm  jdm  ibig  jbig  nreg  minsea  maxsea
   4    2    2   57   52    34    26     0     408     439
   8    4    2   57   52    24    26     0     196     222
   9    3    3   57   52    33    19     0     168     195
  16    4    4   57   52    34    15     0      94     110
5. The maximum values in the four columns npe, mpe, ibig, and jbig should be entered in src_*_mpi/dimensions.h as iqr, jqr, idm, and jdm. This allows any of the partitions to be used with the same executable.
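Step 5 can be automated with awk; the column positions below assume the header order shown in the size.lis example above:

```shell
# recreate the example size.lis from the text above
cat > size.lis <<'EOF'
npes  npe  mpe  idm  jdm  ibig  jbig  nreg  minsea  maxsea
   4    2    2   57   52    34    26     0     408     439
   8    4    2   57   52    24    26     0     196     222
   9    3    3   57   52    33    19     0     168     195
  16    4    4   57   52    34    15     0      94     110
EOF
# column maxima: npe -> iqr, mpe -> jqr, ibig -> idm, jbig -> jdm
awk 'NR>1 { if ($2>a) a=$2; if ($3>b) b=$3; if ($6>c) c=$6; if ($7>d) d=$7 }
     END  { printf "iqr=%d jqr=%d idm=%d jdm=%d\n", a, b, c, d }' size.lis
```

For this example the values to enter in dimensions.h are iqr=4, jqr=4, idm=34, jdm=26.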
20.4
Comparing Runs
20.4.1
Pipe.f
The source code in pipe.f controls the use of named pipes to compare two identical runs, the master versus the slave. These typically differ only in the number of processors used (i.e., in the values of NOMP and NMPI). Often the master is on a single processor (i.e., NMPI=0 and NOMP=0 or 1). The comparison is made for every element of every array sent over the named pipe from the slave to the master. By default, most of the significant model arrays are compared after each major phase of each time step. If an error (i.e., a difference between master and slave runs) is detected, additional calls to COMPARALL or COMPARE can be added in the subroutine that introduced the error to find out exactly which OpenMP loop needs modifying. Named pipes can be difficult to use from Fortran. The current open statement has worked on all machines tried so far, but it might require modification on a new machine.
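The mechanism can be demonstrated in miniature at the shell level (this is only an analogy for the Fortran PIPE_MASTER/PIPE_SLAVE arrangement, not HYCOM code): one process writes array data into a named pipe, the other reads it back for comparison.

```shell
pipe=/tmp/pipe_demo.$$
mkfifo "$pipe"                            # create the named pipe
( echo "slave array data" > "$pipe" ) &   # the "slave" side writes into it
msg=$(cat "$pipe")                        # the "master" side reads and would compare
wait
rm -f "$pipe"
echo "$msg"
```

As with the Fortran version, both ends must see the same file system object, which is why the first task of each run must be on the same node.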
The first task of each run must be on the same node (i.e., the same O/S image), for the
named pipe to work. This will almost always be the case when comparing two OpenMP runs,
but can be harder to arrange for MPI runs. The easiest MPI-based named pipe comparisons to
make are with a single processor master without MPI, but because this involves two different
executables make sure they were created with identical compiler options (typically TYPE=mpi
with TYPE=one and TYPE=ompi with TYPE=omp).
The script 024y001T_pipe.com illustrates how to configure an OpenMP test run. It creates a named pipe and two separate data directories, dataT01 and dataT03. The named pipe is linked to PIPE_MASTER in dataT01 and to PIPE_SLAVE in dataT03; the existence of these filenames switches on the named pipe comparison. Finally, the two twin run scripts, 024y001T01.com and 024y001T03.com, are run in the background and the job waits for both to end. If no errors are detected, 024y001T_pipe.com will end normally. However, if the two runs do not produce exactly identical results, the master will terminate but the slave will hang and must be killed manually. The location of the difference will be at the end of the file PIPE_base.out in the master's scratch dataT01 directory.
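In outline, such a pipe-comparison job can be wired up by hand. This sketch only mirrors the setup described above (directory and pipe names assumed, not taken from the real script), with the actual model runs left commented out:

```shell
cd "$(mktemp -d)"                    # stand-in for the experiment directory
mkdir dataT01 dataT03                # master and slave data directories
mkfifo pipe0                         # the shared named pipe
ln -s ../pipe0 dataT01/PIPE_MASTER   # these filenames switch the comparison on
ln -s ../pipe0 dataT03/PIPE_SLAVE
# ./024y001T01.com &                 # master run (e.g., 1 thread)
# ./024y001T03.com &                 # slave run  (e.g., 3 threads)
# wait                               # job ends normally only if the runs agree
```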
The two run scripts are identical except for the data directory and the number of threads used. They are almost identical to a standard run script, 024.com, except that they use 024y001T.limits as their limits file and do not copy their results back to the permanent directory.
If the file PIPE_DEBUG exists in the scratch data directory, it switches on single-point (at itest,jtest) diagnostic printout to the .log file from every call to COMPARALL. This can be used with or without the named pipe comparison; it simply uses the named-pipe subroutine interface for diagnostic printout. The standard run script includes the following lines, which can be uncommented to turn on this capability:
#C
#C turn on detailed debugging.
#C
#touch PIPE_DEBUG
21
The experiment 01.5 demonstration run in the directory ATLb2.00/expt 01.5 is configured for
a North Atlantic domain with 2-degree horizontal resolution and 16 coordinate surfaces in the
vertical. Forcing is from COADS, plus relaxation to Levitus climatology in boundary zones, and
relaxation to Levitus surface salinity. Initialization is to summer Levitus climatological interface
pressures, salinity, and temperature. The mixed-layer formulation is KPP.
In order to run this example, and later on modify the code for specific applications, the user
must do the following (assuming a single processor, i.e., no parallelization):
1. Compile hycom in the ATLb2.00/src_2.1.03_22_one directory with the command:
./Make.com >& Make.log
Note that this compilation is for exactly 22 layers. A 26-layer HYCOM would be compiled
in a different source directory, ATLb2.00/src_2.1.03_26_one, with kdm=26 in dimensions.h.
2. If input and output files are to reside on the same machine as that from which the model
is run, modify the script 015.com, located in the directory ATLb2.00/expt_01.5, to replace
pput and pget by cp in the lines corresponding to your operating system. (See Section 2
for more information.)
3. Modify the script 015.com to set P as the primary path (default is ./data subdirectory),
to set D as the data subdirectory (default sets D to P), and to set S as the scratch directory
(machine-dependent). If you only have one filesystem on one machine, set S to $D/WORK
(as an example), so that the data and scratch directories are distinct.
4. Create or edit LIST to include the sequence of model years to run. For example,
../bin/mlist 1 5 1 will create a LIST file to run the first five years as five one-year
runs.
5. Defining a special 015y001.limits file allows the run to start in the summer of the first
year. Note that the start date in 015y001.limits should be -180.0; zero or negative values
indicate an initial run (rather than a restart).
6. Submit the demo run by issuing ../bin/msub 015nqs.com 01, where the appropriate
015xxx.com batch script should be used and the appropriate variant of msub for the local
batch system should first be made the default via a softlink. Note that msub_csh is for
running without a batch system, as a background interactive job, and it works with all
variants of 015xxx.com.
(Inclusion of the hycom/ALL/bin directory in the run script's command path removes
the need to specify the full path for any command in that directory. If this directory is
not included in the run script's command path, spurious error messages may be generated
by use of the null command C as a comment indicator. See Appendix A.)
7. The output files will be in the permanent data subdirectory D defined in 015.com. Note
that this may be on a different machine, depending on how pput and pget are defined.
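The sequence above can be compressed into a short transcript. Everything here is a sketch: the real scripts are C-shell, the LIST format is only approximated, and the end day in the limits file is a placeholder:

```shell
# 1. compile (in csh):  cd ATLb2.00/src_2.1.03_22_one ; ./Make.com >& Make.log
# 2-3. edit 015.com: replace pput/pget with cp; set P, D, and S
cd "$(mktemp -d)"                      # stand-in for the expt_01.5 directory
# 4. run the first five years as five one-year runs ("../bin/mlist 1 5 1")
seq 1 5 > LIST
# 5. start day -180.0: zero or negative means an initial run, not a restart
echo " -180.0  366.0" > 015y001.limits
# 6. submit (in csh):   ../bin/msub 015nqs.com 01
```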
For more information on changing run scripts and model configuration, see Sections 12.2
and 11.
22
The following information is a summary of the steps to take when setting up HYCOM for a new
region. These steps have already been incorporated into the previous sections of this document;
however, this list presents the information in a concise outline format. All pre- and
post-processing programs are now region-independent, but scripts typically still need editing for
each new region (e.g., to include the region name and the bathymetry version).
To set up HYCOM for a new stand-alone region:
1. Pick a region name, XXXaN.NN (e.g., IASb0.50, ATLb2.00, ATLd0.32). XXX is an
uppercase three-letter primary region name, a is a lowercase one-letter secondary region
name, and N.NN is a three-digit grid resolution description.
2. Create XXXaN.NN/topo/regional.grid.[ab] files that describe the location of the region
and grid.
3. In XXXaN.NN/topo, generate a bathymetry and a landsea mask.
4. In XXXaN.NN/force, interpolate atmospheric forcing fields to this region.
5. In XXXaN.NN/expt_01.0, choose a vertical structure and implement it in the blkdat.input
file.
6. In XXXaN.NN/relax/levitus, interpolate the Levitus climatology to this region and bathymetry
(still on the Levitus z-levels).
7. In XXXaN.NN/relax/010, interpolate Levitus onto the vertical structure chosen in the
experiment's blkdat.input file. Region-specific information is in EXPT.src.
8. In XXXaN.NN/src_2.1.03_MM_one (where MM is the number of layers), edit dimensions.h
for this domain and number of layers, and run Make.com. For multi-CPU runs, replace
one with the parallelization type.
9. In XXXaN.NN/expt_01.0, edit scripts as needed and run the simulation.
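The directory layout implied by steps 1-9 can be created up front. This is only a sketch for a hypothetical region IASb0.50 with a 22-layer single-processor build; the regional.grid files are created empty as placeholders:

```shell
R="$(mktemp -d)/IASb0.50"                      # XXX=IAS, a=b, N.NN=0.50
mkdir -p "$R/topo/partit" "$R/force" "$R/expt_01.0" \
         "$R/relax/levitus" "$R/relax/010" "$R/src_2.1.03_22_one"
touch "$R/topo/regional.grid.a" "$R/topo/regional.grid.b"   # step 2 output
```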
There are several scripts that aid in migrating region-specific scripts to a new region. These
include new_force.com in the ATLb2.00/force subdirectory, new_topo.com in the
ATLb2.00/topo subdirectory, new_topo.com in the ATLb2.00/topo/partit subdirectory, and
new_expt.com in the ATLb2.00/expt_01.5 subdirectory.
23 Technical References
23.1 HYCOM Software Documentation
Wallcraft, A., Halliwell, G., Bleck, R., Carroll, S., Kelly, K., and Rushing, K., (2002). Hybrid
Coordinate Ocean Model (HYCOM) Users Manual: Details of the numerical code. Technical Report #...
23.2
Bleck, R., (2001). An oceanic general circulation model framed in hybrid isopycnic-Cartesian
coordinates. Ocean Modelling, 4: 55-88.
Canuto, V.M., (2000). Ocean Turbulence: A model with shear stratification and salinity.
NASA Goddard Institute for Space Studies, Unpublished Manuscript.
Halliwell Jr., G.R., Bleck, R., and Chassignet, E.P., (1998). Atlantic ocean simulations performed using a new Hybrid Coordinate Ocean Model (HYCOM). EOS, Fall AGU Meeting.
Halliwell Jr., G.R., Bleck, R., Chassignet, E.P., and Smith, L.T., (2000). Mixed layer model validation in Atlantic ocean simulations using the Hybrid Coordinate Ocean Model (HYCOM).
EOS, 80, OS304.
Large, W.G., McWilliams, J.C. and Doney, S.C., (1994). Oceanic vertical mixing: A review and
a model with a nonlocal boundary layer parameterization. Rev. Geophys., 32: 363-403.
Large, W.G., Danabasoglu, G., Doney, S.C., and McWilliams, J.C., (1997). Sensitivity to surface forcing and boundary layer mixing in a global ocean model: Annual-mean climatology.
J. Phys. Oceanogr., 27: 2418-2447.
Levitus, S., (1982). Climatological atlas of the world ocean. NOAA/ERL GFDL Professional
Paper 13, Princeton, N.J., 173 pp. (NTIS PB83-184093).
Mellor, G.L. and Yamada, T., (1982). Development of a turbulence closure model for geophysical fluid problems. Rev. Geophys. Space Phys., 20: 851-875.
Mellor, G.L., (1998). Users guide for a three dimensional, primitive equation numerical ocean
model. AOS Program Report, Princeton University, Princeton, NJ. 34 pp.
Price, J.F., Weller, R.A., and Pinkel, R., (1986). Diurnal cycling: Observations and models of
the upper ocean response to diurnal heating, cooling and wind mixing. J. Geophys. Res.,
91: 8411-8427.
24 Acronyms
COADS      Comprehensive Ocean-Atmosphere Data Set
CODINE     COmputing in DIstributed Networked Environments (batch queuing system)
CPP        C PreProcessor
cp         Unix file copy command
ECMWF      European Centre for Medium-Range Weather Forecasts
ETOPO2     Earth Topography 2-minute gridded elevation data (NGDC)
GPFS       General Parallel File System (IBM)
GRD        Global Resource Director (batch queuing system)
HYCOM      HYbrid Coordinate Ocean Model
IAS        Intra-Americas Sea
IBM SMP    IBM Symmetric MultiProcessor
I/O        Input/Output
KPP        K-Profile Parameterization
LEVITUS    Levitus (1982) climatological atlas of the world ocean
LSF        Load Sharing Facility (batch queuing system)
MICOM      Miami Isopycnic Coordinate Ocean Model
MKS        Meter-Kilogram-Second system of units
MLB        MICOM Mixed Layer Base
MPI        Message Passing Interface
NetCDF     Network Common Data Form
NCAR       National Center for Atmospheric Research
NCARGF77   NCAR Graphics Fortran 77 compiler wrapper
NFS        Network File System
NGDC       National Geophysical Data Center
NQS        Network Queuing System
NQE        Network Queuing Environment
NRL        Naval Research Laboratory
PBS        Portable Batch System
PE         Processing Element
RCP        Remote CoPy (rcp command)
RMS        Root Mean Square
S          Salinity
SGI        Silicon Graphics, Inc.
SHMEM      SHared MEMory (one-sided communication library)
T          Temperature
XV         X-windows image viewer
25 Appendix A
25.1 HYCOM Utility Commands
The following table lists HYCOM utility commands and their function, and gives the filename
where the source code is found. These commands are located in the ALL/bin subdirectory. For
commands without manual pages, the header of the script or the source code contains usage
information. Also, invoking the command with no arguments will print a single line usage
message.
On a new machine type, the script Make_all.com should be run to recompile all the *.[Ffc]
source codes, creating executables ending in _machinetype (where machinetype is typically
the output of uname), which are softlinked to the standard executable name. The c-shell scripts
clim_stat, wind_stat, and hycom_sigma invoke *_machinetype using a hardwired path. The
path, and possibly the machinetype definition, may need modifying for your particular setup. The
gnuplot plotting package is also used by hycom_sigma, and its location must be specified. This can
all be achieved by invoking the command csh Make_all.com, which will warn you if the c-shell
scripts need modifying. The script Make_clean.com removes all machine-specific executables,
but should typically only be required when updating to a new compiler version.
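The machinetype naming scheme can be illustrated in isolation. The sketch below fakes a compiled tool with touch rather than running a compiler; only the _$(uname) suffix and the softlink reflect what Make_all.com actually produces:

```shell
cd "$(mktemp -d)"
M=$(uname)                           # machinetype, e.g. Linux or SunOS
touch "hycom_range_$M"               # stand-in for the compiled executable
chmod +x "hycom_range_$M"
ln -s "hycom_range_$M" hycom_range   # plain name is a softlink
```

Several machine types can then share one ALL/bin directory, each adding its own *_machinetype executables.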
Command               Associated files / description
clim_stat             (manual page clim_stat.1; source clim_stat.f)
echo2                 (source echo2.c)
hycom2raw             (source hycom2raw.F)
hycom_alat            (source hycom_alat.f)
hycom_depth           (source hycom_depth.f)
hycom_expr            (source hycom_expr.F)
hycom_ij2lonlat       (source hycom_ij2lonlat.F)
hycom_lonlat2ij       (source hycom_lonlat2ij.F)
hycom_mxthrd          (source hycom_mxthrd.F)
hycom_print           (source hycom_print.F)
hycom_profile         (source hycom_profile.F)
hycom_profile2z       (source hycom_profile2z.F)
hycom_range           (source hycom_range.F)
hycom_range_ij        (source hycom_range_ij.F)
hycom_rivers          (source hycom_rivers.F; data hycom_rivers.d)
hycom_sea_ok          (source hycom_sea_ok.F)
hycom_shift           (source hycom_shift.F, i.e., source code for hycom_shift_machinetype)
hycom_sigma           (source hycom_sigma.f; gnuplot commands hycom_sigma.gnu)
hycom_yoflat          (source hycom_yoflat.f)
hycom_zonal           (source hycom_zonal.F)
mdel
mlist                 Creates a LIST file of model years to run (see Section 21).
msub                  Submits a run script; softlink to one of the variants below.
msub_codine           msub for the CODINE batch system.
msub_csh              msub for running without a batch system (background interactive job).
msub_grd              msub for the GRD batch system.
msub_ll               msub for LoadLeveler.
msub_lsf              msub for LSF.
msub_nqs              msub for NQS.
msub_pbs              msub for PBS.
ncargf77              NCAR Graphics f77 wrapper (softlink to script).
ncargf77.4.1.1_SunOS  Version 4.1.1 script for Solaris.
ncargf90              NCAR Graphics f90 wrapper (softlink to script).
ncargf90.4.1.1_SunOS  Version 4.1.1 script for Solaris.
pget                  Get one file.
pget_rcp              Get one file (via rcp).
pput                  Put one file.
pput_rcp              Put one file (via rcp).
wind_stat             (manual page wind_stat.1; source wind_stat.f, i.e., source code for wind_stat_t3e)
26 Appendix B
26.1
Parameter
iversn
iexpt
mapflg
idm
pntlon
reflon
grdlon
jdm
pntlat
reflat
grdlat
itest
jtest
kdm
nhybrd
nsigma
dp00s
dp00
dp00x
dp00f
saln0
kapflg
thflag
thbase
sigma
iniflg
jerlv0
yrflag
dsurfq
diagfq
rstrfq
baclin
batrop
hybflg
advflg
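The parameters above are the ones set near the top of an experiment's blkdat.input file (Section 22, step 5). As a reminder of the format, each input line pairs a value with the quoted parameter name; the fragment below is illustrative only, with made-up values:

```
  22    'kdm   ' = number of layers
  22    'nhybrd' = number of hybrid levels
  14    'nsigma' = number of sigma levels
```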
Description
= +1 for free-slip, -1 for non-slip boundary conditions.
Fraction of diffusion that is biharmonic (0.0 to 1.0).
Deformation-dependent viscosity factor (nondimensional).
Diffusion velocity (m/s) for momentum dissipation.
Diffusion velocity (m/s) for thickness diffusion.
Diffusion velocity (m/s) for temperature/salinity diffusion.
Diffusion velocity (m/s) for momentum at MICOM Mixed Layer Base (MLB).
Root-mean-square (RMS) flow speed (m/s) for linear bottom friction.
Coefficient of quadratic bottom friction.
Thickness of bottom boundary layer (m).
Minimum density jump across interfaces (kg/m³).
Equivalent temperature jump across the mixed layer (degrees C).
Minimum mixed-layer thickness (m).
Ice model flag (0=none, 1=energy loan model).
Mixed layer flag (0=none, 1=KPP, 2=KTa, 3=KTb).
KT: Activate penetrating solar radiation (0=F, 1=T).
KT: Diapycnal mixing flag (0=none, 1=KPP, 2=explicit).
KT: Number of time steps between diapycnal mixing calculations.
KT: Diapycnal diffusivity x buoyancy frequency (m²/s²).
KT: Maximum permitted mixed-layer detrainment rate (m/day).
KPP: Activate shear instability mixing (0=F, 1=T).
KPP: Activate double diffusion mixing (0=F, 1=T).
KPP: Activate nonlocal bottom layer mixing (0=F, 1=T).
KPP: Activate horizontal smoothing of diffusivity coefficients (0=F, 1=T).
KPP: Value for calculating rshear instability.
KPP: Maximum viscosity due to shear instability (m²/s).
KPP: Maximum diffusivity due to shear instability (m²/s).
KPP: Background/internal wave viscosity (m²/s).
KPP: Background/internal wave diffusivity (m²/s).
KPP: Salt fingering diffusivity factor (m²/s).
KPP: Salt fingering rp=(alpha*delT)/(beta*delS).
KPP: Critical bulk Richardson number.
KPP: Value for nonlocal flux term.
KPP: Value for nonlocal flux term.
KPP: Value for turbulent shear contribution to bulk Richardson number.
KPP: Value for turbulent velocity scale.
KPP: Iterations for semi-implicit solution (2 recommended).
Climatology frequency flag (6=bimonthly, 12=monthly).
Lateral barotropic boundary flag (0=none, 1=port, 2=input).
Wind stress input flag (0=none, 1=u/v-grid, 2=p-grid).
Thermal forcing flag (0=none, 1=original, 2=new-flux-calculations).
Activate lateral boundary nudging (0=F, 1=T).
Activate surface salinity nudging (0=F, 1=T).
Activate surface temperature nudging (0=F, 1=T).
27 Appendix C
27.1 Sample Input File for Plotting - 990_cs2.IN