# Creating a new MITgcm domain
This page describes how to create a new MITgcm domain and all its components:
- The grid (including interpolation of topography and ice shelf draft)
- Initial conditions (including ice shelf pressure load anomaly)
- Open boundary conditions
- Iceberg meltwater (optional)
The code creates polar spherical grids (i.e. regular in lat-lon space), with constant resolution scaled by cos of latitude. It is designed for realistic Antarctic domains including ice shelf cavities, which don’t do anything too weird with open boundary conditions. It has not been tested for domains with a periodic boundary (e.g. circumpolar domains).
The code is quite efficient; for a quarter-degree Weddell Sea domain (460 x 374 x 66 points), it took me about 45 minutes to run everything from start to finish. However, it might take you longer to make decisions about what you want, get a hold of datasets, etc.
Some instructions (e.g. file paths to datasets) are only relevant to BAS staff. Some instructions also assume you are running MITgcm with the workflow used by most people at BAS (e.g. all files in `input/` are symlinked into `run/`).
To build the domain, you will need to clone the mitgcm_python repository and ideally be able to access it from any directory on your machine.
First, choose a directory to install the code in. I'll use `$HOME/python_code` as an example here. On the command line, `cd` into this directory.
To download the code:

```
git clone https://github.com/knaughten/mitgcm_python.git
```
This will make a directory called `mitgcm_python`. Note that this repository relies on some python tools distributed with MITgcm, which will need to be in your `PYTHONPATH`. It is also useful to add the mitgcm_python code itself to your `PYTHONPATH` so you can access it from any directory. So add this line to the bottom of your `~/.bashrc`:

```
export PYTHONPATH=$PYTHONPATH:$HOME/python_code:$ROOTDIR/utils/python/MITgcmutils
```

where `$ROOTDIR` is the path to your copy of the MITgcm source code distribution.
Then either log out and log in again, or type `source ~/.bashrc`. To make sure it worked, type `python` (or even better, `ipython`) and try these commands:

```python
import mitgcm_python
import MITgcmutils
```

If there are no error messages, everything imported successfully.
For this section, you only need one module from the repository. Every time you launch ipython (or python) during this section, type this line:

```python
from mitgcm_python.make_domain import *
```
In ipython,

```python
lon, lat = latlon_points(west_boundary, east_boundary, south_boundary, north_boundary, resolution, delY_filename)
```

For example, a grid from 85°W to 30°E, 84.2°S to 61°S, with 0.25° resolution:

```python
lon, lat = latlon_points(-85, 30, -84.2, -61, 0.25, 'delY_WSB')
```
The script will print the values of `Nx`, `Ny`, and their factors. You want `Nx` and `Ny` to have some factors in the 15-30 range, as these tend to be the most efficient tile sizes. If `Nx` and/or `Ny` don't have these factors, tweak your boundaries and try again until you're happy. Here is a nice looking result:

```
Nx = 460 which has the factors [1, 2, 4, 5, 10, 20, 23, 46, 92, 115, 230, 460]
Ny = 374 which has the factors [1, 2, 11, 17, 22, 34, 187, 374]
```
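If you want to try candidate grid sizes without re-running `latlon_points`, the factor check is easy to reproduce. This is a hypothetical sketch, not code from `mitgcm_python`:

```python
def factors(n):
    # All divisors of n, in ascending order
    return [f for f in range(1, n + 1) if n % f == 0]

def tile_friendly(n, lo=15, hi=30):
    # True if n has at least one divisor in the efficient tile-size range
    return any(lo <= f <= hi for f in factors(n))

print(factors(374))        # [1, 2, 11, 17, 22, 34, 187, 374]
print(tile_friendly(460))  # True (divisors 20 and 23 are in range)
```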
Make a note of the values of `xgOrigin`, `ygOrigin`, and `dxSpacing`, which are also printed by the script, as you will need them later. Note that these variables come from the input arguments to `latlon_points`: `xgOrigin` is your western boundary mod 360 (so it's in the range 0-360), `ygOrigin` is your southern boundary, and `dxSpacing` is your resolution.
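If you lose the printed values, they are easy to reconstruct from the arguments you passed to `latlon_points`. A sketch, using the example boundaries above (the actual printing happens inside `make_domain.py`):

```python
# Reconstruct the printed grid-origin values from the latlon_points arguments
west, south, resolution = -85, -84.2, 0.25
xgOrigin = west % 360    # western boundary mapped into the range 0-360
ygOrigin = south         # southern boundary, unchanged
dxSpacing = resolution   # resolution, unchanged
print(xgOrigin, ygOrigin, dxSpacing)  # 275 -84.2 0.25
```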
Without exiting ipython (so the variables `lon` and `lat` are still in memory),

```python
interp_bedmap2(lon, lat, topo_dir, nc_outfile, bed_file=bed_file)
```
where

- `topo_dir`: path to the directory containing the necessary BEDMAP2 binary files. Look at the function definition in `make_domain.py` to figure out what filenames you need; everything is on Scihub in the directory `/data/oceans_input/raw_input_data/bedmap2/bedmap2_bin/`. If your domain goes north of 60°S, you will also need the GEBCO bathymetry file in this directory (`/data/oceans_input/raw_input_data/GEBCO/GEBCO_2014_2D.nc` on Scihub).
- `nc_outfile`: filename for a NetCDF file that the script will create and save the temporary fields in.
- `bed_file`: optional keyword argument. If you have an alternate BEDMAP2 bathymetry file (say with some updates by Sebastian Rosier in the Filchner), set `bed_file` to its file path. Otherwise, leave it out and the code will use the original BEDMAP2 file within `topo_dir`.
For example,

```python
interp_bedmap2(lon, lat, '../topo_data/', 'WSB.nc', bed_file='bedmap2_bed_seb.flt')
```
You might want to edit the land mask of your new domain. For example, Weddell Sea domains will typically fill everything west of the Antarctic Peninsula with land, and possibly extend the peninsula northwards, so the western boundary of the domain is closed. It can also be helpful to block out little islands and single-cell channels near the peninsula.
If you need to do something like this, take a look at the function `edit_mask` in `make_domain.py`. Create a new value for `key` (an identifier string for your domain, such as `'WSB'`) and code up a new case for the middle section of the function. It will help to look at the NetCDF file output by `interp_bedmap2` in something like ferret or ncview, and figure out exactly which sections you need to fill with land.
There are a few utility functions you can use, all stored in `utils.py`:

- `mask_box`: fill a box, defined by lat and lon boundaries, with land
- `mask_iceshelf_box`: only fill ice shelf points in the given box with land
- `mask_above_line`: fill everything north of a line segment, defined by two points, with land
- `mask_below_line`: fill everything south of the line segment with land
When you're ready, run the script in ipython:

```python
edit_mask(nc_infile, nc_outfile, key=key)
```

For example,

```python
edit_mask('WSB.nc', 'WSB_edited.nc', key='WSB')
```
When you ran `interp_bedmap2`, it will have printed the deepest bathymetry in your domain. If you ran `edit_mask`, it will have also printed this value, in case it changed. Make sure that the sum of your vertical layer thicknesses is slightly greater than this value. Put them in a plain text file, one value per line (positive, in metres), with no commas.
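A quick way to sanity-check the dz file before moving on. This is a standalone sketch, not part of `mitgcm_python`:

```python
def check_dz_file(dz_path, max_depth):
    # Read one thickness per line (positive, in metres) and make sure the
    # column is deep enough to cover the deepest bathymetry
    with open(dz_path) as f:
        dz = [float(line) for line in f if line.strip()]
    total = sum(dz)
    if total <= max_depth:
        raise ValueError(f'dz sums to {total} m, shallower than {max_depth} m')
    print(f'{len(dz)} layers covering {total} m')
```

For example, `check_dz_file('dz_vals', 4500)` if 4500 m was the deepest point printed.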
Now we need to address two problems which can arise from ice shelves and z-levels. First, it is sometimes necessary to dig (artificially deepen) the bathymetry in regions with very thin water columns, so that all ocean cells are properly connected in accordance with the BEDMAP2 mask, and the grounding line locations are preserved. Otherwise, subglacial lakes can form with sections of the domain being artificially disconnected. Second, very thin ice shelf drafts should be removed, otherwise this triggers a nasty masking bug in MITgcm (as of July 2018 it is about to be fixed in the master code). To complete both of these steps, run:
```python
remove_grid_problems(nc_infile, nc_outfile, dz_file, hFacMin=hFacMin, hFacMinDr=hFacMinDr)
```

where

- `nc_infile`: NetCDF grid file after you edited the mask
- `dz_file`: plain text file with your vertical layer thicknesses
- `hFacMin` and `hFacMinDr`: agree with the values in your `input/data` namelist to MITgcm

For example,

```python
remove_grid_problems('WSB_edited.nc', 'WSB_final.nc', 'dz_vals', hFacMin=0.1, hFacMinDr=20.)
```
Now it's time to put the bathymetry and ice shelf draft fields in the binary format for reading into MITgcm:

```python
write_topo_files(nc_file, bathy_file, draft_file)
```

For example,

```python
write_topo_files('WSB_final.nc', 'bathy_WSB', 'draft_WSB')
```
Section `&PARM01`:

- Make sure that `readBinaryPrec=64`, because the three binary files we just wrote are 64-bit not 32-bit.
- If you're also using xmitgcm to convert your binary output to NetCDF, temporarily change `debugLevel` to 1 or higher so you can generate a new `available_diagnostics.log` reflecting the new number of vertical layers.
Section `&PARM04`:

- Update `xgOrigin`, `ygOrigin`, and `dxSpacing` based on the values printed by `latlon_points`.
- Set `delYfile` to the file output by `latlon_points` (e.g. `'delY_WSB'`).
- `delR` should be a list of your vertical layer thicknesses.
Section `&PARM05`:

- Set `bathyFile` to the bathymetry file output by `write_topo_files` (e.g. `'bathy_WSB'`).
You'll probably also have to update the number of vertical layers in a few variables which have the form (number of layers)*constant. For my configuration I needed to change `tRef` and `sRef` (both in section `&PARM01`), but your configuration might have other such variables, or maybe even none.
Set `SHELFICEtopoFile` to the ice shelf draft file output by `write_topo_files` (e.g. `'draft_WSB'`).

Set `SHELFICEloadAnomalyFile` to some temporary filename (e.g. `'pload_tmp'`) - more on this later.

If `useMNC=.true.`, set it to `.false.`. False is the default, so if there is no line about MNC at all, you're fine.
In section `&OBCS_PARM01`, update the values of `OB_Iwest`, `OB_Ieast`, `OB_Jsouth`, and `OB_Jnorth` as needed to reflect the new values of `Ny` (for `OB_Iwest` and `OB_Ieast`) and `Nx` (for `OB_Jsouth` and `OB_Jnorth`). They will each be in the form (`Nx` or `Ny`)*(constant such as 1 or -1).
Update the values of `Nz` (number of vertical layers), `sNx`, `sNy` (tile dimensions) and `nPx`, `nPy` (number of tiles in each dimension) such that `sNx*nPx=Nx`, `sNy*nPy=Ny`, and `sNx`, `sNy` are both around 15-30. Later, you can do proper benchmarking to figure out what is the optimum tile size, but it will likely be around this range.

Your total number of processors is `nPx*nPy`; make sure your `aprun` commands and PBS job scripts reflect this.
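To enumerate the valid splits, you can search for divisors of `Nx` and `Ny` in the efficient range. A hypothetical helper, not part of `mitgcm_python`:

```python
def tile_options(n, lo=15, hi=30):
    # All (tile size, number of tiles) pairs with the tile size in [lo, hi]
    return [(s, n // s) for s in range(lo, hi + 1) if n % s == 0]

print(tile_options(460))  # [(20, 23), (23, 20)] -> e.g. sNx=20, nPx=23
print(tile_options(374))  # [(17, 22), (22, 17)] -> e.g. sNy=22, nPy=17
```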
Copy the files pointed to by `delYfile`, `bathyFile`, and `SHELFICEtopoFile` into the `input/` directory. Within this directory, make a copy of `SHELFICEtopoFile` to the temporary filename pointed to by `SHELFICEloadAnomalyFile`.

Finally, recompile the model to apply the changes in `SIZE.h`.
Run the model just long enough to output binary grid files, which have useful derived variables such as partial cell fractions. We will need these files later, for things like interpolating initial conditions. Note that before these grid files are written, each package within MITgcm does some consistency checks. We need to fool the model to get past this stage. This is the purpose of the temporary `SHELFICEloadAnomalyFile`: it's just so MITgcm doesn't die because no such file exists or has the wrong dimensions. You might also have to switch some packages off depending on your configuration and MITgcm version.
The model will die at some point during initialisation, probably when it can't find the initial conditions. But if you've fooled it enough, it will have written the following files into the `run/` directory:

```
Depth.data DRF.meta hFacC.data hFacW.meta RF.data XG.meta
Depth.meta DXG.data hFacC.meta RAC.data RF.meta YC.data
DRC.data DXG.meta hFacS.data RAC.meta XC.data YC.meta
DRC.meta DYG.data hFacS.meta RC.data XC.meta YG.data
DRF.data DYG.meta hFacW.data RC.meta XG.data YG.meta
```

Copy these files into their own special directory outside of `run/` (I like to use `grid/` inside the case directory, i.e. on the same level as `run/`).
Make sure the digging worked, according to the partial cells generated by MITgcm. In ipython,

```python
check_final_grid(grid_path)
```

where `grid_path` is the path to the directory with the binary grid files you created in the previous step. For example,

```python
check_final_grid('../grid/')
```

If all is well, the code will print a message confirming that everything worked.
Assuming this is the case, you're done making the grid! Now undo any weird changes you made to the model:

- Delete the temporary `SHELFICEloadAnomalyFile` (to make sure you don't accidentally use it later!)
- If you're using xmitgcm, copy `available_diagnostics.log` from `run/` into `input/`, and in `input/data` put `debugLevel` back to its original value.
This section interpolates initial conditions for temperature, salinity, sea ice area, and sea ice thickness from the Southern Ocean State Estimate (SOSE). This may or may not be the dataset you want for your configuration. If not, hopefully this code will be a good starting point which you can modify to use with another product. Here we use the January climatology for all fields, as it's probably a safe assumption that your simulations start on 1 January.
SOSE is quite good in the open Southern Ocean, but can have some artifacts on the Antarctic continental shelf due to lack of data, and in particular it has no ice shelf cavities. The strategy used by this code is to set constant values for temperature and salinity in ice shelf cavities. Good values, depending on your region, might be -1.9°C (warm enough to cause some ice shelf melting so the cavities flush out quickly, but not ridiculously warm) and 34.4 psu (light enough to prevent any pooling of dense water).
MITgcm also needs the initial pressure loading anomaly in ice shelf cavities. This is equal to the weight of water displaced by the ice shelf, with respect to the reference density. This code assumes that the water displaced would have the same constant temperature and salinity as the initial conditions in the cavities. Note that this pressure loading is not the same as the actual weight of the ice shelf, because the observed ice shelf mass is in dynamic equilibrium with the real conditions under the cavities (i.e. non-constant temperature and salinity which are probably colder and denser than the values you set, and nonzero velocities). If you try to use ice shelf mass for the pressure loading anomaly in a simulation which is not spun up, it will probably blow up.
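The idea behind the calculation can be sketched as follows. This toy version uses a linear equation of state purely for illustration (the real `calc_load_anomaly` uses the MDJWF density from the MITgcm python utilities), and all parameter values here are illustrative assumptions:

```python
from itertools import accumulate

g = 9.81            # gravity (m/s^2)
rhoConst = 1035.0   # reference density (kg/m^3), matching input/data
tAlpha, sBeta = 2e-4, 7.4e-4  # linear-EOS coefficients (illustrative only)
t0, s0 = -1.9, 34.4           # constant cavity temperature and salinity

def pload_anomaly(draft, dz):
    # Integrate the density anomaly of the water displaced by the ice shelf,
    # from the surface down to the ice shelf draft (m, positive down)
    rho = rhoConst * (1 - tAlpha * t0 + sBeta * (s0 - 35.0))
    z_bottom = list(accumulate(dz))  # depth of the bottom of each layer
    pload = 0.0
    for zb, d in zip(z_bottom, dz):
        if zb - d >= draft:          # this layer is entirely below the draft
            break
        pload += g * (rho - rhoConst) * (min(zb, draft) - (zb - d))
    return pload                     # in Pa
```

With these (lighter-than-reference) cavity values the anomaly comes out negative, and it scales linearly with the draft, which is the behaviour you'd expect from a depth-independent density anomaly.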
In this section, whenever you open ipython, load the correct functions like this:

```python
from mitgcm_python.ics_obcs import *
```
Get a hold of the SOSE monthly climatologies for temperature (`THETA`), salinity (`SALT`), sea ice area (`SIarea`), and sea ice thickness (`SIheff`). I have pre-computed these from the full monthly fields (using the function `make_sose_climatology` in `ics_obcs.py`, in case you want to add a field) and saved them on Scihub, in the directory `/data/oceans_input/raw_input_data/SOSE_monthly_climatology`. Just copy the entire directory to whichever server or machine you're using.

If you recompute the climatologies yourself for whatever reason, make sure the filenames match (`THETA_climatology.data`, etc.) or else change the filename patterns in the function `sose_ics` (in `ics_obcs.py`).

Also make sure you have the subdirectory `grid/` which includes the binary grid files for SOSE. (The Matlab grid files in the official SOSE distribution can't be read into python.)
In ipython,

```python
sose_ics(grid_file, sose_dir, output_dir, nc_out=nc_out, constant_t=constant_t, constant_s=constant_s, split=split)
```

where

- `grid_file`: path to the NetCDF grid file you created for your new domain
- `sose_dir`: directory containing your SOSE monthly climatologies (and `grid/` subdirectory)
- `output_dir`: directory you would like the MITgcm initial conditions files (binary) to be written into

with optional keyword arguments:

- `nc_out`: path to a NetCDF file which the initial conditions will also be written into, so you can check them quickly in Ferret or ncview and make sure everything looks okay.
- `constant_t`, `constant_s`: temperature (°C) and salinity (psu) to fill ice shelf cavities with. Defaults are `-1.9` and `34.4`.
- `split`: `180` or `0`, indicating the longitude you would like to split the SOSE grid at. If your domain contains 0°E, set `split=180` (default). If your domain contains 180°E, set `split=0`. If you have a circumpolar domain, try either and hope for the best - it's possible you will have points in the gap between SOSE's periodic boundary and you will have to write a patch to wrap the SOSE data around that boundary.

For example,

```python
sose_ics('../mitgcm/cases/WSB_001/input/grid.glob.nc', '../SOSE_monthly_climatology/', '../mitgcm/cases/WSB_001/input/', nc_out='sose_ics.nc', constant_t=-1.9, constant_s=34.4, split=180)
```
Take a look at the NetCDF file to make sure nothing crazy happened.
```python
calc_load_anomaly(grid_file, mitgcm_code_path, out_file, constant_t=constant_t, constant_s=constant_s, rhoConst=rhoConst)
```

where

- `mitgcm_code_path`: path to your copy of the MITgcm source code distribution (so we can use the official python code for the MDJWF density calculation)
- `out_file`: path to your desired binary output file
- `rhoConst`: matches the value in `input/data`

For example,

```python
calc_load_anomaly('../mitgcm/cases/WSB_001/input/grid.glob.nc', '../mitgcm/MITgcm/', '../mitgcm/cases/WSB_001/input/pload_WSB', constant_t=-1.9, constant_s=34.4, rhoConst=1035)
```
The SOSE initial conditions files will be called `THETA_SOSE.ini`, etc. Rename these if you like, and move them into your `input/` folder if they weren't written directly in there. Also move the pressure loading anomaly file to `input/`.
Section `&PARM05`:

- Set `hydrogThetaFile='THETA_SOSE.ini'` and `hydrogSaltFile='SALT_SOSE.ini'` (or whatever you renamed them to).

Section `&PARM01`:

- Make sure that `eosType='MDJWF'`.

Set `AreaFile='SIarea_SOSE.ini'` and `HeffFile='SIheff_SOSE.ini'`.

Set `SHELFICEloadAnomalyFile` to the filename output by `calc_load_anomaly`.
This section interpolates open boundary conditions (OBCS) from the SOSE monthly climatology, for the following variables: temperature, salinity, horizontal ocean velocities, sea surface height, sea ice area and thickness, sea ice velocities. As in the last section, if this isn't the right dataset for your needs, the code will hopefully be a useful starting point.
Your domain can have up to four open boundaries (north, south, east, west); in many cases it will just have one or two, as the other boundaries are closed (completely land). Make sure that none of the open boundaries include ice shelf cavities - you might have to sacrifice a small ice shelf cavity and fill it with land during the `edit_mask` step of grid generation (the WSB grid provides an example of how to do this).
As for the initial conditions, get a hold of the SOSE monthly climatologies. You will need a few extra SOSE variables this time: `UVEL`, `VVEL`, `ETAN`, `SIuice`, `SIvice`.
In ipython,

```python
from mitgcm_python.ics_obcs import *
```
Now for each open boundary, run

```python
sose_obcs(location, grid_file, sose_dir, output_dir, nc_out=nc_out, prec=prec)
```

where

- `location`: `'N'`, `'S'`, `'E'`, or `'W'`
- `grid_file`: path to the NetCDF grid file you created for your new domain
- `sose_dir`: directory containing your SOSE monthly climatologies (and `grid/` subdirectory)
- `output_dir`: directory you would like the MITgcm OBCS files (binary) to be written into

with optional keyword arguments:

- `nc_out`: path to a NetCDF file which the boundary conditions will also be written into, so you can check them quickly in Ferret or ncview and make sure everything looks okay.
- `prec`: `32` or `64`, matching the value of `exf_iprec_obcs` in `input/data.exf`. If this isn't set, it defaults to the value of `exf_iprec` (which subsequently defaults to `32`). If you're not using EXF in your configuration, match the value of `readBinaryPrec` in `input/data` instead.

For example, the WSB grid has open boundaries to the north and east:

```python
sose_obcs('N', '../mitgcm/cases/WSB_001/input/grid.glob.nc', '../SOSE_monthly_climatology/', '../mitgcm/cases/WSB_001/input/', nc_out='obcs_n.nc', prec=32)
sose_obcs('E', '../mitgcm/cases/WSB_001/input/grid.glob.nc', '../SOSE_monthly_climatology/', '../mitgcm/cases/WSB_001/input/', nc_out='obcs_e.nc', prec=32)
```
Take a look at the NetCDF files to make sure everything looks okay.
The newly generated OBCS files will be named, for example, `THETA_SOSE.OBCS_E` for temperature at the eastern boundary. Move them to `input/` if they're not there already.
Make sure the line `obcs` exists.

Switch on the `ALLOW_OBCS_*` options required for your configuration. For example,

```
#define ALLOW_OBCS_NORTH
#define ALLOW_OBCS_EAST
```

Switch on `ALLOW_OBCS_PRESCRIBE` so the files can be read.

You will likely want a sponge for stability, so switch on `ALLOW_OBCS_SPONGE` to compile that code.

It's also a good idea to balance transport in and out of the domain, by automatically correcting the velocities at the boundaries every timestep. Otherwise, your sea level can drift substantially. To compile that code, switch on `ALLOW_OBCS_BALANCE`.
Sea ice OBCS can cause blowups due to disagreements in velocity leading to sea ice convergence. To prevent this, switch on `OBCS_UVICE_OLD` plus one of the five `OBCS_SEAICE_*` options listed below it. Experiment with them to see which is the most realistic in your domain.

If you actually had to make any of these changes to `code/packages.conf` or `code/OBCS_OPTIONS.h` (i.e. they weren't already there), you'll need to recompile the code.
These changes are made to section `&OBCS_PARM01` unless otherwise specified.

If you have an open eastern boundary, set `OB_Ieast=Ny*-1` where `Ny` matches the output of `latlon_points` during grid generation. Similarly, set `OB_Iwest=Ny*1`, `OB_Jnorth=Nx*-1`, and `OB_Jsouth=Nx*1` as needed.
Again for an open eastern boundary, set

```
OBEtFile='THETA_SOSE.OBCS_E'
OBEsFile='SALT_SOSE.OBCS_E'
OBEuFile='UVEL_SOSE.OBCS_E'
OBEvFile='VVEL_SOSE.OBCS_E'
OBEetaFile='ETAN_SOSE.OBCS_E'
OBEaFile='SIarea_SOSE.OBCS_E'
OBEhFile='SIheff_SOSE.OBCS_E'
OBEuiceFile='SIuice_SOSE.OBCS_E'
OBEviceFile='SIvice_SOSE.OBCS_E'
```

and similarly for the other directions as needed. If you have a linear free surface (`nonlinFreeSurf=0` in `input/data`) then skip the `OBEetaFile` line.
Set `useOBCSprescribe=.true.`.

If you do want a sponge, set `useOBCSsponge=.true.` and modify the sponge parameters under `&OBCS_PARM03` as needed. Note that `Urelaxobcs*` and `Vrelaxobcs*` correspond to east/west and north/south boundaries respectively, not to velocity variables! Set the relaxation timescale in seconds for the inside of the sponge (`*relaxobcsinner`) and the outside of the sponge (`*relaxobcsbound`), as well as the `spongeThickness` in grid cells. I used 1 year, 1 week, and 8 grid cells (2° latitude in my domain) respectively.

If you do want to balance transport in and out of the domain, set `useOBCSbalance=.true.`.
All changes are made to the section `&EXF_NML_OBCS`.

For an open eastern boundary, set `obcsEstartdate1` and `siobEstartdate1` to the date code of the beginning of your simulation (e.g. `19790101`), `obcsEstartdate2` and `siobEstartdate2` to the time code of the beginning of your simulation (probably just `0` for midnight), and `obcsEperiod` and `siobEperiod` to `-12.` (this indicates a monthly climatology). Similarly for the other directions as needed.

Make sure `useOBCS=.true.`.
This section interpolates freshwater fluxes from iceberg melting onto the model grid, so they can be used as the "runoff" field in the EXF atmospheric forcing package. Here we use output from a recent NEMO simulation with a coupled iceberg model (see Storkey et al. 2018), in a monthly climatology. Since this NEMO grid isn't regular in lat-lon space, we have to interpolate it ourselves (if it were regular, EXF could interpolate for us during the simulation).
This step is entirely optional, and depending on your configuration you might be quite happy with another product, or with no runoff at all. Assuming you want to go ahead with it, here are the steps:
All you need to do is check with Pierre Mathiot that he’s okay with you using it, and then I can give you the files.
In ipython,

```python
from mitgcm_python.forcing import *
iceberg_meltwater(grid_file, input_dir, output_file, nc_out=nc_out, prec=prec)
```

where

- `grid_file`: path to your MITgcm NetCDF grid file
- `input_dir`: path to the directory containing the NEMO output
- `output_file`: desired path to the binary file to write

with optional keyword arguments:

- `nc_out`: path to a NetCDF file to also write the interpolated fields into for checking
- `prec`: matches `exf_iprec` in your `input/data.exf` namelist (default `32`).

For example,

```python
iceberg_meltwater('../mitgcm/cases/WSB_001/input/grid.glob.nc', 'icebergs_NEMO_G07/', '../mitgcm_cases/WSB_001/input/iceberg_melt_NEMO', nc_out='icebergs.nc', prec=32)
```
Take a look at the NetCDF file to make sure nothing weird happened.
Copy the `output_file` you just created to `input/`, if it wasn't written directly in there.
Make sure `ALLOW_RUNOFF` is defined. If you had to switch it on, you'll also need to recompile the code.
Section `&EXF_NML_02`:

- Set `runoffstartdate1`, `runoffstartdate2`, and `runoffperiod` as you did for the monthly climatology OBCS before (e.g. `19790101`, `0`, and `-12.` respectively).
- Set `runofffile` to the name of the binary file you just created.
Section `&EXF_NML_03`:

- Set `exf_inscal_runoff=1.E-3` to convert from kg/m2/s to m/s.
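As a sanity check on that scale factor: dividing a mass flux in kg/m2/s by an assumed freshwater density of 1000 kg/m3 gives a volume flux in m/s, which is exactly a multiplication by 1.E-3. The flux value below is made up for illustration:

```python
rho_fresh = 1000.0            # assumed freshwater density, kg/m^3
flux_kg = 2.5e-5              # example iceberg melt flux, kg/m^2/s
flux_m = flux_kg / rho_fresh  # m/s; same as multiplying by exf_inscal_runoff=1.E-3
print(flux_m)  # 2.5e-08
```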
Section `&EXF_NML_04`:

- Set `runoff_interpMethod=0` so the model doesn't try to interpolate the input.