
Creating a new MITgcm domain


This page describes how to create a new MITgcm domain and all its components: the model grid, initial conditions, open boundary conditions, surface salinity restoring, and iceberg meltwater forcing.

The code creates polar spherical grids (i.e. regular in lat-lon space), with constant longitude resolution and latitude spacing scaled by the cosine of latitude so grid cells stay roughly square. It is designed for realistic Antarctic domains which include ice shelf cavities and don't do anything too weird with open boundary conditions. It has not been tested for domains with a periodic boundary (e.g. circumpolar domains).

The code is quite efficient; for a quarter-degree Weddell Sea domain (460 x 374 x 66 points), it took me about 30 minutes to run everything from start to finish. However, it might take you longer to make decisions about what you want, get a hold of datasets, etc.

Some instructions (eg file paths to datasets) are only relevant to BAS staff. Some instructions also assume you are running MITgcm with the workflow used by most people at BAS (e.g. all files in input/ are symlinked into run/).

Install the code

To build the domain, you will need to clone the mitgcm_python repository and ideally be able to access it from any directory on your machine.

First, choose a directory to install the code in. I'll use $HOME/python_code as an example here. On the command line, cd into this directory.

To download the code,

git clone https://github.com/knaughten/mitgcm_python.git

This will make a directory called mitgcm_python. Note that this repository relies on some python tools distributed with MITgcm, which will need to be in your PYTHONPATH. It is also useful to add the mitgcm_python code itself to your PYTHONPATH so you can access it from any directory. So add this line to the bottom of your ~/.bashrc:

export PYTHONPATH=$PYTHONPATH:$HOME/python_code:$ROOTDIR/utils/python/MITgcmutils

where $ROOTDIR is the path to your copy of the MITgcm source code distribution.

Then either log out and log in again, or type source ~/.bashrc. To make sure it worked, type python or even better ipython, and try these commands:

import mitgcm_python
import MITgcmutils

If there are no error messages, everything imported successfully.

Build the grid

For this section, you only need one module from the repository. Every time you launch ipython (or python) during this section, type this line:

from mitgcm_python.make_domain import *

Generate lat-lon points

In ipython,

lon, lat = latlon_points(west_boundary, east_boundary, south_boundary, north_boundary, resolution, delY_filename)

For example, a grid from 85°W to 30°E, 84.2°S to 61°S, with 0.25° resolution:

lon, lat = latlon_points(-85, 30, -84.2, -61, 0.25, 'delY_WSB')

The script will print the values of Nx, Ny, and their factors. You want Nx and Ny to have some factors in the 15-30 range, as these tend to be the most efficient tile sizes. If Nx and/or Ny don't have these factors, tweak your boundaries and try again until you're happy. Here is a nice looking result:

Nx = 460 which has the factors [1, 2, 4, 5, 10, 20, 23, 46, 92, 115, 230, 460]
Ny = 374 which has the factors [1, 2, 11, 17, 22, 34, 187, 374]

Make a note of the values of xgOrigin, ygOrigin, and dxSpacing, which are also printed by the script, as you will need them later. Note that these variables come from the input arguments to latlon_points: xgOrigin is your western boundary mod 360 so it's in the range 0-360, ygOrigin is your southern boundary, and dxSpacing is your resolution.
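For intuition, here is a minimal numpy sketch of the cosine scaling described above (illustrative only; the real latlon_points also handles rounding, writes the delY file, and prints Nx, Ny, and their factors):

import numpy as np

def cos_scaled_lat(ymin, ymax, dlon):
    # Latitude cell edges with dlat = dlon*cos(lat), so cells stay
    # roughly square in physical distance
    edges = [ymin]
    while edges[-1] < ymax:
        edges.append(edges[-1] + dlon*np.cos(np.deg2rad(edges[-1])))
    return np.array(edges)

print(len(cos_scaled_lat(-84.2, -61, 0.25)) - 1)  # about 374 rows, matching Ny above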

Interpolate BEDMAP2 topography

Without exiting ipython (so the variables lon and lat are still in memory),

interp_bedmap2(lon, lat, topo_dir, nc_outfile, bed_file=bed_file)

where

  • topo_dir: path to directory containing the necessary BEDMAP2 binary files. Look at the function definition in make_domain.py to figure out what filenames you need; everything is on Scihub in the directory /data/oceans_input/raw_input_data/bedmap2/bedmap2_bin/. If your domain goes north of 60°S, you will also need the GEBCO bathymetry file in this directory (/data/oceans_input/raw_input_data/GEBCO/GEBCO_2014_2D.nc on Scihub).
  • nc_outfile: filename for a NetCDF file that the script will create and save the temporary fields in.
  • bed_file: optional keyword argument. If you have an alternate BEDMAP2 bathymetry file (say with some updates by Sebastian Rosier in the Filchner), set bed_file to its file path. Otherwise, leave it out and the code will use the original BEDMAP2 file within topo_dir.

For example,

interp_bedmap2(lon, lat, '../topo_data/', 'WSB.nc', bed_file='bedmap2_bed_seb.flt')

Edit land mask

You might want to edit the land mask of your new domain. For example, Weddell Sea domains will typically fill everything west of the Antarctic Peninsula with land, and possibly extend the peninsula northwards, so the western boundary of the domain is closed. It can also be helpful to block out little islands and single-cell channels near the peninsula.

If you need to do something like this, take a look at the function edit_mask in make_domain.py. Create a new value for key (an identifier string for your domain, such as 'WSB') and code up a new case for the middle section of the function. It will help to look at the NetCDF file output by interp_bedmap2 in something like ferret or ncview, and figure out exactly which sections you need to fill with land.

There are a few utility functions you can use, all stored in utils.py (a conceptual sketch of what they do follows the list):

  • mask_box: fill a box, defined by lat and lon boundaries, with land
  • mask_iceshelf_box: only fill ice shelf points in the given box with land
  • mask_above_line: fill everything north of a line segment, defined by two points, with land
  • mask_below_line: fill everything south with land
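Conceptually, these helpers just zero out the bathymetry and ice shelf draft inside some region. Here is a hypothetical numpy sketch of a mask_box-style fill; check utils.py for the real signatures before relying on them:

import numpy as np

def fill_box_with_land(bathy, draft, lon, lat, xmin, xmax, ymin, ymax):
    # Set bathymetry and ice shelf draft to zero (land) inside a
    # lat-lon box; lon and lat are 1D axes, bathy and draft are 2D
    index = np.ix_((lat >= ymin) & (lat <= ymax),
                   (lon >= xmin) & (lon <= xmax))
    bathy[index] = 0
    draft[index] = 0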

When you're ready, run the script in ipython:

edit_mask(nc_infile, nc_outfile, key=key)

For example,

edit_mask('WSB.nc', 'WSB_edited.nc', key='WSB')

Choose vertical layer thicknesses

When you ran interp_bedmap2, it will have printed the deepest bathymetry in your domain. If you ran edit_mask, it will have also printed this value, in case it changed. Make sure that the sum of your vertical layer thicknesses is slightly greater than this value. Put them in a plain text file, one value per line (positive, in metres), with no commas.
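If you want to generate the thicknesses programmatically, here is a minimal sketch (the stretching scheme is just an example, not a recommendation):

import numpy as np

dz = 10*1.05**np.arange(66)   # hypothetical layers thickening with depth
dz = np.minimum(dz, 100)      # cap the thickness at 100 m
print(dz.sum())               # should slightly exceed the deepest bathymetry
np.savetxt('dz_vals', dz, fmt='%.2f')  # one positive value per line, no commas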

Filling, digging, and zapping

Now we need to address three problems which can arise with z-levels:

  • Filling: isolated bottom cells, which have no horizontal neighbours (i.e. bathymetry points in a deeper z-layer than all of their neighbours), should be raised so they have at least one horizontal neighbour. Otherwise dense water can pool there due to lack of circulation.
  • Digging: it is sometimes necessary to artificially deepen the bathymetry in regions with very thin water columns, so that all ocean cells are properly connected in accordance with the BEDMAP2 mask, and the grounding line locations are preserved. Otherwise, subglacial lakes can form, with sections of the domain artificially disconnected.
  • Zapping: very thin ice shelf drafts should be removed, otherwise they trigger a nasty masking bug in MITgcm (as of July 2018 it is about to be fixed in the master code).

To complete all three of these steps, run:

remove_grid_problems(nc_infile, nc_outfile, dz_file, hFacMin=hFacMin, hFacMinDr=hFacMinDr)

where

  • nc_infile: NetCDF grid file after you edited the mask
  • nc_outfile: filename for a new NetCDF file to save the fixed fields in
  • dz_file: plain text file with your vertical layer thicknesses
  • hFacMin and hFacMinDr: must agree with the values in your input/data namelist to MITgcm

For example,

remove_grid_problems('WSB_edited.nc', 'WSB_final.nc', 'dz_vals', hFacMin=0.1, hFacMinDr=20.)
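For intuition, here is a conceptual numpy sketch of the "filling" test, assuming bathy is a 2D array of negative depths (remove_grid_problems implements the real z-level-aware version, plus the digging and zapping):

import numpy as np

def isolated_bottom_cells(bathy):
    # Flag cells deeper (more negative) than all four horizontal neighbours
    b = np.pad(bathy, 1, mode='edge')
    neighbour_min = np.minimum.reduce([b[:-2, 1:-1], b[2:, 1:-1],
                                       b[1:-1, :-2], b[1:-1, 2:]])
    return bathy < neighbour_min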

Write to binary

Now it's time to put the bathymetry and ice shelf draft fields in the binary format for reading into MITgcm:

write_topo_files(nc_file, bathy_file, draft_file)

For example,

write_topo_files('WSB_final.nc', 'bathy_WSB', 'draft_WSB')

Edit MITgcm configuration files

input/data

Section &PARM01:

  • Make sure that readBinaryPrec=64, because the three binary files we just wrote are 64-bit not 32-bit.
  • If you're also using xmitgcm to convert your binary output to NetCDF, temporarily change debugLevel to 1 or higher so you can generate a new available_diagnostics.log reflecting the new number of vertical layers. Delete the file input/available_diagnostics.log so the symlink doesn't mess things up.

Section &PARM04:

  • Update xgOrigin, ygOrigin, and dxSpacing based on the values printed by latlon_points.
  • Set delYfile to the file output by latlon_points (eg 'delY_WSB').
  • delR should be a list of your vertical layer thicknesses.

Section &PARM05:

  • Set bathyFile to the bathymetry file output by write_topo_files (eg 'bathy_WSB').

You'll probably also have to update the number of vertical layers in a few variables which have the form (number of layers)*constant. For my configuration I needed to change tRef and sRef (both in section &PARM01), but your configuration might have other such variables, or maybe even none.

input/data.shelfice

Set SHELFICEtopoFile to the ice shelf draft file output by write_topo_files (eg 'draft_WSB'). Set SHELFICEloadAnomalyFile to some temporary filename (eg 'pload_tmp') - more on this later.

input/data.pkg

If useMNC=.true., set it to .false.. False is the default so if there is no line about MNC at all, you're fine.

input/data.obcs

In section &OBCS_PARM01, update the values of OB_Iwest, OB_Ieast, OB_Jsouth, and OB_Jnorth as needed to reflect the new values of Ny (for OB_Iwest and OB_Ieast) and Nx (for OB_Jsouth and OB_Jnorth). They will each be in the form (Nx or Ny)*(constant such as 1 or -1).

code/SIZE.h

Update the values of Nr (number of vertical layers), sNx, sNy (tile dimensions) and nPx, nPy (number of tiles in each dimension) such that sNx*nPx=Nx, sNy*nPy=Ny, and sNx, sNy are both around 15-30. Later, you can do proper benchmarking to figure out what is the optimum tile size, but it will likely be around this range.

Your total number of processors is nPx*nPy; make sure your aprun commands and PBS job scripts reflect this.
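If you want to enumerate the options quickly, here is a throwaway helper:

# List the (sNx, nPx) factor pairs for a given Nx, flagging tile
# sizes in the 15-30 sweet spot mentioned above
def tile_options(n):
    return [(s, n//s) for s in range(1, n+1) if n % s == 0]

print([(s, p) for s, p in tile_options(460) if 15 <= s <= 30])
# [(20, 23), (23, 20)] for Nx=460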

Other changes

Copy the files pointed to by delYfile, bathyFile, and SHELFICEtopoFile into the input/ directory. Within this directory, make a copy of SHELFICEtopoFile to the temporary filename pointed to by SHELFICEloadAnomalyFile.

Finally, recompile the model to apply the changes in SIZE.h.

Make MITgcm grid files

Run the model just long enough to output binary grid files, which have useful derived variables such as partial cell fractions. We will need these files later, for things like interpolating initial conditions. Note that before these grid files are written, each package within MITgcm does some consistency checks. We need to fool the model to get past this stage. This is the purpose of the temporary SHELFICEloadAnomalyFile, just so MITgcm doesn't die because no such file exists or has the wrong dimensions. You might also have to switch some packages off depending on your configuration and MITgcm version. I had to switch off RBCS.

The model will die at some point during initialisation, probably when it can't find the initial conditions. But if you've fooled it enough, it will have written the following files into the run/ directory:

Depth.data  DRF.meta  hFacC.data  hFacW.meta  RF.data  XG.meta
Depth.meta  DXG.data  hFacC.meta  RAC.data    RF.meta  YC.data
DRC.data    DXG.meta  hFacS.data  RAC.meta    XC.data  YC.meta
DRC.meta    DYG.data  hFacS.meta  RC.data     XC.meta  YG.data
DRF.data    DYG.meta  hFacW.data  RC.meta     XG.data  YG.meta

Copy these files into their own special directory outside of run/ (I like to use grid/ inside the case directory, i.e. on the same level as run/).

Check the digging and filling

Make sure the digging and filling worked, according to the partial cells generated by MITgcm. In ipython,

check_final_grid(grid_path)

where grid_path is the path to the directory with the binary grid files you created in the previous step. For example,

check_final_grid('../grid/')

If everything looks good, the code will print a message saying so.

Assuming this is the case, you're done making the grid! Now undo any weird changes you made to the model:

  • Delete the temporary SHELFICEloadAnomalyFile (to make sure you don't accidentally use it later!)
  • Switch on any packages you had to switch off (such as RBCS)
  • If you're using xmitgcm, copy available_diagnostics.log from run/ into input/, and in input/data put debugLevel back to its original value.

Interpolate initial conditions

This section interpolates initial conditions for temperature, salinity, sea ice area, and sea ice thickness from SOSE, the Southern Ocean State Estimate. This may or may not be the dataset you want for your configuration. If not, hopefully this code will be a good starting point which you can modify to use with another product. Here we use the January climatology for all fields, as it's probably a safe assumption that your simulations start on 1 January.

SOSE is quite good in the open Southern Ocean, but can have some artifacts on the Antarctic continental shelf due to lack of data, and in particular it has no ice shelf cavities. The strategy used by this code is to set constant values for temperature and salinity in ice shelf cavities. Good values, depending on your region, might be -1.9°C (warm enough to cause some ice shelf melting so the cavities flush out quickly, but not ridiculously warm) and 34.4 psu (light enough to prevent any pooling of dense water).

MITgcm also needs the initial pressure loading anomaly in ice shelf cavities. This is equal to the weight of water displaced by the ice shelf, with respect to the reference density. This code assumes that the water displaced would have the same constant temperature and salinity as the initial conditions in the cavities. Note that this pressure loading is not the same as the actual weight of the ice shelf, because the observed ice shelf mass is in dynamic equilibrium with the real conditions under the cavities (i.e. non-constant temperature and salinity which are probably colder and denser than the values you set, and nonzero velocities). If you try to use ice shelf mass for the pressure loading anomaly in a simulation which is not spun up, it will probably blow up.
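To make the idea concrete, here is a schematic of the calculation under the constant temperature and salinity assumption, with the MDJWF equation of state replaced by a single hypothetical in-situ density (the real calc_load_anomaly integrates the equation of state over the displaced column):

import numpy as np

g = 9.81               # m/s^2
rhoConst = 1035.       # reference density, matching input/data
rho_cavity = 1027.5    # hypothetical in-situ density at the constant T/S

def pload_anomaly(draft):
    # draft: 2D ice shelf draft in metres, negative below sea level.
    # Weight per unit area (Pa) of the displaced water, relative to
    # the reference density
    return g*(rho_cavity - rhoConst)*(-draft)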

In this section, whenever you open ipython, load the correct functions like this:

from mitgcm_python.ics_obcs import *

Get the data

Get a hold of the SOSE monthly climatologies for temperature (THETA), salinity (SALT), sea ice area (SIarea), and sea ice thickness (SIheff). I have pre-computed these from the full monthly fields (using the function make_sose_climatology in ics_obcs.py, in case you want to add a field) and saved them on Scihub, in the directory /data/oceans_input/raw_input_data/SOSE_monthly_climatology. Just copy the entire directory to whichever server or machine you're using.

If you recompute the climatologies yourself for whatever reason, make sure the filenames match (THETA_climatology.data, etc.) or else change the filename patterns in the function sose_ics (in ics_obcs.py).

Also make sure you have the subdirectory grid/ which includes the binary grid files for SOSE. (The Matlab grid files in the official SOSE distribution can't be read into python.)

Interpolate

In ipython,

sose_ics(grid_path, sose_dir, output_dir, nc_out=nc_out, constant_t=constant_t, constant_s=constant_s, split=split)

where

  • grid_path: path to directory containing grid files
  • sose_dir: directory containing your SOSE monthly climatologies (and grid/ subdirectory)
  • output_dir: directory you would like the MITgcm initial conditions files (binary) to be written into

with optional keyword arguments:

  • nc_out: path to a NetCDF file which the initial conditions will also be written into, so you can check them quickly in Ferret or ncview and make sure everything looks okay.
  • constant_t, constant_s: temperature (°C) and salinity (psu) to fill ice shelf cavities with. Defaults are -1.9 and 34.4.
  • split: 180 or 0, indicating the longitude you would like to split the SOSE grid at. If your domain contains 0°E, set split=180 (default). If your domain contains 180°E, set split=0. If you have a circumpolar domain, try either and hope for the best - it's possible you will have points in the gap between SOSE's periodic boundary and you will have to write a patch to wrap the SOSE data around that boundary.

For example,

sose_ics('../mitgcm/cases/WSB_001/grid/', '../SOSE_monthly_climatology/', '../mitgcm/cases/WSB_001/input/', nc_out='sose_ics.nc', constant_t=-1.9, constant_s=34.4, split=180)

Take a look at the NetCDF file to make sure nothing crazy happened.

Calculate pressure load anomaly

calc_load_anomaly(grid_path, out_file, constant_t=constant_t, constant_s=constant_s, rhoConst=rhoConst)

where

  • out_file: path to your desired binary output file
  • rhoConst: matches the value in input/data

For example,

calc_load_anomaly('../mitgcm/cases/WSB_001/grid/', '../mitgcm/cases/WSB_001/input/pload_WSB', constant_t=-1.9, constant_s=34.4, rhoConst=1035)

Edit MITgcm configuration files

The SOSE initial conditions files will be called THETA_SOSE.ini, etc. Rename these if you like, and move them into your input/ folder if they weren't written directly in there. Also move the pressure loading anomaly file to input/.

input/data

Section &PARM05:

  • Set hydrogThetaFile='THETA_SOSE.ini' and hydrogSaltFile='SALT_SOSE.ini' (or whatever you renamed them to).

Section &PARM01:

  • Make sure that eosType='MDJWF'.

input/data.seaice

Set AreaFile='SIarea_SOSE.ini' and HeffFile='SIheff_SOSE.ini'.

input/data.shelfice

Set SHELFICEloadAnomalyFile to the filename output by calc_load_anomaly.

Interpolate boundary conditions

This section interpolates open boundary conditions (OBCS) from the SOSE monthly climatology, for the following variables: temperature, salinity, horizontal ocean velocities, sea surface height, sea ice area and thickness, sea ice velocities. As in the last section, if this isn't the right dataset for your needs, the code will hopefully be a useful starting point.

Your domain can have up to four open boundaries (north, south, east, west); in many cases it will just have one or two, as the other boundaries are closed (completely land). Make sure that none of the open boundaries include ice shelf cavities - you might have to sacrifice a small ice shelf cavity and fill it with land during the edit_mask step of grid generation (the WSB grid provides an example of how to do this).

Get the data

As for initial conditions. You will need a few extra SOSE variables this time: UVEL, VVEL, ETAN, SIuice, SIvice.

Interpolate

In ipython,

from mitgcm_python.ics_obcs import *

Now for each open boundary, run

sose_obcs(location, grid_path, sose_dir, output_dir, nc_out=nc_out, prec=prec)

where

  • location: 'N', 'S', 'E', or 'W'
  • grid_path: path to directory containing grid files
  • sose_dir: directory containing your SOSE monthly climatologies (and grid/ subdirectory)
  • output_dir: directory you would like the MITgcm OBCS files (binary) to be written into

with optional keyword arguments:

  • nc_out: path to a NetCDF file which the boundary conditions will also be written into, so you can check them quickly in Ferret or ncview and make sure everything looks okay.
  • prec: 32 or 64, matching the value of exf_iprec_obcs in input/data.exf. If this isn't set, it defaults to the value of exf_iprec (which subsequently defaults to 32). If you're not using EXF in your configuration, match the value of readBinaryPrec in input/data instead.

For example, the WSB grid has open boundaries to the north and east:

sose_obcs('N', '../mitgcm/cases/WSB_001/grid/', '../SOSE_monthly_climatology/', '../mitgcm/cases/WSB_001/input/', nc_out='obcs_n.nc', prec=32)
sose_obcs('E', '../mitgcm/cases/WSB_001/grid/', '../SOSE_monthly_climatology/', '../mitgcm/cases/WSB_001/input/', nc_out='obcs_e.nc', prec=32)

Take a look at the NetCDF files to make sure everything looks okay.

Correct velocities to balance net transport

These open boundary conditions will almost definitely not conserve volume (due to differences in the land masks between your domain and SOSE's, as well as interpolation artifacts). If you do nothing, the sea level will probably drift like crazy, and if you are using a nonlinear free surface the model will eventually blow up. Therefore, it's necessary to add or subtract a very small number from the normal velocity at each open boundary, so the net transport into and out of the domain balances.
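Conceptually, the correction is simple: compute the net volume transport through all open boundaries, and subtract a uniform offset from the boundary-normal velocities so it integrates to zero. A schematic numpy version with hypothetical stand-in arrays (balance_obcs does the real partial-cell bookkeeping):

import numpy as np

rng = np.random.default_rng(0)
# boundary-normal velocity (m/s, positive into the domain) and wet
# cell areas (m^2) for two hypothetical open boundaries
normal_vels = [rng.normal(0, 0.1, (66, 374)), rng.normal(0, 0.1, (66, 460))]
cell_areas = [np.full((66, 374), 2.5e7), np.full((66, 460), 2.5e7)]

net = sum(np.sum(u*dA) for u, dA in zip(normal_vels, cell_areas))
total_area = sum(np.sum(dA) for dA in cell_areas)
corrected = [u - net/total_area for u in normal_vels]  # net transport now ~zero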

This will not work perfectly. The actual transport through each boundary is affected by the sea surface height, and the OBCS sponge (which is usually necessary to prevent blowups) means that velocity at the boundaries doesn't have to match the prescribed fields exactly. The total volume of the domain is also affected by freshwater fluxes, such as precipitation, evaporation, and melting/freezing of sea ice and ice shelves. However, balancing the OBCS transport is a good way to get rid of the first-order drift. If you still notice a significant drift in sea level during the simulations, you can alter the OBCS normal velocities again to counteract this drift (more on that later).

But right now, there are two options. One is to tell MITgcm to balance the transport through the open boundaries at every timestep (the next section describes how to do this). However, this approach is not appropriate if your simulation will be using tides. In this case, you need to calculate the correction offline so the transport balances in the long term, but the tides are still allowed to slosh everything around in the short term. To do this, run the following script in ipython:

balance_obcs(grid_path, obcs_file_w_u=obcs_file_w_u, obcs_file_e_u=obcs_file_e_u, obcs_file_s_v=obcs_file_s_v, obcs_file_n_v=obcs_file_n_v, prec=prec)

where

  • grid_path: path to directory containing grid files
  • obcs_file_w_u, obcs_file_e_u, obcs_file_s_v, obcs_file_n_v: path to files output by sose_obcs, for UVEL at the western and eastern boundaries and VVEL at the southern and northern boundaries respectively. You only need to set the files where open boundaries exist; leave the others out.
  • prec: matches what you told sose_obcs

For example,

balance_obcs('../mitgcm/cases/WSB_001/grid/', obcs_file_e_u='../mitgcm/cases/WSB_001/input/UVEL_SOSE.OBCS_E', obcs_file_n_v='../mitgcm/cases/WSB_001/input/VVEL_SOSE.OBCS_N', prec=32)

The OBCS files will be overwritten with corrected fields.

When you start running the model in anger, if the sea level still drifts substantially (likely if you are in a region of net sea ice export and have useRealFreshwaterFlux=.true.), you can correct this using the same function. Figure out how much the area-averaged sea surface height changed during your simulation (the code in timeseries.py will be helpful here). Then call the function with some extra arguments:

balance_obcs(grid_path, obcs_file_w_u=obcs_file_w_u, obcs_file_e_u=obcs_file_e_u, obcs_file_s_v=obcs_file_s_v, obcs_file_n_v=obcs_file_n_v, prec=prec, option='correct', d_eta=d_eta, d_t=d_t)

where d_eta is the change in area-averaged sea surface height in metres, and d_t is the length of your test simulation in years. For example,

balance_obcs('../mitgcm/cases/WSB_001/grid/', obcs_file_e_u='../mitgcm/cases/WSB_001/input/UVEL_SOSE.OBCS_E', obcs_file_n_v='../mitgcm/cases/WSB_001/input/VVEL_SOSE.OBCS_N', prec=32, option='correct', d_eta=-7, d_t=6)
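As a sanity check on the numbers (my reading of these arguments, not a guarantee of the code's internals): a drift of d_eta metres over d_t years corresponds to a net volume imbalance of roughly (domain area)*d_eta/d_t, which the corrected velocities must absorb.

seconds_per_year = 365.25*24*3600
A_domain = 2.0e12    # hypothetical ocean surface area of the domain, m^2
d_eta, d_t = -7., 6.
print(A_domain*d_eta/(d_t*seconds_per_year)/1e6)  # about -0.07 Sv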

Edit MITgcm configuration files

The newly generated OBCS files will be named, for example, THETA_SOSE.OBCS_E for temperature at the eastern boundary. Move them to input/ if they're not there already.

code/packages.conf

Make sure the line obcs exists.

code/OBCS_OPTIONS.h

Switch on the ALLOW_OBCS_* options required for your configuration. For example,

#define ALLOW_OBCS_NORTH
#define ALLOW_OBCS_EAST

Switch on ALLOW_OBCS_PRESCRIBE so the files can be read.

You will likely want a sponge for stability, so switch on ALLOW_OBCS_SPONGE to compile that code.

If you do want to balance OBCS transport online every timestep (i.e. you didn't do the balance_obcs step), switch on ALLOW_OBCS_BALANCE.

Sea ice OBCS can cause blowups due to disagreements in velocity leading to sea ice convergence. To prevent this, switch on OBCS_UVICE_OLD plus one of the five OBCS_SEAICE_* options listed below it. Experiment with them to see which is the most realistic in your domain. I have had the most luck with OBCS_SEAICE_SMOOTH_UVICE_PERP.

If you actually had to make any of these changes to code/packages.conf or code/OBCS_OPTIONS.h (i.e. they weren't already there), you'll need to recompile the code.

input/data.obcs

These changes are made to section &PARM01 unless otherwise specified.

If you have an open eastern boundary, set OB_Ieast=Ny*-1 where Ny matches the output of latlon_points during grid generation. Similarly, set OB_Iwest=Ny*1, OB_Jnorth=Nx*-1, and OB_Jsouth=Nx*1 as needed.

Again for an open eastern boundary, set

OBEtFile='THETA_SOSE.OBCS_E'
OBEsFile='SALT_SOSE.OBCS_E'
OBEuFile='UVEL_SOSE.OBCS_E'
OBEvFile='VVEL_SOSE.OBCS_E'
OBEetaFile='ETAN_SOSE.OBCS_E'
OBEaFile='SIarea_SOSE.OBCS_E'
OBEhFile='SIheff_SOSE.OBCS_E'
OBEuiceFile='SIuice_SOSE.OBCS_E'
OBEviceFile='SIvice_SOSE.OBCS_E'

and similarly for the other directions as needed. If you have a linear free surface (nonlinFreeSurf=0 in input/data) then skip the OBEeta line.

Set useOBCSprescribe=.true.

If you do want a sponge, set useOBCSsponge=.true. and modify the sponge parameters under &OBCS_PARM03 as needed. Note that Urelaxobcs* and Vrelaxobcs* correspond to east/west and north/south boundaries respectively, not to velocity variables! Set the relaxation timescale in seconds for the inside of the sponge (*relaxobcsinner) and the outside of the sponge (*relaxobcsbound), as well as the spongeThickness in grid cells. I used 1 year, 1 week, and 8 grid cells (2° latitude in my domain) respectively.

If you do want to balance OBCS transport online every timestep (i.e. you didn't do the balance_obcs step), set useOBCSbalance=.true.

input/data.exf

All changes are made to the section &EXF_NML_OBCS.

For an open eastern boundary, set obcsEstartdate1 and siobEstartdate1 to the date code of the beginning of your simulation (eg 19790101), obcsEstartdate2 and siobEstartdate2 to the time code of the beginning of your simulation (probably just 0 for midnight), and obcsEperiod and siobEperiod to -12. (this indicates a monthly climatology). Similarly for the other directions as needed.

input/data.pkg

Make sure useOBCS=.true.

Interpolate surface salinity for restoring

This section interpolates surface salinity from SOSE to the model grid, and sets up a mask for surface salinity restoring using RBCS in a monthly climatology. It does not restore on the continental shelf (defined by regions where the bathymetry is shallower than some threshold), because we don't trust SOSE very much on the continental shelf, and don't want to interfere with convection there. The boundary between the restoring and no-restoring regions is smoothed (i.e. the restoring mask is somewhere between 0 and 1) so no sharp gradients in surface salinity result.
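For intuition, here is one plausible construction of a smoothed 0-1 mask (illustrative; sose_sss_restoring has its own smoothing and also zeroes the mask inside any OBCS sponge):

import numpy as np
from scipy.ndimage import gaussian_filter

def restoring_mask(bathy, h0=-1250., sigma=2):
    # 1 in the open ocean (deeper than h0), 0 on the continental shelf,
    # with a Gaussian-smoothed transition in between
    mask = (bathy < h0).astype(float)
    return gaussian_filter(mask, sigma)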

Surface salinity restoring is entirely optional, but if you would like to do so, here are the steps:

Get the data

As for initial conditions. The only variable you need this time is SALT.

Interpolate

In ipython,

from mitgcm_python.forcing import *
sose_sss_restoring(grid_path, sose_dir, output_salt_file, output_mask_file, nc_out=nc_out, h0=h0, obcs_sponge=obcs_sponge, split=split)

where

  • grid_path: path to directory containing grid files
  • sose_dir: directory containing your SOSE monthly climatologies (and grid/ subdirectory)
  • output_salt_file: path to your desired RBCS salinity file
  • output_mask_file: path to your desired RBCS mask file

with optional keyword arguments:

  • nc_out: path to a NetCDF file to also write the salinity and mask into for checking
  • h0: threshold bathymetry to define the continental shelf; default is -1250 which works well in the Weddell Sea
  • obcs_sponge: if you are using an OBCS sponge, the width of that sponge in grid cells (equal to spongeThickness in input/data.obcs)
  • split: 180 or 0, indicating the longitude you would like to split the SOSE grid at. If your domain contains 0°E, set split=180 (default). If your domain contains 180°E, set split=0. If you have a circumpolar domain, try either and hope for the best - it's possible you will have points in the gap between SOSE's periodic boundary and you will have to write a patch to wrap the SOSE data around that boundary.

There are also options to add artificial polynyas by restoring to higher surface salinities; you probably don't care about this, but if you do, read the code (function sose_sss_restoring in forcing.py).

For example,

sose_sss_restoring('../mitgcm/cases/WSB_001/grid/', '../SOSE_monthly_climatology/', '../mitgcm/cases/WSB_001/input/SALT.RBCS', '../mitgcm/cases/WSB_001/input/MASK.RBCS', nc_out='sss_restoring.nc', h0=-1250, obcs_sponge=8, split=180)

Take a look at the NetCDF file to make sure nothing weird happened.

Edit MITgcm configuration files

Copy the output_salt_file and output_mask_file to input/, if they weren't written directly in there.

code/packages.conf

Make sure the line rbcs exists. If you had to add it, you'll also have to recompile the code.

input/data.pkg

Set useRBCS=.true.

input/data.rbcs

These changes are all made to section &RBCS_PARM01.

Set useRBCsalt=.true. to switch on salinity restoring.

Set relaxSFile and relaxMaskFile(2) to the filenames of your salinity and mask files respectively.

Set rbcsForcingPeriod=-12 to indicate a monthly climatology.

Finally, set tauRelaxS to the restoring timescale in seconds. I like to use 1 year for very weak restoring. You can also use 30 days for stronger restoring.
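For reference, the restoring term RBCS applies at each masked point is a standard relaxation (schematic, not the model code):

# dS/dt = -mask*(S_model - S_relax)/tauRelaxS
tauRelaxS = 365.25*24*3600   # 1 year in seconds, for weak restoring
S_model, S_relax, mask = 34.0, 34.6, 1.0
print(-mask*(S_model - S_relax)/tauRelaxS)  # tendency in psu/s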

Interpolate iceberg meltwater

This section interpolates freshwater fluxes from iceberg melting to the model grid, so it can be used as the "runoff" field in the EXF atmospheric forcing package. Here we use output from a recent NEMO simulation with a coupled iceberg model (see Storkey et al. 2018), in a monthly climatology. Since this NEMO grid isn't regular in lat-lon space, we have to interpolate it ourselves (otherwise EXF could interpolate for us during the simulation).

This step is entirely optional, and depending on your configuration you might be quite happy with another product, or with no runoff at all. Assuming you want to go ahead with it, here are the steps:

Get the NEMO monthly climatology

All you need to do is check with Pierre Mathiot that he’s okay with you using it, and then I can give you the files.

Interpolate

In ipython,

from mitgcm_python.forcing import *
iceberg_meltwater(grid_path, input_dir, output_file, nc_out=nc_out, prec=prec)

where

  • grid_path: path to directory containing grid files
  • input_dir: path to the directory containing the NEMO output
  • output_file: desired path to the binary file to write

with optional keyword arguments:

  • nc_out: path to a NetCDF file to also write the interpolated fields into for checking
  • prec: matches exf_iprec in your input/data.exf namelist (default 32).

For example,

iceberg_meltwater('../mitgcm/cases/WSB_001/grid/', 'icebergs_NEMO_G07/', '../mitgcm/cases/WSB_001/input/iceberg_melt_NEMO', nc_out='icebergs.nc', prec=32)

Take a look at the NetCDF file to make sure nothing weird happened.

Edit MITgcm configuration files

Copy the output_file you just created to input/, if it wasn’t written directly in there.

code/EXF_OPTIONS.h

Make sure ALLOW_RUNOFF is defined. If you had to switch it on, you’ll also need to recompile the code.

input/data.exf

Section &EXF_NML_02:

  • Set runoffstartdate1, runoffstartdate2, and runoffperiod as you did for the monthly climatology OBCS before (eg 19790101, 0, and -12. respectively).
  • Set runofffile to the name of the binary file you just created.

Section &EXF_NML_03:

  • Set exf_inscal_runoff=1.E-3 to convert from kg/m2/s to m/s (i.e. dividing by a freshwater density of 1000 kg/m3).

Section &EXF_NML_04:

  • Set runoff_interpMethod=0 so the model doesn’t try to interpolate the input.