Merge pull request #2370 from GEOS-ESM/develop
Gitflow: Merge Develop into Main
286 changed files with 11,053 additions and 4,671 deletions.
# Overview
Regrid_Util.x is a utility program built with MAPL. It can regrid files created by the MAPL IO layer (newCFIO+pFIO), which is used by the MAPL History and ExtData components, to any lat-lon, cubed-sphere, or tripolar grid. The utility can therefore regrid between any two supported grid types, and the input and output grids can also be of the **same type** but different resolution. Behind the scenes it uses ESMF to perform the regridding. It is driven entirely by command line options: the user specifies an output grid, and the input grid type and specifications are determined from the input file.
Note this should be able to regrid a file produced by the History component, and the resulting file should be readable by ExtData. The lat-lon files produced by History match the CF convention, so if you have your own lat-lon files that were produced by some other means, and they adhere to the CF standard for lon, lat, level, and time information, then they will probably work as input. If you want to be extra safe, run GEOS, get some lat-lon History output, and make sure your own files look like it!
_**Also note, due to some misunderstandings as IO layers evolved, some metadata that is in files produced by History before MAPL 2.0 may not be preserved when running this code with versions starting at 2.0+. We realized this and fixed it. You will need MAPL v2.18.0 in order to ensure that files get written with all the metadata that was in the pre-MAPL 2.0 History output. Also note this missing metadata consisted of global attributes, duplicate definitions of the missing value that are part of no standard, and extra time attributes that duplicate the information contained in the time units and time variable. In other words, if your code was relying on these, you probably need to rewrite it, as it was badly written and non-standard-conforming in the first place.**_
# Running the code
**By default it runs on 6 processors, so run the code with mpirun -np 6. If you are not running on 6 you must adjust nx and ny as specified later. Also note that this means you must be able to run MPI on the system, so on the login node of many computer centers you cannot run it. Finally, if the input or output is on a cubed-sphere grid you must run with at least 6 tasks.**
If both the input and output are lat-lon or tripolar, it can be run on a single processor. In that case you must explicitly pass nx/ny to the program to override the defaults. **Also note that if you specify NX and NY, NX*NY must equal the number of cores used in the mpirun command.** This could be useful if you need higher speed or are regridding large input or output grids. Finally, note that NY must be divisible by 6 if the input or output is on the cube.
The minimum arguments are -i, -o, and -ogrid; if you use only these 3, the program must be run with 6 MPI tasks (for example, `mpirun -np 6 ./Regrid_Util.x -i input.nc4 -o output.nc4 -ogrid PC360x181-DC`, substituting your own file names).
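The decomposition rules above (NX*NY must equal the MPI task count, and NY must be divisible by 6 when the cube is involved) can be sketched as a small validity check. This is a hypothetical helper purely to illustrate the constraints, not part of Regrid_Util.x:

```python
def check_layout(nx, ny, ntasks, cubed=False):
    # nx*ny must tile the MPI tasks exactly; the cubed sphere is
    # decomposed per face, so ny must also be divisible by 6.
    if nx * ny != ntasks:
        raise ValueError("NX*NY must equal the number of MPI tasks")
    if cubed and ny % 6 != 0:
        raise ValueError("NY must be divisible by 6 for cubed-sphere grids")

check_layout(1, 6, 6, cubed=True)   # the 6-task default works for the cube
check_layout(2, 2, 4)               # lat-lon to lat-lon on 4 tasks is fine
```
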
# Command line options
Options have been added as the program has grown beyond the initial command line options. Where an option was added or enhanced later, the version in which it became available is noted.
* -i input file (at **v2.11.0** and above you can give a comma-separated list of files (no spaces!) to regrid; these should all be on the same grid and have the same variables)
* -o output file (at **v2.11.0** and above, if specifying multiple input files, correspondingly specify multiple output files, comma separated, no spaces)
* -ogrid encoding of the output grid name, see the section on grid names
* -nx x decomposition to use for the decomposition of the target grid
* -ny y decomposition to use for the decomposition of the target grid
* -t date and time to select if the file contains multiple time slices (for example 20000415 210000)
* -method regridding method; the available options are defined here and follow what one can specify in History: https://github.com/GEOS-ESM/MAPL/wiki/Regridding-Methods-Available-In-MAPL#specifying-regridding-methods-in-extdata-and-history-in-mapl-v2220-and-greater. These correspond to the underlying ESMF regridding methods. For more information about the ESMF regridding methods see this document: https://earthsystemmodeling.org/docs/release/latest/ESMF_refdoc/node3.html#SECTION03023000000000000000
* -vars specify a comma-separated (no spaces!) list of variables to regrid; only that subset of the variables from the input file will be regridded
* -tp_in tripolar file for the input grid if the input file is on a tripolar grid
* -tp_out tripolar file for the output grid if the output grid is tripolar
* -lon_range from **v2.9.0**, if the output grid is lat-lon, make a regional grid in the lon direction by specifying a comma-separated (no space!) min and max longitude in degrees
* -lat_range from **v2.9.0**, if the output grid is lat-lon, make a regional grid in the lat direction by specifying a comma-separated (no space!) min and max latitude in degrees
* -deflate from **v2.3.2**, apply compression to the output file; values can be 1 to 9, corresponding to the netCDF deflation level. Default is no compression.
* -shave from **v2.3.2**, bit shave the output by specifying the number of bits in the mantissa to retain (a single-precision floating point number has 23 bits). Default is no bit shaving.
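To see what the -shave option does in principle, here is a sketch of mantissa truncation for a single-precision value. This is an illustration of the idea only, not the code the utility actually runs:

```python
import struct

def bit_shave(x, keep_bits):
    # Zero out the low (23 - keep_bits) mantissa bits of a single-precision
    # value; the result compresses better but carries less information.
    bits = struct.unpack('<I', struct.pack('<f', x))[0]
    mask = (0xFFFFFFFF << (23 - keep_bits)) & 0xFFFFFFFF
    return struct.unpack('<f', struct.pack('<I', bits & mask))[0]

print(bit_shave(1.0, 8))        # 1.0 (a power of two has an all-zero mantissa)
print(bit_shave(1.2345678, 8))  # 1.234375, the fine detail is gone
```
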
# Grid Names
The grid name used in -ogrid follows these conventions:
* For a lat-lon grid it will be of the form POLEim_worldxjm_world-DATELINE (i.e. PC360x181-DC). In the case of a global lat-lon grid, POLE is either PC or PE (pole centered or pole edge) and DATELINE is DE, DC, GC, or GE (dateline edge, dateline center, Greenwich center, Greenwich edge). IM_WORLD and JM_WORLD are the number of grid points in the lon and lat directions. From **v2.9.0** onward a regional lat-lon grid can be specified with the -lat_range and -lon_range options. Note you can specify one of these or both; whichever one you specify, set the POLE or DATELINE (or both!) to XY. So if you want a 180x90 regional grid from 0 to 90 in longitude and -30 to 30 in latitude, use these arguments: -lat_range -30,30 -lon_range 0,90 -ogrid XY180x90-XY
* For cubed sphere the name will look like PEcube_sizexcube_size*6-CF (i.e. PE180x1080-CF for a C180 cubed-sphere grid)
* For tripolar it will look like PE720x410-TM; however, you must supply a file containing the tripolar grid coordinates in the correct form
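The naming conventions above can be sketched as simple string builders. These are hypothetical helpers purely to illustrate the encoding, not functions from the utility:

```python
def latlon_grid_name(im_world, jm_world, pole="PC", dateline="DC"):
    # Global grids: POLE in {PC, PE}, DATELINE in {DE, DC, GC, GE};
    # regional grids (v2.9.0+): use "XY" for the constrained direction(s).
    return f"{pole}{im_world}x{jm_world}-{dateline}"

def cube_grid_name(cube_size):
    # Cubed-sphere names encode N x 6N, e.g. C180 -> PE180x1080-CF.
    return f"PE{cube_size}x{6 * cube_size}-CF"

print(latlon_grid_name(360, 181))             # PC360x181-DC
print(cube_grid_name(180))                    # PE180x1080-CF
print(latlon_grid_name(180, 90, "XY", "XY"))  # XY180x90-XY, a regional grid
```
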
# Other Notes
* The regridding layer we have built on top of ESMF that this code uses assumes the undefined value is the one used by MAPL, which is 1.0e15 (the MAPL_UNDEF constant in the code). Input points that are MAPL_UNDEF do not contribute, and any weights involving these points are renormalized. In other words, another undefined value is not supported.
* If you do want to use more than 6 cores and specify an NX and NY, note that if going to a cubed-sphere output grid NY must be divisible by six (the cube is decomposed such that each face has NX*NY/6 points).
* This code in no way, shape, or form supports regridding of the vertical coordinate, nor will it for the foreseeable future without extensive development of the MAPL library, which is currently not being pursued!
* Be careful if using the bit shaving option: it helps with compression, but you are throwing away information. If you have fields that vary only in the last bits of the mantissa with the same exponent, you lose that variation.
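The MAPL_UNDEF handling in the first bullet amounts to dropping undefined source points and renormalizing the surviving interpolation weights. A minimal sketch of that idea for a single destination point (not the actual ESMF-backed implementation):

```python
MAPL_UNDEF = 1.0e15

def regrid_point(values, weights):
    # Destination value from source values and weights, skipping MAPL_UNDEF
    # points and renormalizing the surviving weights to sum to 1.
    pairs = [(v, w) for v, w in zip(values, weights) if v != MAPL_UNDEF]
    if not pairs:
        return MAPL_UNDEF  # nothing valid contributes
    wsum = sum(w for _, w in pairs)
    return sum(v * w for v, w in pairs) / wsum

# The undefined point is ignored and the 0.25/0.25 weights are rescaled.
print(regrid_point([1.0, 3.0, MAPL_UNDEF], [0.25, 0.25, 0.5]))  # 2.0
```
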
#!/usr/bin/env python3

import argparse
from tile_formatter import create_tile_from_map
from ll_formatter import create_latlon_from_map
from cs_formatter import create_cube_from_map
import yaml
import textwrap

def parse_args():
    program_description = textwrap.dedent('''
       USAGE:
       This script takes a series of MAPL binary forcing files and converts them to NetCDF.
       The current binary forcing files consist of 2 records for each time, so for N times the Fortran file has 2N records.
       The first record is a real array of size 14 that encodes the valid range of the data for that record.
       The second record is the binary data array.
       Currently this script takes a single yaml file as the first argument and an optional argument if you want verbose output showing the name of each file as it is processed.
       The YAML file consists of a dictionary whose keys are the paths of the files you want to convert.
       The value for each key is a dictionary of the various keys that describe what is in the file and the output.
       For each entry in the map, you must at least supply the following keys:
          output_file: name of output file
          var: name of the variable in the output file
          grid_type: what type of grid the input is on; the options are 'latlon', 'cube', 'tile'
       In all of them the units and long_name are optional; if not passed they will be set to NA.
       There are two other optional keys; if neither is passed, every record will be written.
       The reason for using these is that ExtData does not need this "padding" and it is in fact detrimental:
          clim: default False; if True, assumes we have a 14 month climatology and does not write the first or last value
          year: default None; if passed, it must be an integer year and only records for that year are written
       Finally, as you can see in the examples, for lat-lon or cube input you must specify the resolution via extra keys, since the input files are not self-describing.
       Also note that for the cube you must provide a file with the coordinates, since those cannot be easily computed, unlike the lat-lon grid.
       Here is an example input demonstrating all 3 types that can be handled:
       CF0048x6C/lai_clim_48x288.data:
          output_file: lai_clim_48x244.nc4
          clim: True
          var: lai
          units: 1
          long_name: leaf_area_index
          grid_type: tile
       dataoceanfile_OSTIA_REYNOLDS_SST.2880x1440.2015.data:
          output_file: dataoceanfile_OSTIA_REYNOLDS_SST.2880x1440.2015.nc4
          im: 2880
          jm: 1440
          var: sst
          units: K
          long_name: sea_surface_temperature
          year: 2015
          grid_type: latlon
       dataoceanfile_OSTIA_REYNOLDS_SST.90x540.2015.data:
          output_file: dataoceanfile_OSTIA_REYNOLDS_SST.90x540.2015.nc4
          im: 90
          var: sst
          units: K
          long_name: sea_surface_temperature
          year: 2015
          grid_type: cube
          example_file: /discover/nobackup/bmauer/Example_GEOS_Output/c90_example.20000414_2200z.nc4
       ''')
    p = argparse.ArgumentParser(description='forcing_converter',
                                epilog=program_description,
                                formatter_class=argparse.RawDescriptionHelpFormatter)
    p.add_argument('input_yaml', type=str, help='input yaml file')
    p.add_argument('-v', '--verbose', action='store_true', help='verbose mode')
    return vars(p.parse_args())

if __name__ == '__main__':
    args = parse_args()
    input_yaml = args['input_yaml']
    verbose = args['verbose']
    with open(input_yaml, 'r') as f:
        files = yaml.safe_load(f)

    for input_file in files:
        if verbose:
            print(input_file)

        grid_type = files[input_file]['grid_type']
        if grid_type == "tile":
            create_tile_from_map(input_file, files[input_file])
        elif grid_type == "latlon":
            create_latlon_from_map(input_file, files[input_file])
        elif grid_type == "cube":
            create_cube_from_map(input_file, files[input_file])
        else:
            print(f"Incorrect grid type specified: {grid_type}")
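The 2N-record layout described in the script's docstring is the Fortran sequential unformatted format, in which each record is framed by byte-count markers. A minimal reader sketch, assuming 4-byte little-endian record markers (the actual formatter modules may handle this differently):

```python
import io
import struct

def read_fortran_record(f):
    # One sequential unformatted record: 4-byte length, payload, 4-byte length.
    head = f.read(4)
    if not head:
        return None  # end of file
    (n,) = struct.unpack('<i', head)
    data = f.read(n)
    (tail,) = struct.unpack('<i', f.read(4))
    if tail != n:
        raise IOError('corrupt record markers')
    return data

# A fake one-record file: the leading and trailing markers both hold the length.
buf = io.BytesIO(struct.pack('<i', 3) + b'abc' + struct.pack('<i', 3))
print(read_fortran_record(buf))  # b'abc'
print(read_fortran_record(buf))  # None
```
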