STOFS-3D scripts: added more items in the 3D setup workflow. Tested on Hercules.
feiye-vims committed Nov 23, 2024
1 parent 7981f18 commit 3148995
Showing 5 changed files with 333 additions and 115 deletions.
@@ -1,22 +1,32 @@
1. Download ADT data
Install the tool "copernicusmarine" following https://help.marine.copernicus.eu/en/articles/7970514-copernicus-marine-toolbox-installation.

Optional:
- Save login info using "copernicusmarine login"; otherwise you need to provide this info for every download.



- Sample checking the availability of a variable:
copernicusmarine describe --include-datasets -c adt | less


Download:

- CLI: copernicusmarine subset --dataset-id cmems_obs-sl_glo_phy-ssh_nrt_allsat-l4-duacs-0.25deg_P1D -v adt -v sla -v err_sla -t 2024-04-01 -T 2024-04-10 -x -66.5 -X -52. -y 6.75 -Y 53.25 -f adt_test.nc --force-download --username xxx --password XXXXXX


- Python API: download_adt.py; see the instructions at the beginning of the script. The usage is straightforward.
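As a hedged sketch of what a download script like download_adt.py might do (the keyword names below simply mirror the CLI flags shown above; verify them against the copernicusmarine Python API documentation before use):

```python
# Sketch of the same subset request via the copernicusmarine Python API.
# Keyword names mirror the CLI flags; check against the toolbox docs.

def adt_request(dataset_id, start, end, lon_min, lon_max, lat_min, lat_max, out):
    """Collect the subset parameters for an ADT download into one dict."""
    return dict(
        dataset_id=dataset_id,
        variables=["adt", "sla", "err_sla"],
        start_datetime=start,
        end_datetime=end,
        minimum_longitude=lon_min,
        maximum_longitude=lon_max,
        minimum_latitude=lat_min,
        maximum_latitude=lat_max,
        output_filename=out,
    )

def download(params):
    """Run the request. Needs network access and CMEMS credentials
    (saved beforehand with "copernicusmarine login")."""
    import copernicusmarine  # pip install copernicusmarine
    copernicusmarine.subset(**params)
```

With `adt_request("cmems_obs-sl_glo_phy-ssh_nrt_allsat-l4-duacs-0.25deg_P1D", "2024-04-01", "2024-04-10", -66.5, -52.0, 6.75, 53.25, "adt_test.nc")` this reproduces the CLI example above.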

For the STOFS-3D-Atlantic setup:
Use "minimum_longitude = -60, maximum_longitude = -53, minimum_latitude = 1, maximum_latitude = 55,"
This is a slice covering only the ocean boundary, mainly for elev2D.th.nc.

Use "minimum_longitude = -105, maximum_longitude = -50, minimum_latitude = 1, maximum_latitude = 55,"
This covers the entire domain and is mainly used for setting the initial elevation; you don't need to worry about this for now.



Temporal coverage, change dataset-id:
31 Dec 2021 to present: "cmems_obs-sl_glo_phy-ssh_nrt_allsat-l4-duacs-0.25deg_P1D"
31 Dec 1992 to 6 Jun 2023: "c3s_obs-sl_glo_phy-ssh_my_twosat-l4-duacs-0.25deg_P1D" (discontinued on Nov 26, 2024; they probably will have a new id for it)

Replace eta2 in hotstart.nc with ADT:
replace_eta2_aviso.py

Generate elev2D.th.nc with ADT:
gen_bnd_aviso.py (check file name in aviso2schism.py)
modify_elev.py (to apply adjustment in elev)
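For orientation, the core of the eta2 replacement is interpolating the gridded ADT field onto the hgrid node coordinates. A minimal sketch of that step (the function name and array layout are assumptions made here for illustration; the actual logic lives in replace_eta2_aviso.py):

```python
# Hypothetical sketch of the core step in replace_eta2_aviso.py:
# bilinearly interpolate a gridded ADT field (nlat x nlon) onto mesh
# node coordinates; the result would then overwrite eta2 in hotstart.nc.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def interp_adt_to_nodes(adt, lat, lon, node_x, node_y, fill=0.0):
    """adt: 2D array indexed (lat, lon); node_x/node_y: node lon/lat (deg)."""
    itp = RegularGridInterpolator(
        (lat, lon), adt, bounds_error=False, fill_value=fill
    )
    # query points are (lat, lon) pairs, matching the grid axis order
    return itp(np.c_[node_y, node_x])
```

Writing the interpolated values back to the eta2 variable of hotstart.nc would be a plain netCDF assignment; this sketch only covers the interpolation.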
@@ -17,49 +17,36 @@ class Bctides:
def __init__(
self,
hgrid,
bc_flags: list = None,
constituents: Union[str, list] = 'major',
database: str = 'tpxo',
add_earth_tidal: bool = True,
cutoff_depth: float = 50.0,
bc_const: list = None,
bc_relax: list = None,
):
"""Initialize Bctides object
Parameters
----------
hgrid: Hgrid object
bc_flags: nested list of bctypes
constituents: str or list
database: str ('tpxo' or 'fes2014')
add_earth_tidal: bool
cutoff_depth: float
bc_const: constant values for each variable at each open boundary
(None if not applicable)
bc_relax: relaxation constants for each variable at each open boundary
(None if not applicable)
"""

self.hgrid = hgrid
self.add_earth_tidal = add_earth_tidal
self.cutoff_depth = cutoff_depth
self.tides = Tides(constituents=constituents, tidal_database=database)
self.bc_flags = bc_flags
self.bc_const = bc_const
self.bc_relax = bc_relax

def __str__(self):

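As a hedged illustration of the new argument shapes (a hypothetical example inferred from how this diff indexes the lists, not an official usage example): each bc_flags entry is [iettype, ifltype, itetype, isatype] for one open boundary, and bc_const/bc_relax carry one slot per variable in the same order:

```python
# Hypothetical two-boundary setup inferred from this diff's indexing:
# bc_const[ibnd] -> [elev, vel, T, S] constants; bc_relax[ibnd] -> slots 0/1
# are inflow/outflow relaxation (ifltype -4), slots 2/3 are T/S nudging.
bc_flags = [
    [5, 5, 4, 4],  # ocean boundary: elev/vel/T/S from *.th.nc inputs
    [0, 1, 1, 1],  # river boundary: no elev b.c., flux.th, TEM_1.th, SAL_1.th
]
bc_const = [
    [None, None, None, None],  # no constant values on the ocean boundary
    [None, None, 10.0, 0.0],   # constant T = 10, S = 0 on the river
]
bc_relax = [
    [None, None, 0.5, 0.5],    # T/S nudging factors, ocean boundary
    [None, None, 1.0, 1.0],    # T/S nudging factors, river boundary
]
```

The three lists must have one entry per open boundary; the `len(self.gdf) != len(self.bc_flags)` check below enforces that for bc_flags.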
@@ -103,9 +90,9 @@ def __str__(self):

#get amplitude and phase for each open boundary
f.append(f"{len(self.gdf)} !nope")
if len(self.gdf) != len(self.bc_flags):
raise ValueError(f'Number of open boundaries {len(self.gdf)} is not consistent with the number of given bctypes {len(self.bc_flags)}!')
for ibnd, (boundary, flag) in enumerate(zip(self.gdf.itertuples(), self.bc_flags)):
logger.info(f"Processing boundary {ibnd+1}:")
#number of nodes and flags
line = [
@@ -127,7 +114,7 @@ def __str__(self):
logger.warning(f'time history of elevation is read in from elev.th (ASCII)!')
elif iettype == 2:
logger.info(f"You are choosing type 2 for elevation, value is {self.bc_const[ibnd][0]}")
f.append(f"{self.bc_const[ibnd][0]}")
elif iettype == 4:
logger.warning('time history of elevation is read in from elev2D.th.nc (netcdf)')
elif iettype == 3 or iettype == 5:
@@ -152,7 +139,7 @@ def __str__(self):
logger.warning("time history of discharge is read in from flux.th (ASCII)!")
elif ifltype == 2:
logger.info(f"You are choosing type 2 for velocity, value is {self.bc_const[ibnd][1]}")
f.append(f"{self.bc_const[ibnd][1]}")
elif ifltype == 3 or ifltype == 5:
if ifltype == 5:
logger.warning(f'Combination of 3 and 4, time history of velocity is read in from uv.3D.th.nc!')
@@ -165,8 +152,8 @@ def __str__(self):
elif ifltype == 4 or ifltype == -4:
logger.warning("time history of velocity (not discharge!) is read in from uv3D.th.nc (netcdf)")
if ifltype == -4:
logger.info(f"You are using type -4, relaxation constants for inflow is {self.bc_relax[ibnd][0]}, for outflow is {self.bc_relax[ibnd][1]}")
f.append(f"{self.bc_relax[ibnd][0]} {self.bc_relax[ibnd][1]} !relaxation constant")
elif ifltype == -1:
raise NotImplementedError(f"Velocity type {ifltype} not implemented yet!")
#logger.info(f"Flather type radiation b.c. (iettype must be 0 in this case)!")
@@ -179,25 +166,25 @@ def __str__(self):
#temperature
logger.info(f"Temperature type: {itetype}")
if itetype == 0:
logger.warning("Temperature is not specified, no input needed!")
elif itetype == 1:
logger.warning("time history of temperature will be read in from TEM_1.th!")
logger.info(f"Nudging factor for T at boundary {ibnd+1} is {self.bc_relax[ibnd][2]}")
f.append(f"{self.bc_relax[ibnd][2]} !nudging factor for T")
elif itetype == 2:
logger.info(f"You are choosing type 2 for temperature, value is {self.bc_const[ibnd][2]}")
f.append(f"{self.bc_const[ibnd][2]} !T")

logger.info(f"Nudging factor for T at boundary {ibnd+1} is {self.bc_relax[ibnd][2]}")
f.append(f"{self.bc_relax[ibnd][2]} !nudging factor for T")
elif itetype == 3:
logger.info("Using initial temperature profile for inflow")
logger.info(f"Nudging factor for T at boundary {ibnd+1} is {self.bc_relax[ibnd][2]}")
f.append(f"{self.bc_relax[ibnd][2]} !nudging factor for T")
elif itetype == 4:
logger.warning("time history of temperature is read in from TEM_3D.th.nc (netcdf)!")
logger.info(f"Nudging factor for T at boundary {ibnd+1} is {self.bc_relax[ibnd][2]}")
f.append(f"{self.bc_relax[ibnd][2]} !nudging factor for T")
else:
raise IOError(f"Invalid type {itetype} for temperature!")

@@ -207,22 +194,22 @@ def __str__(self):
logger.info("Salinity is not specified, no input needed!")
elif isatype == 1:
logger.warning("time history of salinity will be read in from SAL_1.th!")
logger.info(f"Nudging factor for salt at boundary {ibnd+1} is {self.bc_relax[ibnd][3]}")
f.append(f"{self.bc_relax[ibnd][3]} !nudging factor for S")
elif isatype == 2:
logger.info(f"You are choosing type 2 for salinity, value is {self.bc_const[ibnd][3]}")
f.append(f"{self.bc_const[ibnd][3]} !S")

logger.info(f"Nudging factor for salt at boundary {ibnd+1} is {self.bc_relax[ibnd][3]}")
f.append(f"{self.bc_relax[ibnd][3]} !nudging factor for S")
elif isatype == 3:
logger.info("Using initial salinity profile for inflow")
logger.info(f"Nudging factor for salt at boundary {ibnd+1} is {self.bc_relax[ibnd][3]}")
f.append(f"{self.bc_relax[ibnd][3]} !nudging factor for S")
elif isatype == 4:
logger.warning("time history of salinity is read in from SAL_3D.th.nc (netcdf)!")
logger.info(f"Nudging factor for salt at boundary {ibnd+1} is {self.bc_relax[ibnd][3]}")
f.append(f"{self.bc_relax[ibnd][3]} !nudging factor for S")
else:
raise IOError(f"Invalid type {isatype} for salinity!")

@@ -182,6 +182,7 @@ def cpp2lonlat(x, y, lon0=0, lat0=0):

def relocate_sources2(
old_ss_dir=None, outdir=None,
no_feeder=False,
feeder_info_file=None, hgrid_fname=None, allow_neglection=True,
max_search_radius=1000.0, mandatory_sources_coor=np.empty((0, 4)),
region_list=None,
@@ -191,6 +192,8 @@ def relocate_sources2(
:old_ss_dir: the directory of the original source_sink files,
which should contain the sources.json file generated by pyschism
:no_feeder: if true, feeder_bases (which should still be present in a mesh without feeders)
will be used; otherwise, feeder_heads will be used.
:outdir: the output directory
:feeder_info_file: the feeder channel information generated by make_feeder_channel.py
:hgrid_fname: the hgrid file name of the new grid with feeders
@@ -276,7 +279,13 @@
# -------------------------------------relocation-------------------------------------
# Find matching source point at mandatory_sources_coor and feeders
# These are the desired new source locations
if no_feeder:
# If feeders are not implemented in the mesh, use "feeder_bases" instead of "feeder_heads",
# because "feeder_bases" are the locations where the feeders would be placed.
# The relocation still works in the sense that only resolved rivers receive new sources.
new_sources_coor = np.r_[mandatory_sources_coor[:, :2], feeder_bases[:, :2]]
else:
new_sources_coor = np.r_[mandatory_sources_coor[:, :2], feeder_heads[:, :2]]
# These are the locations used to search for the closest old source
# ; in other words, to link the new source to the old source.
# For the mandatory sources, these may be different from the new source locations
@@ -426,7 +435,7 @@ def relocate_sources(
which is useful for the operational forecast to save time,
in this case no need to provide the arguments below;
otherwise, the relocation map will be generated.
:no_feeder: if true, feeder_bases (which should still be present in a mesh without feeders)
will be used; otherwise, feeder_heads will be used.
:allow_neglection: Allow neglecting channels that cannot be matched to a new source location.
This is useful when the relocation is restricted to a specific region.