Update develop-ref after #2150 and #2154. (#2158)
Co-authored-by: Julie Prestopnik <[email protected]>
Co-authored-by: johnhg <[email protected]>
Co-authored-by: Seth Linden <[email protected]>
Co-authored-by: John Halley Gotway <[email protected]>
Co-authored-by: j-opatz <[email protected]>
Co-authored-by: Howard Soh <[email protected]>
Co-authored-by: John Halley Gotway <[email protected]>
Co-authored-by: jprestop <[email protected]>
Co-authored-by: Howard Soh <[email protected]>
Co-authored-by: Randy Bullock <[email protected]>
Co-authored-by: davidfillmore <[email protected]>
Co-authored-by: rgbullock <[email protected]>
Co-authored-by: Seth Linden <[email protected]>
Co-authored-by: George McCabe <[email protected]>
Co-authored-by: Seth Linden <[email protected]>
Co-authored-by: hsoh-u <[email protected]>
Co-authored-by: John Halley Gotway <[email protected]>
Co-authored-by: MET Tools Test Account <[email protected]>
Co-authored-by: mo-mglover <[email protected]>
Co-authored-by: davidalbo <[email protected]>
Co-authored-by: lisagoodrich <[email protected]>
Co-authored-by: Dan Adriaansen <[email protected]>
Co-authored-by: Molly Smith <[email protected]>
22 people authored May 10, 2022
1 parent 8fdc043 commit 7ee599b
Showing 18 changed files with 244 additions and 50 deletions.
2 changes: 2 additions & 0 deletions README.md
@@ -2,6 +2,8 @@ Model Evaluation Tools (MET) Repository
=======================================

<!-- Start of Badges -->
[![Tests](https://github.com/DTCenter/MET/actions/workflows/testing.yml/badge.svg?event=push)](https://github.com/DTCenter/MET/actions/workflows/testing.yml)
[![Docs](https://img.shields.io/badge/Documentation-latest-brightgreen.svg)](https://met.readthedocs.io)
[![DOI](https://zenodo.org/badge/DOI/10.5281/zenodo.5565322.svg)](https://doi.org/10.5281/zenodo.5565322)

This repository contains the source code for the Model Evaluation Tools package (met), unit test code (test), and scripts used to build and test the code (scripts).
26 changes: 14 additions & 12 deletions met/docs/Users_Guide/masking.rst
@@ -46,7 +46,7 @@ Required arguments for gen_vx_mask

2. The **mask_file** argument defines the masking information, see below.

• For "poly", "box", "circle", and "track" masking, specify an ASCII Lat/Lon file.
• For "poly", "poly_xy", "box", "circle", and "track" masking, specify an ASCII Lat/Lon file.

• For "grid" and "data" masking, specify a gridded data file.

@@ -81,7 +81,7 @@ Optional arguments for gen_vx_mask

• For "lat" and "lon" masking, threshold the latitude and longitude values.

10. The **-height n** and **-width n** options set the size in grid units for "box"masking.
10. The **-height n** and **-width n** options set the size in grid units for "box" masking.

11. The **-shapeno n** option is only used for shapefile masking. (See description of shapefile masking below).

@@ -97,25 +97,27 @@ Optional arguments for gen_vx_mask

The Gen-Vx-Mask tool supports the following types of masking region definition selected using the **-type** command line option:

1. Polyline (**poly**) masking reads an input ASCII file containing Lat/Lon locations, connects the first and last points, and selects grid points falling inside that polyline. This option is useful when defining geographic subregions of a domain.
1. Polyline (**poly**) masking reads an input ASCII file containing Lat/Lon locations, connects the first and last points, and selects grid points whose Lat/Lon location falls inside that polyline in Lat/Lon space. This option is useful when defining geographic subregions of a domain.

2. Box (**box**) masking reads an input ASCII file containing Lat/Lon locations and draws a box around each point. The height and width of the box is specified by the **-height** and **-width** command line options in grid units. For a square, only one of **-height** or **-width** needs to be used.
2. Polyline XY (**poly_xy**) masking reads an input ASCII file containing Lat/Lon locations. It converts the polyline Lat/Lon locations into grid X/Y space and connects the first and last points. It selects grid points whose X/Y location falls inside that polyline in X/Y space. This option is useful when defining geographic subregions of a domain.

3. Circle (**circle**) masking reads an input ASCII file containing Lat/Lon locations and for each grid point, computes the minimum great-circle arc distance in kilometers to those points. If the **-thresh** command line option is not used, the minimum distance value for each grid point will be written to the output. If it is used, only those grid points whose minimum distance meets the threshold criteria will be selected. This option is useful when defining areas within a certain radius of radar locations.
3. Box (**box**) masking reads an input ASCII file containing Lat/Lon locations and draws a box around each point. The height and width of the box is specified by the **-height** and **-width** command line options in grid units. For a square, only one of **-height** or **-width** needs to be used.

4. Track (**track**) masking reads an input ASCII file containing Lat/Lon locations and for each grid point, computes the minimum great-circle arc distance in kilometers to the track defined by those points. The first and last track points are not connected. As with **circle** masking the output for each grid point depends on the use of the **-thresh** command line option. This option is useful when defining the area within a certain distance of a hurricane track.
4. Circle (**circle**) masking reads an input ASCII file containing Lat/Lon locations and for each grid point, computes the minimum great-circle arc distance in kilometers to those points. If the **-thresh** command line option is not used, the minimum distance value for each grid point will be written to the output. If it is used, only those grid points whose minimum distance meets the threshold criteria will be selected. This option is useful when defining areas within a certain radius of radar locations.

5. Grid (**grid**) masking reads an input gridded data file, extracts the field specified using its grid definition, and selects grid points falling inside that grid. This option is useful when using a model nest to define the corresponding area of the parent domain.
5. Track (**track**) masking reads an input ASCII file containing Lat/Lon locations and for each grid point, computes the minimum great-circle arc distance in kilometers to the track defined by those points. The first and last track points are not connected. As with **circle** masking the output for each grid point depends on the use of the **-thresh** command line option. This option is useful when defining the area within a certain distance of a hurricane track.

6. Data (**data**) masking reads an input gridded data file, extracts the field specified using the **-mask_field** command line option, thresholds the data using the **-thresh** command line option, and selects grid points which meet that threshold criteria. This option is useful when thresholding topography to define a mask based on elevation or when thresholding land use to extract a particular category.
6. Grid (**grid**) masking reads an input gridded data file, extracts the field specified using its grid definition, and selects grid points falling inside that grid. This option is useful when using a model nest to define the corresponding area of the parent domain.

7. Solar altitude (**solar_alt**) and solar azimuth (**solar_azi**) masking computes the solar altitude and azimuth values at each grid point for the time defined by the **mask_file** setting. **mask_file** may either be set to an explicit time string in YYYYMMDD[_HH[MMSS]] format or to a gridded data file. If set to a gridded data file, the **-mask_field** command line option specifies the field of data whose valid time should be used. If the **-thresh** command line option is not used, the raw solar altitude or azimuth value for each grid point will be written to the output. If it is used, the resulting binary mask field will be written. This option is useful when defining a day/night mask.
7. Data (**data**) masking reads an input gridded data file, extracts the field specified using the **-mask_field** command line option, thresholds the data using the **-thresh** command line option, and selects grid points which meet that threshold criteria. This option is useful when thresholding topography to define a mask based on elevation or when thresholding land use to extract a particular category.

8. Latitude (**lat**) and longitude (**lon**) masking computes the latitude and longitude value at each grid point. This logic only requires the definition of the grid, specified by the **input_file**. Technically, the **mask_file** is not needed, but a value must be specified for the command line to parse correctly. Users are advised to simply repeat the **input_file** setting twice. If the **-thresh** command line option is not used, the raw latitude or longitude values for each grid point will be written to the output. This option is useful when defining latitude or longitude bands over which to compute statistics.
8. Solar altitude (**solar_alt**) and solar azimuth (**solar_azi**) masking computes the solar altitude and azimuth values at each grid point for the time defined by the **mask_file** setting. **mask_file** may either be set to an explicit time string in YYYYMMDD[_HH[MMSS]] format or to a gridded data file. If set to a gridded data file, the **-mask_field** command line option specifies the field of data whose valid time should be used. If the **-thresh** command line option is not used, the raw solar altitude or azimuth value for each grid point will be written to the output. If it is used, the resulting binary mask field will be written. This option is useful when defining a day/night mask.

9. Shapefile (**shape**) masking uses a closed polygon taken from an ESRI shapefile to define the masking region. Gen-Vx-Mask reads the shapefile with the ".shp" suffix and extracts the latitude and longitudes of the vertices. The other types of shapefiles (index file, suffix ".shx", and dBASE file, suffix ".dbf") are not currently used. The shapefile must consist of closed polygons rather than polylines, points, or any of the other data types that shapefiles support. Shapefiles usually contain more than one polygon, and the **-shape n** command line option enables the user to select one polygon from the shapefile. The integer **n** tells which shape number to use from the shapefile. Note that this value is zero-based, so that the first polygon in the shapefile is polygon number 0, the second polygon in the shapefile is polygon number 1, etc. For the user's convenience, some utilities that perform human-readable screen dumps of shapefile contents are provided. The gis_dump_shp, gis_dump_shx and gis_dump_dbf tools enable the user to examine the contents of her shapefiles. As an example, if the user knows the name of the particular polygon but not the number of the polygon in the shapefile, the user can use the gis_dump_dbf utility to examine the names of the polygons in the shapefile. The information written to the screen will display the corresponding polygon number.
9. Latitude (**lat**) and longitude (**lon**) masking computes the latitude and longitude value at each grid point. This logic only requires the definition of the grid, specified by the **input_file**. Technically, the **mask_file** is not needed, but a value must be specified for the command line to parse correctly. Users are advised to simply repeat the **input_file** setting twice. If the **-thresh** command line option is not used, the raw latitude or longitude values for each grid point will be written to the output. This option is useful when defining latitude or longitude bands over which to compute statistics.

The polyline, box, circle, and track masking methods all read an ASCII file containing Lat/Lon locations. Those files must contain a string, which defines the name of the masking region, followed by a series of whitespace-separated latitude (degrees north) and longitude (degrees east) values.
10. Shapefile (**shape**) masking uses a closed polygon taken from an ESRI shapefile to define the masking region. Gen-Vx-Mask reads the shapefile with the ".shp" suffix and extracts the latitude and longitudes of the vertices. The other types of shapefiles (index file, suffix ".shx", and dBASE file, suffix ".dbf") are not currently used. The shapefile must consist of closed polygons rather than polylines, points, or any of the other data types that shapefiles support. Shapefiles usually contain more than one polygon, and the **-shape n** command line option enables the user to select one polygon from the shapefile. The integer **n** tells which shape number to use from the shapefile. Note that this value is zero-based, so that the first polygon in the shapefile is polygon number 0, the second polygon in the shapefile is polygon number 1, etc. For the user's convenience, some utilities that perform human-readable screen dumps of shapefile contents are provided. The gis_dump_shp, gis_dump_shx and gis_dump_dbf tools enable the user to examine the contents of her shapefiles. As an example, if the user knows the name of the particular polygon but not the number of the polygon in the shapefile, the user can use the gis_dump_dbf utility to examine the names of the polygons in the shapefile. The information written to the screen will display the corresponding polygon number.

The polyline, polyline XY, box, circle, and track masking methods all read an ASCII file containing Lat/Lon locations. Those files must contain a string, which defines the name of the masking region, followed by a series of whitespace-separated latitude (degrees north) and longitude (degrees east) values.
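
For illustration, a hypothetical Lat/Lon masking file following this format (the region name, coordinates, and file names below are invented) and a corresponding poly-type Gen-Vx-Mask command might look like:

.. code-block:: none

  COLORADO_SUB
  41.0 -109.0
  41.0 -102.0
  37.0 -102.0
  37.0 -109.0

.. code-block:: none

  gen_vx_mask sample_fcst.grb COLORADO_SUB.poly COLORADO_SUB_mask.nc -type poly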

The Gen-Vx-Mask tool performs three main steps, described below.

8 changes: 8 additions & 0 deletions met/docs/Users_Guide/reformat_grid.rst
@@ -161,6 +161,14 @@ The Pcp-Combine tool will search for 24 files containing 1-hourly accumulation i
This command would grab the first level of the TT variable from a pinterp NetCDF file and write it to the output tt_10.nc file.

**Example 4:**

.. code-block:: none

  pcp_combine -subtract 2022043018_48.grib2 'name="APCP"; level="A48";' 2022043018_36.grib2 'name="APCP"; level="A36";' sample_fcst.nc

The Pcp-Combine tool will subtract the 36 hour precipitation accumulations in the file 2022043018_36.grib2 (a 36 hour forecast initialized at 2022-04-30 18Z) from the 48 hour accumulations in the file 2022043018_48.grib2 (a 48 hour forecast from the same model cycle). This produces the 12 hour accumulation amounts for the period between the 36 and 48 hour forecasts and writes a single NetCDF file containing that 12 hours of accumulation.
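
As a quick check (assuming the standard NetCDF utilities are available), the header of the resulting file can be listed to confirm that a new 12 hour accumulation field was written; the exact variable name depends on the Pcp-Combine options used.

.. code-block:: none

  ncdump -h sample_fcst.nc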

pcp_combine output
------------------

8 changes: 4 additions & 4 deletions met/docs/Users_Guide/reformat_point.rst
@@ -442,10 +442,10 @@ The default ASCII point observation format consists of one row of data per obser
- Description
* - 1
- Message_Type
- Text string containing the observation message type as described in the previous section on the PB2NC tool.
- Text string containing the observation message type as described in the previous section on the PB2NC tool (max 40 characters).
* - 2
- Station_ID
- Text string containing the station id.
- Text string containing the station id (max 40 characters).
* - 3
- Valid_Time
- Text string containing the observation valid time in YYYYMMDD_HHMMSS format.
@@ -460,7 +460,7 @@ The default ASCII point observation format consists of one row of data per obser
- Elevation in msl of the observing location.
* - 7
- GRIB_Code or Variable_Name
- Integer GRIB code value or variable name corresponding to this observation type.
- Integer GRIB code value or variable name (max 40 characters) corresponding to this observation type.
* - 8
- Level
- Pressure level in hPa or accumulation interval in hours for the observation value.
@@ -469,7 +469,7 @@ The default ASCII point observation format consists of one row of data per obser
- Height in msl or agl of the observation value.
* - 10
- QC_String
- Quality control value.
- Quality control value (max 16 characters).
* - 11
- Observation_Value
- Observation value in units consistent with the GRIB code definition.
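
For illustration only, a hypothetical observation record laid out in these 11 columns (a 2 m temperature report with the value in Kelvin, consistent with the GRIB TMP definition; the station and values are invented) might look like:

.. code-block:: none

  ADPSFC KDEN 20220510_120000 39.85 -104.65 1650 TMP NA 2 NA 298.65
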
9 changes: 4 additions & 5 deletions met/src/basic/vx_util/data_plane_util.cc
@@ -30,7 +30,6 @@ using namespace std;

#include "GridTemplate.h"


////////////////////////////////////////////////////////////////////////
//
// Utility functions operating on a DataPlane
@@ -206,7 +205,7 @@ DataPlane smooth_field(const DataPlane &dp,
////////////////////////////////////////////////////////////////////////

void fractional_coverage(const DataPlane &dp, DataPlane &frac_dp,
int width, const GridTemplateFactory::GridTemplates shape,
int width, GridTemplateFactory::GridTemplates shape,
bool wrap_lon, SingleThresh t,
const DataPlane *cmn, const DataPlane *csd, double vld_t) {
GridPoint *gp = NULL;
@@ -251,9 +250,9 @@ void fractional_coverage(const DataPlane &dp, DataPlane &frac_dp,
}
}

#pragma omp parallel default(none) \
shared(mlog, dp, frac_dp, width, wrap_lon, t) \
shared(use_climo, cmn, csd, vld_t, bad) \
#pragma omp parallel default(none) \
shared(mlog, dp, frac_dp, shape, width, wrap_lon, t) \
shared(use_climo, cmn, csd, vld_t, bad) \
private(x, y, n_vld, n_thr, gp, v)
{

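For context on the data_plane_util.cc change above: when an OpenMP parallel region is declared with default(none), every variable referenced inside it must be given an explicit data-sharing attribute, which is why shape now appears in the shared clause. A minimal standalone sketch of that rule (not MET code; the names are illustrative):

   // With default(none), every variable used inside the parallel region must
   // appear in an explicit data-sharing clause. Removing "shape" from
   // shared(...) below causes most compilers to reject the code.
   #include <cstdio>

   int main() {
      int width = 3;
      int shape = 1;   // stands in for the GridTemplate shape selector
      int count = 0;

      #pragma omp parallel for default(none) \
         shared(width, shape) reduction(+:count)
      for (int i = 0; i < 100; i++) {
         if (i % width == shape) count++;   // reads both width and shape
      }

      printf("count = %d\n", count);
      return 0;
   }
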
2 changes: 1 addition & 1 deletion met/src/basic/vx_util/data_plane_util.h
@@ -51,7 +51,7 @@ extern DataPlane smooth_field(const DataPlane &dp,
bool wrap_lon, double t, const GaussianInfo &gaussian);

extern void fractional_coverage(const DataPlane &dp, DataPlane &frac_dp,
int width, const GridTemplateFactory::GridTemplates shape,
int width, GridTemplateFactory::GridTemplates shape,
bool wrap_lon, SingleThresh t,
const DataPlane *cmn, const DataPlane *csd, double vld_t);

17 changes: 13 additions & 4 deletions met/src/basic/vx_util/thresh_array.cc
@@ -625,7 +625,10 @@ bool check_prob_thresh(const ThreshArray &ta, bool error_out) {
mlog << Error << "\ncheck_prob_thresh() -> "
<< "When verifying a probability field, you must "
<< "select at least 3 thresholds beginning with 0.0 "
<< "and ending with 1.0.\n\n";
<< "and ending with 1.0 (current setting: "
<< ta.get_str() << ").\n"
<< "Consider using the ==p shorthand notation for bins "
<< "of equal width.\n\n";
exit(1);
}
else {
@@ -641,7 +644,10 @@
mlog << Error << "\ncheck_prob_thresh() -> "
<< "When verifying a probability field, all "
<< "thresholds must be greater than or equal to, "
<< "using \"ge\" or \">=\".\n\n";
<< "using \"ge\" or \">=\" (current setting: "
<< ta.get_str() << ").\n"
<< "Consider using the ==p shorthand notation for bins "
<< "of equal width.\n\n";
exit(1);
}
else {
@@ -653,8 +659,11 @@
if(ta[i].get_value() < 0.0 || ta[i].get_value() > 1.0) {
if(error_out) {
mlog << Error << "\ncheck_prob_thresh() -> "
<< "When verifying a probability field, all "
<< "thresholds must be between 0 and 1.\n\n";
<< "When verifying a probability field, all thresholds "
<< "must be between 0 and 1 (current setting: "
<< ta.get_str() << ").\n"
<< "Consider using the ==p shorthand notation for bins "
<< "of equal width.\n\n";
exit(1);
}
else {
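
For context on the thresh_array.cc messages above: probability verification thresholds in MET configuration files are given as an increasing list of greater-than-or-equal-to values running from 0.0 to 1.0, and the ==p shorthand expands a single entry into equally spaced bins. A configuration sketch (cat_thresh is shown as an example option; the expansion follows the ==p description in the MET User's Guide):

   // Explicit list of probability bins
   cat_thresh = [ >=0.0, >=0.25, >=0.50, >=0.75, >=1.0 ];

   // Equivalent ==p shorthand: four bins of width 0.25
   cat_thresh = [ ==0.25 ];
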
22 changes: 20 additions & 2 deletions met/src/libcode/vx_nc_util/nc_utils.cc
@@ -25,8 +25,6 @@ using namespace netCDF::exceptions;

////////////////////////////////////////////////////////////////////////

////////////////////////////////////////////////////////////////////////

void patch_nc_name(string *var_name) {
size_t offset;

@@ -3216,6 +3214,26 @@ NcVar add_var(NcFile *nc, const string &var_name, const NcType ncType,
mlog << Debug(3) << " nc_utils.add_var() deflate_level: " << deflate_level << "\n";
var.setCompression(false, true, deflate_level);
}

// Check for lat and lon dimensions
ConcatString cs;
bool has_lat_dim, has_lon_dim;
vector<NcDim>::const_iterator itDim;
for (itDim = ncDims.begin(), has_lat_dim = has_lon_dim = false;
itDim != ncDims.end(); ++itDim) {
if (itDim->getName() == "lat") has_lat_dim = true;
else if (itDim->getName() == "lon") has_lon_dim = true;
if (itDim != ncDims.begin()) cs << " ";
cs << itDim->getName();
}

// Add the coordinates variable attribute for variables
// with both lat and lon dimensions
if (has_lat_dim && var_name != "lat" &&
has_lon_dim && var_name != "lon") {
add_att(&var, "coordinates", cs.c_str());
}

return var;
}

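For context on the nc_utils.cc change above: output variables defined on both a lat and a lon dimension now receive a "coordinates" attribute listing the variable's dimension names, in the style of the CF conventions. In an ncdump -h listing of a MET NetCDF output file, the effect would look roughly like the following (the variable name is hypothetical):

   float FCST_APCP_12(lat, lon) ;
         FCST_APCP_12:coordinates = "lat lon" ;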