diff --git a/.gitignore b/.gitignore
index a271a97..7564616 100644
--- a/.gitignore
+++ b/.gitignore
@@ -1,2 +1,3 @@
 *~
 .vs
+/.DS_Store
diff --git a/docs/.gitignore b/docs/.gitignore
index e35d885..a1516d0 100644
--- a/docs/.gitignore
+++ b/docs/.gitignore
@@ -1 +1,2 @@
 _build
+/.DS_Store
diff --git a/docs/Users_Guide/configuration.rst b/docs/Users_Guide/configuration.rst
index d7022e6..95a0225 100644
--- a/docs/Users_Guide/configuration.rst
+++ b/docs/Users_Guide/configuration.rst
@@ -64,7 +64,7 @@ Use this for the vars_io.txt file for the Hurricane Matthew case, from the Github
 https://github.com/NCAR/i-wrf/blob/main/use_cases/Hurricane_Matthew/WRF/vars_io.txt

 ^^^^^^^^^^^^^^^^^^^
-METPlus Config File
+METplus Config File
 ^^^^^^^^^^^^^^^^^^^

 For the METplus configuration file for the Hurricane Matthew case, please use this file on the Github repository:
diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst
index 8f0ae79..58e868b 100644
--- a/docs/Users_Guide/matthewjetstream.rst
+++ b/docs/Users_Guide/matthewjetstream.rst
@@ -8,21 +8,24 @@ Running I-WRF On Jetstream2 with Hurricane Matthew Data
 Overview
 ========

-The following instructions can be used to run
-the `I-WRF weather simulation program `_
+The following instructions can be used to run elements of
+the `I-WRF weather simulation framework `_
 from the `National Center for Atmospheric Research (NCAR) `_
+and the `Cornell Center for Advanced Computing `_.
+The steps below run the `Weather Research & Forecasting (WRF) `_ model
+and the `METplus `_ verification framework
 with data from `Hurricane Matthew `_
 on the `Jetstream2 cloud computing platform `_.
 This exercise provides an introduction to using cloud computing platforms,
-running computationally complex simulations and using containerized applications.
+running computationally complex simulations and analyses, and using containerized applications.

-Simulations like I-WRF often require greater computing resources
+Simulations like WRF often require greater computing resources
 than you may have on your personal computer,
 but a cloud computing platform can provide the needed computational power.
 Jetstream2 is a national cyberinfrastructure resource
 that is easy to use and is available to researchers and educators.
-This exercise runs the I-WRF program as a Docker "container",
-which simplifies the set-up work needed to run the simulation.
+This exercise runs the I-WRF programs as Docker "containers",
+which simplifies the set-up work needed to run the simulation and verification.

 It is recommended that you follow the instructions in each section
 in the order presented to avoid encountering issues during the process.
@@ -76,7 +79,7 @@ Create a Cloud Instance and Log In
 ==================================

 After you have logged in to Jetstream2 and added your allocation to your account,
-you are ready to create the cloud instance where you will run the I-WRF simulation.
+you are ready to create the cloud instance where you will run the simulation and verification.
 If you are not familiar with the cloud computing terms "image" and "instance",
 it is recommended that you `read about them `__ before proceeding.
@@ -123,10 +126,10 @@ In either case you will need to know the location and name of the private SSH key,
 the IP address of your instance (found in the Exosphere web dashboard)
 and the default username on your instance, which is "exouser".
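+For example, if your key file were ``~/.ssh/exokey`` (the key name and the address below
+are placeholders, not values from this exercise; substitute your own), the login command
+would have this form::
+
+    ssh -i ~/.ssh/exokey exouser@<instance-ip-address>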
-Once you are logged in to the web shell you can proceed to the
-"Install Software and Download Data" section below.
+Once you are logged in to the instance you can proceed to the
+"Preparing the Environment" section below.
 You will know that your login has been successful when the prompt has the form ``exouser@instance-name:~$``,
-which indicates your username, the instance name, and your current working directory, followed by "$"
+which indicates your username, the instance name, and your current working directory, followed by "$".

 Managing a Jetstream2 Instance
 ------------------------------
@@ -150,34 +153,81 @@ Increasing the number of CPUs (say, to flavor "m3.8") can make your computations
 But of course, doubling the number of CPUs doubles the cost per hour to run the instance,
 so Shelving as soon as you are done becomes even more important!

-Install Software and Download Data
-==================================
+Preparing the Environment
+=========================

-With your instance created and running and you logged in to it through a Web Shell,
-you can now install the necessary software and download the data to run the simulation.
+With your instance created and running, and you logged in to it through SSH,
+you can now create the run folders, install the Docker software, and download the data to run the simulation and verification.
 You will only need to perform these steps once,
 as they essentially change the contents of the instance's disk
 and those changes will remain even after the instance is shelved and unshelved.

-The following sections instruct you to issue numerous Linux commands in your web shell.
+The following sections instruct you to issue numerous Linux commands in your shell.
 If you are not familiar with Linux, you may want to refer to
 `An Introduction to Linux `_ when working through these steps.
 The commands in each section can be copied using the button in the upper right corner
-and then pasted into your web shell by right-clicking.
+and then pasted into your shell by right-clicking.

-If your web shell ever becomes unresponsive or disconnected from the instance,
+If your shell ever becomes unresponsive or disconnected from the instance,
 you can recover from that situation by rebooting the instance.
 In the Exosphere dashboard page for your instance, in the Actions menu, select "Reboot".
 The process takes several minutes, after which the instance status will return to "Ready".

-Install Docker and Get the I-WRF Image
+Define Environment Variables
+----------------------------
+
+We will be using some environment variables throughout this exercise to
+make sure that we refer to the same resource names and file paths wherever they are used.
+Copy and paste the definitions below into your shell to define the variables before proceeding::
+
+    WRF_IMAGE=ncar/iwrf:latest
+    METPLUS_IMAGE=dtcenter/metplus-dev:develop
+    WORKING_DIR=/home/exouser
+    WRF_DIR=${WORKING_DIR}/wrf/20161006_00
+    METPLUS_DIR=${WORKING_DIR}/metplus
+    WRF_CONFIG_DIR=${WORKING_DIR}/i-wrf/use_cases/Hurricane_Matthew/WRF
+    METPLUS_CONFIG_DIR=${WORKING_DIR}/i-wrf/use_cases/Hurricane_Matthew/METplus
+    OBS_DATA_VOL=data-matthew-input-obs
+
+Any time you open a new shell on your instance, you will need to perform this action
+to redefine the variables before executing the commands that follow.
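+If you want to confirm that the variables are defined in your current shell,
+you can print one of them; for example::
+
+    echo ${WRF_DIR}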
+
+Create the WRF and METplus Run Folders
 --------------------------------------

-As mentioned above, the I-WRF simulation application is provided as a Docker image that will run as a
+The simulation is performed using a script that expects to run in a folder where it can create result files.
+The first command below creates a folder (named "wrf") under the user's home directory,
+and a subfolder within "wrf" to hold the output of this simulation.
+The subfolder is named "20161006_00", which is the beginning date and time of the simulation.
+Similarly, a run folder named "metplus" must be created for the METplus process to use::
+
+    mkdir -p ${WRF_DIR}
+    mkdir -p ${METPLUS_DIR}
+
+Download Configuration Files
+----------------------------
+
+Both WRF and METplus require some configuration files to direct their behavior,
+and those are downloaded from the I-WRF GitHub repository.
+Some of those configuration files are then copied into the run folders.
+These commands perform the necessary operations::
+
+    git clone https://github.com/NCAR/i-wrf ${WORKING_DIR}/i-wrf
+    cp ${WRF_CONFIG_DIR}/namelist.* ${WRF_DIR}
+    cp ${WRF_CONFIG_DIR}/vars_io.txt ${WRF_DIR}
+    cp ${WRF_CONFIG_DIR}/run.sh ${WRF_DIR}
+
+Install Docker and Pull Docker Objects
+======================================
+
+Install Docker
+--------------
+
+As mentioned above, the WRF and METplus software are provided as Docker images that will run as a
 `"container" `_ on your cloud instance.
 To run a Docker container, you must first install the Docker Engine on your instance.
-You can then "pull" (download) the I-WRF image that will be run as a container.
+You can then "pull" (download) the WRF and METplus images that will be run as containers.

 The `instructions for installing Docker Engine on Ubuntu `_
 are very thorough and make a good reference, but we only need to perform a subset of those steps.
@@ -186,74 +236,101 @@ then installs Docker::

     curl --location https://bit.ly/3R3lqMU > install-docker.sh
     source install-docker.sh
+    rm install-docker.sh

 If a text dialog is displayed asking which services should be restarted, press ``Enter``.
 When the installation is complete, you can verify that the Docker command line tool works by asking for its version::

     docker --version

-Next, you must start the Docker daemon, which runs in the background and processes commands::
+The Docker daemon should start automatically, but it sometimes runs into issues.
+First, check to see if the daemon started successfully::

-    sudo service docker start
+    sudo systemctl --no-pager status docker

-If that command appeared to succeed, you can confirm its status with this command::
+If you see a message saying the daemon failed to start with "Start request repeated too quickly",
+wait a few minutes and issue this command to try again to start it::

-    sudo systemctl --no-pager status docker
+    sudo systemctl start docker
+
+If the command seems to succeed, confirm that the daemon is running using the status command above.
+Repeat these steps as necessary until it is running.
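+As an optional check, you can confirm that the daemon is able to run containers by
+launching Docker's small test image (if this reports a permissions problem, prefix
+the command with ``sudo``)::
+
+    docker run --rm hello-world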
-Once all of that is in order, you must pull the latest version of the I-WRF image onto your instance::
+Get the WRF and METplus Docker Images and the Observed Weather Data
+-------------------------------------------------------------------

-    docker pull ncar/iwrf
+Once Docker is running, you must pull the correct versions of the WRF and METplus images onto your instance::

-Get the Geographic Data
------------------------
+    docker pull ${WRF_IMAGE}
+    docker pull ${METPLUS_IMAGE}

-To run I-WRF on the Hurricane Matthew data set, you need a copy of the
-geographic data representing the terrain in the area of the simulation.
-These commands download an archive file containing that data,
-uncompress the archive into a folder named "WPS_GEOG", and delete the archive file.
-They take several minutes to complete::
+METplus is run to perform verification of the results of the WRF simulation using
+observations gathered during Hurricane Matthew.
+We download that data by pulling a Docker volume that holds it,
+and then referencing that volume when we run the METplus Docker container.
+The commands to pull and create the volume are::
+
+    docker pull ncar/iwrf:${OBS_DATA_VOL}.docker
+    docker create --name ${OBS_DATA_VOL} ncar/iwrf:${OBS_DATA_VOL}.docker
+
+Download Data for WRF
+=====================
+
+To run WRF on the Hurricane Matthew data set, you need
+several supporting data sets for the computation.
+The commands in these sections download archive files containing that data,
+then uncompress the archives into folders.
+The geographic data is large and takes several minutes to acquire,
+while the other two data sets are smaller and are downloaded directly into the WRF run folder,
+rather than the user's home directory.
+
+Get the geographic data representing the terrain in the area of the simulation::
+
+    cd ${WORKING_DIR}
     wget https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_high_res_mandatory.tar.gz
     tar -xzf geog_high_res_mandatory.tar.gz
     rm geog_high_res_mandatory.tar.gz

-Create the Run Folder
----------------------
+Get the case study data (GRIB2 files)::

-The simulation is performed using a script that must first be downloaded.
-The script expects to run in a folder where it can download data files and create result files.
-The instructions in this exercise create that folder in the user's home directory and name it "matthew".
-The simulation script is called "run.sh".
-The following commands create the empty folder and download the script into it,
-then change its permissions so it can be run::
+    cd ${WRF_DIR}
+    wget https://www2.mmm.ucar.edu/wrf/TUTORIAL_DATA/matthew_1deg.tar.gz
+    tar -xzf matthew_1deg.tar.gz
+    rm -f matthew_1deg.tar.gz

-    mkdir matthew
-    curl --location https://bit.ly/3KoBtRK > matthew/run.sh
-    chmod 775 matthew/run.sh
+Get the SST (Sea Surface Temperature) data::
+
+    cd ${WRF_DIR}
+    wget https://www2.mmm.ucar.edu/wrf/TUTORIAL_DATA/matthew_sst.tar.gz
+    tar -xzf matthew_sst.tar.gz
+    rm -f matthew_sst.tar.gz

-Run I-WRF
-=========
+Run WRF
+=======

 With everything in place, you are now ready to run the Docker container that will perform the simulation.
 The downloaded script runs inside the container, prints lots of status information,
 and creates output files in the run folder you created.
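+Before launching the container, you can verify that the run folder now holds the
+configuration files and input data downloaded above (the exact listing may vary)::
+
+    ls ${WRF_DIR}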
-Execute this command to run the simulation in your web shell::
+Execute this command to run the simulation in your shell::

-    time docker run --shm-size 14G -it -v ~/:/home/wrfuser/terrestrial_data -v ~/matthew:/tmp/hurricane_matthew ncar/iwrf:latest /tmp/hurricane_matthew/run.sh
+    docker run --shm-size 14G -it \
+        -v ${WORKING_DIR}:/home/wrfuser/terrestrial_data \
+        -v ${WRF_DIR}:/tmp/hurricane_matthew \
+        ${WRF_IMAGE} /tmp/hurricane_matthew/run.sh

 The command has numerous arguments and options, which do the following:

-* ``time docker run`` prints the runtime of the "docker run" command.
+* ``docker run`` creates the container if needed and then runs it.
 * ``--shm-size 14G -it`` tells the command how much shared memory to use, and to run interactively in the shell.
 * The ``-v`` options map folders in your cloud instance to paths within the container.
-* ``ncar/iwrf:latest`` is the Docker image to use when creating the container.
+* ``${WRF_IMAGE}`` is the Docker image to use when creating the container.
 * ``/tmp/hurricane_matthew/run.sh`` is the location within the container of the script that it runs.

 The simulation initially prints lots of information while initializing things, then settles in to the computation.
-The provided configuration simulates 12 hours of weather and takes under three minutes to finish on an m3.quad Jetstream2 instance.
-Once completed, you can view the end of any of the output files to confirm that it succeeded::
+The provided configuration simulates 48 hours of weather and takes about 12 minutes to finish on an m3.quad Jetstream2 instance.
+Once completed, you can view the end of an output file to confirm that it succeeded::

-    tail matthew/rsl.out.0000
+    tail ${WRF_DIR}/rsl.out.0000

 The output should look something like this::

@@ -268,3 +345,32 @@ The output should look something like this::
     Timing for Writing wrfout_d01_2016-10-06_12:00:00 for domain 1: 0.32534 elapsed seconds
     d01 2016-10-06_12:00:00 wrf: SUCCESS COMPLETE WRF

+Run METplus
+===========
+
+After the WRF simulation has finished, you can run the METplus verification to compare the simulated results
+to the actual weather observations during the hurricane.
+The verification takes about five minutes to complete.
+We use command line options to tell the METplus container several things, including where the observed data is located,
+where the METplus configuration can be found, where the WRF output data is located, and where it should create its output files::
+
+    docker run --rm -it \
+        --volumes-from ${OBS_DATA_VOL} \
+        -v ${METPLUS_CONFIG_DIR}:/config \
+        -v ${WORKING_DIR}/wrf:/data/input/wrf \
+        -v ${METPLUS_DIR}:/data/output ${METPLUS_IMAGE} \
+        /metplus/METplus/ush/run_metplus.py /config/PointStat_matthew.conf
+
+Progress information is displayed while the verification is performed.
+**WARNING** log messages are expected because observation files are not available for every valid time and METplus is
+configured to allow some missing inputs. An **ERROR** log message indicates that something went wrong.
+METplus first uses the MADIS2NC wrapper to convert the observation data files to a format that the MET tools can read.
+Point-Stat is then run to generate statistics comparing METAR observations to surface-level model fields and
+RAOB observations to "upper air" fields.
+METplus will print its completion status when the processing finishes.
+
+The results of the METplus verification can be found in ``${WORKING_DIR}/metplus/point_stat``.
+These files contain tabular output that can be viewed in a text editor; turn off word wrapping for better viewing.
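+For example, you can list the result files and page through one of them with line
+wrapping disabled (``less -S``; press ``q`` to quit; substitute a file name from the listing)::
+
+    ls ${METPLUS_DIR}/point_stat
+    less -S ${METPLUS_DIR}/point_stat/<file-name-from-listing>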
+Refer to the MET User's Guide for more information about the
+`Point-Stat output `_.
+In the near future, this exercise will be extended to include instructions for visualizing the results.
diff --git a/use_cases/Hurricane_Matthew/WRF/namelist.input b/use_cases/Hurricane_Matthew/WRF/namelist.input
index 598ccda..9c544a7 100644
--- a/use_cases/Hurricane_Matthew/WRF/namelist.input
+++ b/use_cases/Hurricane_Matthew/WRF/namelist.input
@@ -3,24 +3,24 @@
     run_hours = 48,
     run_minutes = 0,
     run_seconds = 0,
-    start_year = 2016, 2016,
-    start_month = 10, 10,
-    start_day = 06, 06,
-    start_hour = 00, 00,
-    end_year = 2016, 2016,
-    end_month = 10, 10,
-    end_day = 08, 08,
-    end_hour = 00, 00,
-    interval_seconds = 21600,
-    input_from_file = .true.,.true.,
-    history_interval = 180, 180,
-    frames_per_outfile = 1, 1,
+    start_year = 2016,
+    start_month = 10,
+    start_day = 06,
+    start_hour = 00,
+    end_year = 2016,
+    end_month = 10,
+    end_day = 08,
+    end_hour = 00,
+    interval_seconds = 21600
+    input_from_file = .true.,
+    history_interval = 180,
+    frames_per_outfile = 1,
     restart = .false.,
     restart_interval = 1440,
-    io_form_history = 2,
-    io_form_restart = 2,
-    io_form_input = 2,
-    io_form_boundary = 2,
+    io_form_history = 2
+    io_form_restart = 2
+    io_form_input = 2
+    io_form_boundary = 2
     iofields_filename = "vars_io.txt", "vars_io.txt",
     auxhist22_outname = "wrfout_zlev_d_",
     auxhist22_interval = 180, 180,
@@ -33,16 +33,16 @@
 /

 &domains
-    time_step = 90,
+    time_step = 150,
     time_step_fract_num = 0,
     time_step_fract_den = 1,
     max_dom = 1,
-    e_we = 91, 220,
-    e_sn = 100, 214,
-    e_vert = 45, 45,
+    e_we = 91,
+    e_sn = 100,
+    e_vert = 45,
     dzstretch_s = 1.1
     p_top_requested = 5000,
-    num_metgrid_levels = 32,
+    num_metgrid_levels = 32
     num_metgrid_soil_levels = 4,
     dx = 27000,
     dy = 27000,
@@ -105,17 +105,17 @@
 /

 &namelist_quilt
-    nio_tasks_per_group          = 0,
-    nio_groups                   = 1,
+    nio_tasks_per_group = 0,
+    nio_groups = 1,
 /

 &diags
-    z_lev_diags                  = 1,
-    num_z_levels                 = 6,
-    z_levels                     = -80,-100,-200,-300,-400,-500,
-    p_lev_diags                  = 1,
-    num_press_levels             = 10,
-    press_levels                 = 92500,85000,70000,50000,40000,30000,25000,20000,15000,10000,
-    use_tot_or_hyd_p             = 1,
-    solar_diagnostics            = 0,
+    z_lev_diags = 1,
+    num_z_levels = 6,
+    z_levels = -80,-100,-200,-300,-400,-500
+    p_lev_diags = 1,
+    num_press_levels = 10,
+    press_levels = 92500,85000,70000,50000,40000,30000,25000,20000,15000,10000
+    use_tot_or_hyd_p = 1,
+    solar_diagnostics = 0,
 /
diff --git a/use_cases/Hurricane_Matthew/WRF/namelist.wps b/use_cases/Hurricane_Matthew/WRF/namelist.wps
new file mode 100644
index 0000000..f3408a6
--- /dev/null
+++ b/use_cases/Hurricane_Matthew/WRF/namelist.wps
@@ -0,0 +1,35 @@
+&share
+    wrf_core = 'ARW',
+    max_dom = 1,
+    start_date = '2016-10-06_00:00:00',
+    end_date = '2016-10-08_00:00:00',
+    interval_seconds = 21600
+/
+
+&geogrid
+    parent_id = 1,
+    parent_grid_ratio = 1,
+    i_parent_start = 1,
+    j_parent_start = 1,
+    e_we = 91,
+    e_sn = 100,
+    geog_data_res = 'default',
+    dx = 27000,
+    dy = 27000,
+    map_proj = 'mercator',
+    ref_lat = 28.00,
+    ref_lon = -75.00,
+    truelat1 = 30.0,
+    truelat2 = 60.0,
+    stand_lon = -75.0,
+    geog_data_path = '/home/wrfuser/terrestrial_data/WPS_GEOG'
+/
+
+&ungrib
+    out_format = 'WPS',
+    prefix = 'FILE',
+/
+
+&metgrid
+    fg_name = 'FILE'
+/
diff --git a/use_cases/Hurricane_Matthew/WRF/run.sh b/use_cases/Hurricane_Matthew/WRF/run.sh
new file mode 100755
index 0000000..0e98468
--- /dev/null
+++ b/use_cases/Hurricane_Matthew/WRF/run.sh
@@ -0,0 +1,60 @@
+#!/bin/bash
+
+# script adapted from instructions at https://www2.mmm.ucar.edu/wrf/OnLineTutorial/CASES/SingleDomain/ungrib.php
+# docker run -it -v /home/hahn/git:/home/wrfuser/git -v /mnt/storage/terrestrial_data:/home/wrfuser/terrestrial_data iwrf:latest /bin/bash
+
+source /etc/bashrc
+
+CYCLE_DIR="/tmp/hurricane_matthew"
+WPS_DIR="/home/wrfuser/WPS"
+WRF_DIR="/home/wrfuser/WRF"
+
+# Run the full WPS/WRF chain in the cycle folder: preprocessing
+# (ungrib, geogrid, metgrid), then real.exe and wrf.exe.
+function main
+{
+    mkdir -p "${CYCLE_DIR}"
+    cd "${CYCLE_DIR}"
+    link_gfs_vtable
+    run_ungrib
+    run_geogrid
+    run_metgrid
+    run_real
+    run_wrf
+}
+
+# Point ungrib at the GFS variable table and link the GRIB2 inputs.
+function link_gfs_vtable
+{
+    ln -sf "${WPS_DIR}/ungrib/Variable_Tables/Vtable.GFS" Vtable
+    ${WPS_DIR}/link_grib.csh "${CYCLE_DIR}/matthew/*.grib2"
+}
+
+# Extract the required meteorological fields from the GRIB2 files.
+function run_ungrib
+{
+    ln -s "${WPS_DIR}/ungrib.exe" . 2>/dev/null
+    ./ungrib.exe
+}
+
+# Interpolate the static geographic data onto the model domain.
+function run_geogrid
+{
+    ln -s "${WPS_DIR}"/* . 2>/dev/null
+    ./geogrid.exe
+}
+
+# Horizontally interpolate the extracted fields onto the domain.
+function run_metgrid
+{
+    ./metgrid.exe
+}
+
+# Generate initial and boundary conditions for the real-data case.
+function run_real
+{
+    ln -s "${WRF_DIR}"/test/em_real/* . 2>/dev/null
+    ./real.exe
+}
+
+# Run the simulation itself under MPI.
+function run_wrf
+{
+    ulimit -s unlimited
+    ln -s "${WRF_DIR}"/test/em_real/* . 2>/dev/null
+    mpirun ./wrf.exe
+}
+
+main