From e340cb700be28ac5fb9a319c2c607faa7f8e3409 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Mon, 20 May 2024 14:36:00 -0400 Subject: [PATCH 01/36] Roll back change to usecases.rst --- docs/Users_Guide/usecases.rst | 3 --- 1 file changed, 3 deletions(-) diff --git a/docs/Users_Guide/usecases.rst b/docs/Users_Guide/usecases.rst index af11bca..dfee7e0 100644 --- a/docs/Users_Guide/usecases.rst +++ b/docs/Users_Guide/usecases.rst @@ -5,9 +5,6 @@ Use Cases Generic CONUS “interesting weather” =================================== -Hurricane Matthew Running on Jetstream2 -=================================== - Land Use/Land Cover Change ========================== From b1ccadaaa3186ab949c4e9b8bea9edb5db4342cf Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Mon, 20 May 2024 18:53:48 -0400 Subject: [PATCH 02/36] Draft of the Jetstream2-Matthew tutorial instructions The instructions still need to be retested and the text vetted by Rich. It would be great if we could have a section at the end to validate or see the results. --- .gitignore | 1 + docs/Users_Guide/matthewjetstream.rst | 210 ++++++++++++++++++++++++++ docs/Users_Guide/usecases.rst | 3 + 3 files changed, 214 insertions(+) create mode 100644 docs/Users_Guide/matthewjetstream.rst diff --git a/.gitignore b/.gitignore index b25c15b..a271a97 100644 --- a/.gitignore +++ b/.gitignore @@ -1 +1,2 @@ *~ +.vs diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst new file mode 100644 index 0000000..b23e7f4 --- /dev/null +++ b/docs/Users_Guide/matthewjetstream.rst @@ -0,0 +1,210 @@ +## Running I-WRF On Jetstream2 with Hurricane Matthew Data + +### Overview + +The following instructions can be used to run +the [I-WRF weather simulation program](https://i-wrf.org/) +with data from [Hurricane Matthew](https://en.wikipedia.org/wiki/Hurricane_Matthew) +on the [Jetstream2 cloud computing platform](https://jetstream-cloud.org/). 
+This exercise provides an introduction to using cloud computing platforms,
+running computationally complex simulations, and using containerized applications.
+
+Simulations like I-WRF often require greater computing resources
+than you may have on your personal computer,
+but a cloud computing platform can provide the needed computational power.
+Jetstream2 is a national cyberinfrastructure resource that is easy to use
+and is available to researchers and educators.
+This example delivers the I-WRF program as a Docker "image",
+simplifying the set-up for running the simulation.
+
+### Prepare to Use Jetstream2
+
+To [get started with Jetstream2](https://jetstream-cloud.org/get-started),
+you will need to:
+
++ Create an account with the [National Science Foundation (NSF)](https://www.nsf.gov/)'s
+[ACCESS program](https://access-ci.org/).
++ Request a computational "allocation" from ACCESS.
++ Log in to Jetstream2's web portal.
+
+The sections below will guide you through this process.
+
+#### Create an ACCESS Account
+
+If you do not already have one, [register for an ACCESS account](https://operations.access-ci.org/identity/new-user).
+Note that you can either use an existing University/Organizational account or
+create an entirely new ACCESS account when registering.
+
+#### Get an Allocation
+
+With your ACCESS account set up, you may [request an allocation](https://allocations.access-ci.org/get-your-first-project)
+that will allow you to use an ACCESS-affiliated cyberinfrastructure resource.
+Be sure to read all of the information on that page so that you make a suitable request.
+An "Explore" project will be sufficient to work with this example,
+and you will want to work with the resource "Indiana Jetstream2 CPU" (*not* GPU).
+The typical turnaround time for allocation requests is one business day.
+ +#### Log in to the Exosphere Web Site + +Once you have an ACCESS account and allocation, +you can log in to their [Exosphere web dashboard](https://jetstream2.exosphere.app/). +The process of identifying your allocation and ACCESS ID to use Jetstream2 +is described on [this page](https://cvw.cac.cornell.edu/jetstream/intro/jetstream-login) of the +[Introduction to Jetstream2](https://cvw.cac.cornell.edu/jetstream) Cornell Virtual Workshop, +and on [this page](https://docs.jetstream-cloud.org/ui/exo/login/) +of the [Jetstream2 documentation](https://docs.jetstream-cloud.org/). + +While adding an allocation to your account, it is recommended that you choose +the "Indiana University" region of Jetstream2 for completing this example. + +### Create a Cloud Instance and Log In + +After you have logged in to Jetstream2 and added your allocation to your account, +you are ready to create the cloud instance where you will run the I-WRF simulation. +If you are not familiar with the cloud computing terms "image" and "instance", +it is recommended that you [read about them](https://cvw.cac.cornell.edu/jetstream/intro/imagesandinstances) +before proceeding. + +#### Create an SSH Key + +If you are not familiar with "SSH key pairs", you should +[read about them](https://cvw.cac.cornell.edu/jetstream/keys/about-keys) before continuing. +A key pair is needed when creating your instance so that you can log in to it, +as password-based log-ins are disabled on Jetstream2. + ++ First, [create an SSH Key on your computer](https://cvw.cac.cornell.edu/jetstream/keys/ssh-create) using the "ssh-keygen" command. ++ Then [upload the key to Jetstream2](https://cvw.cac.cornell.edu/jetstream/keys/ssh-upload) through the Exosphere web interface. + +#### Create an Instance + +The Cornell Virtual Workshop topic [Creating an Instance](https://cvw.cac.cornell.edu/jetstream/create-instance) +provides detailed information about creating a Jetstream2 instance. 
+While following those steps, be sure to make the following choices for this instance:
+
++ Choose the Featured-Ubuntu22 image as the instance source.
++ Choose the "Flavor" m3.quad (4 CPUs) to provide a faster simulation run-time.
++ Select a custom disk size of 100 GB to hold this example's data and results.
++ Select "Yes" for Enable web desktop.
++ Select the SSH public key that you uploaded previously.
++ You do not need to set any of the Advanced Options.
+
+After clicking the "Create" button, wait for the instance to enter the "Ready" state (it takes several minutes).
+Note that the instance will not only be created, but will also be running so that you can log in right away.
+
+#### Log in to the Instance
+
+The Exosphere web dashboard provides two easy ways to log in to Jetstream2 instances:
+Web Shell and Web Desktop.
+For this example, you can use the [Web Shell](https://cvw.cac.cornell.edu/jetstream/instance-login/webshell) option
+to open a terminal tab in your web browser.
+You may also want to read about the [features of Guacamole](https://cvw.cac.cornell.edu/jetstream/instance-login/guacamole),
+which is the platform that supports both Web Shell and Web Desktop.
+
+Once you are logged in to the web shell, you can proceed to the
+"Install Software and Download Data" section below.
+
+#### Managing a Jetstream2 Instance
+
+An important aspect of efficient cloud computing is knowing how to
+[manage your instances](https://cvw.cac.cornell.edu/jetstream/manage-instance/states-actions).
+Instances incur costs whenever they are running (on Jetstream2, this is when they are "Ready").
+"Shelving" an instance stops it from using the cloud's CPUs and memory,
+and therefore stops it from incurring any charges on your allocation.
+
+When you are through working on this example,
+be sure to use the instance's "Actions" menu in the web dashboard to
+"Shelve" the instance so that it is no longer spending your credits.
+If you later return to the dashboard and want to use the instance again,
+use the Actions menu's "Unshelve" option to start the instance up again.
+Note that any programs that were running when you shelve the instance will be lost,
+but the contents of the disk are preserved when shelving.
+
+You may also want to try the "Resize" action to change the number of CPUs of the instance.
+Increasing the number of CPUs (say, to flavor "m3.8") can make your computations finish more quickly.
+But of course, doubling the number of CPUs doubles the cost per hour to run the instance,
+so shelving as soon as you are done becomes even more important.
+
+### Install Software and Download Data
+
+With your instance created and running, and with you logged in to it through a Web Shell,
+you can now install the necessary software and download the data to run the simulation.
+You will only need to perform these steps once,
+as they essentially change the contents of the instance's disk,
+and those changes will remain even after the instance is shelved and unshelved.
+
+#### Install Docker and Get the I-WRF Image
+
+As mentioned above, the I-WRF simulation application is available as an image that will run as a
+[Docker "container"](https://docs.docker.com/guides/docker-concepts/the-basics/what-is-a-container/)
+on your instance.
+To do so, you must first install the Docker Engine on the instance
+and then download, or "pull", the I-WRF image that will be run as a container in Docker.
+
+The [instructions for installing Docker Engine on Ubuntu](https://docs.docker.com/engine/install/ubuntu/)
+are very thorough and make a good reference, but we only need to perform a subset of those steps.
+The following commands can be copied and pasted into your shell.
+This first, complicated sequence sets up the Docker repository on your instance:
+
+    sudo apt-get install ca-certificates curl
+    sudo install -m 0755 -d /etc/apt/keyrings
+    sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg \
+        -o /etc/apt/keyrings/docker.asc
+    sudo chmod a+r /etc/apt/keyrings/docker.asc
+    echo \
+        "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] \
+        https://download.docker.com/linux/ubuntu \
+        $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \
+        sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
+    sudo apt-get update
+
+Now you can simply install the Docker Engine:
+
+    sudo apt-get install docker-ce docker-ce-cli
+
+And finally, you pull the latest version of the I-WRF image onto your instance:
+
+    docker pull ncar/iwrf
+
+#### Get the Geographic Data
+
+To run I-WRF on the Hurricane Matthew data set, you need a copy of the
+geographic data representing the terrain in the area of the simulation.
+These commands download an archive file containing that data,
+uncompress the archive into a folder named "WPS_GEOG", and delete the archive file.
+
+    wget https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_high_res_mandatory.tar.gz
+    tar -xzf geog_high_res_mandatory.tar.gz
+    rm geog_high_res_mandatory.tar.gz
+
+#### Create the Run Folder
+
+The simulation is started by a script that must first be downloaded.
+The script expects to run in a folder where it can download data files and generate results.
+In this example, we expect this folder to be named "matthew" and to be in the user's home directory.
+The script is called "run.sh".
+The following commands create the empty folder and download the script into it,
+and they can be copied and pasted into your web shell.
+
+    mkdir matthew
+    curl https://gist.githubusercontent.com/Trumbore/27cef8073048cde7a8142af9bfb0b264/raw/1115ce9de4a30ad665055ed323c40a4e7aa411b2/run.sh > matthew/run.sh
+
+### Run I-WRF
+
+With everything in place, you are now ready to run the Docker container that will perform the simulation.
+The downloaded script runs inside the container, prints lots of status information,
+and creates output files in the run folder you created.
+Copy and paste this command into your web shell:
+
+    time docker run --shm-size 14G -it -v ~/:/home/wrfuser/terrestrial_data \
+        -v ~/matthew:/tmp/hurricane_matthew ncar/iwrf:latest /tmp/hurricane_matthew/run.sh
+
+The command has numerous arguments and options, which do the following:
+
++ `time docker run` prints the runtime of the "docker run" command.
++ `--shm-size 14G -it` tells the command how much shared memory to use, and to run interactively in the shell.
++ The `-v` options map folders in the instance to paths within the container.
++ `ncar/iwrf:latest` is the Docker image to use when creating the container.
++ `/tmp/hurricane_matthew/run.sh` is the location within the container of the script that it runs.
+
+It takes about 12 minutes for the simulation to finish on an m3.quad Jetstream2 instance.
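The commit message above notes that a validation step at the end would be useful. As a hedged sketch of what such a check could look like (not part of the published instructions): WRF runs conventionally write a "SUCCESS COMPLETE WRF" line to their rsl.out.0000 log, so a minimal post-run check might grep for it. The log file name and the ~/matthew run folder are assumptions based on this example's setup.

```shell
#!/bin/sh
# Hypothetical post-run check. Assumes the run folder layout used in this
# example (~/matthew) and WRF's conventional rsl.out.0000 log file.
check_wrf_success() {
    run_dir="$1"
    # WRF conventionally prints this line when wrf.exe finishes cleanly.
    if grep -q "SUCCESS COMPLETE WRF" "$run_dir/rsl.out.0000" 2>/dev/null; then
        echo "WRF run completed successfully"
    else
        echo "WRF run incomplete; inspect the rsl.* logs in $run_dir"
    fi
}

check_wrf_success "$HOME/matthew"
```

If the success line is absent, the other rsl.* files in the run folder are the first place to look for error messages.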
+ diff --git a/docs/Users_Guide/usecases.rst b/docs/Users_Guide/usecases.rst index dfee7e0..f7c7474 100644 --- a/docs/Users_Guide/usecases.rst +++ b/docs/Users_Guide/usecases.rst @@ -5,6 +5,9 @@ Use Cases Generic CONUS “interesting weather” =================================== +[Hurricane Matthew on Jetstream2](matthewjetstream) +=================================== + Land Use/Land Cover Change ========================== From ab76b1875410b0806bca9d3e06d708562318cf44 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Mon, 20 May 2024 19:04:38 -0400 Subject: [PATCH 03/36] Tests to see what Markdown format I should really use --- docs/Users_Guide/matthewjetstream.rst | 22 ++++++++++++++-------- docs/Users_Guide/usecases.rst | 2 +- 2 files changed, 15 insertions(+), 9 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index b23e7f4..cf63f6a 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -1,10 +1,14 @@ -## Running I-WRF On Jetstream2 with Hurricane Matthew Data +******************************************************* +Running I-WRF On Jetstream2 with Hurricane Matthew Data +******************************************************* -### Overview +======== +Overview +======== The following instructions can be used to run -the [I-WRF weather simulation program](https://i-wrf.org/) -with data from [Hurricane Matthew](https://en.wikipedia.org/wiki/Hurricane_Matthew) +the `I-WRF weather simulation program https://i-wrf.org` +with data from `Hurricane Matthew https://en.wikipedia.org/wiki/Hurricane_Matthew` on the [Jetstream2 cloud computing platform](https://jetstream-cloud.org/). This exercise provides an introduction to using cloud computing platforms, running computationally complex simulations and using containerized applications. @@ -22,14 +26,16 @@ simplifying the set-up for running the simulation. 
To [get started with Jetstream2](https://jetstream-cloud.org/get-started), you will need to: -+ Create an account with the [National Science Foundation (NSF)](https://www.nsf.gov/)'s +* Create an account with the [National Science Foundation (NSF)](https://www.nsf.gov/)'s [ACCESS program](https://access-ci.org/). -+ Request a computational "allocation" from ACCESS. -+ Log in to Jetstream2's web portal. + Request a computational "allocation" from ACCESS. +* Log in to Jetstream2's web portal. The sections below will guide you through this process -#### Create an ACCESS Account +------------------------ +Create an ACCESS Account +------------------------ If you do not already have one, [register for an ACCESS account](https://operations.access-ci.org/identity/new-user). Note that you can either choose to use an existing University/Organizational account or diff --git a/docs/Users_Guide/usecases.rst b/docs/Users_Guide/usecases.rst index f7c7474..b4b2fbe 100644 --- a/docs/Users_Guide/usecases.rst +++ b/docs/Users_Guide/usecases.rst @@ -5,7 +5,7 @@ Use Cases Generic CONUS “interesting weather” =================================== -[Hurricane Matthew on Jetstream2](matthewjetstream) +`Hurricane Matthew on Jetstream2 ` =================================== Land Use/Land Cover Change From 7b63228c0f182f23f19a18aefc1d3d48764f586b Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Mon, 20 May 2024 19:14:17 -0400 Subject: [PATCH 04/36] Crazy-ass markdown language --- docs/Users_Guide/matthewjetstream.rst | 18 +++++++++++------- 1 file changed, 11 insertions(+), 7 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index cf63f6a..94fa06d 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -7,8 +7,8 @@ Overview ======== The following instructions can be used to run -the `I-WRF weather simulation program https://i-wrf.org` -with data from `Hurricane Matthew 
https://en.wikipedia.org/wiki/Hurricane_Matthew` +the `I-WRF weather simulation program ` +with data from `Hurricane Matthew ` on the [Jetstream2 cloud computing platform](https://jetstream-cloud.org/). This exercise provides an introduction to using cloud computing platforms, running computationally complex simulations and using containerized applications. @@ -21,15 +21,17 @@ and is available to researchers and educators. This example delivers the I-WRF program as a Docker "image", simplifying the set-up for running the simulation. -### Prepare to Use Jetstream2 +========================= +Prepare to Use Jetstream2 +========================= To [get started with Jetstream2](https://jetstream-cloud.org/get-started), you will need to: -* Create an account with the [National Science Foundation (NSF)](https://www.nsf.gov/)'s + * Create an account with the [National Science Foundation (NSF)](https://www.nsf.gov/)'s [ACCESS program](https://access-ci.org/). - Request a computational "allocation" from ACCESS. -* Log in to Jetstream2's web portal. + * Request a computational "allocation" from ACCESS. + * Log in to Jetstream2's web portal. The sections below will guide you through this process @@ -41,7 +43,9 @@ If you do not already have one, [register for an ACCESS account](https://operati Note that you can either choose to use an existing University/Organizational account or create an entirely new ACCESS account when registering. -#### Get an Allocation +--- +Get an Allocation +--- With your ACCESS account set up, you may [request an allocation](https://allocations.access-ci.org/get-your-first-project) that will allow you to use an ACCESS-affiliated cyberinfrastructure resource. 
From af0c1ff95d673a171f6c09a954835252b90384e4 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Mon, 20 May 2024 19:20:37 -0400 Subject: [PATCH 05/36] Sigh --- docs/Users_Guide/matthewjetstream.rst | 19 ++++++++++++++----- 1 file changed, 14 insertions(+), 5 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 94fa06d..0615573 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -7,9 +7,9 @@ Overview ======== The following instructions can be used to run -the `I-WRF weather simulation program ` +the `I-WRF weather simulation program https://i-wrf.org title="I-WRF weather simulation program` with data from `Hurricane Matthew ` -on the [Jetstream2 cloud computing platform](https://jetstream-cloud.org/). +on the Jetstream2 cloud computing platform (https://jetstream-cloud.org/). This exercise provides an introduction to using cloud computing platforms, running computationally complex simulations and using containerized applications. @@ -28,8 +28,7 @@ Prepare to Use Jetstream2 To [get started with Jetstream2](https://jetstream-cloud.org/get-started), you will need to: - * Create an account with the [National Science Foundation (NSF)](https://www.nsf.gov/)'s -[ACCESS program](https://access-ci.org/). + * Create an account with the [National Science Foundation (NSF)](https://www.nsf.gov/)'s [ACCESS program](https://access-ci.org/). * Request a computational "allocation" from ACCESS. * Log in to Jetstream2's web portal. @@ -156,16 +155,26 @@ The following commands can be copied and pasted into your shell. 
This first, complicated sequence sets up the Docker repository on your instance: sudo apt-get install ca-certificates curl + sudo install -m 0755 -d /etc/apt/keyrings + sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg \ + -o /etc/apt/keyrings/docker.asc + sudo chmod a+r /etc/apt/keyrings/docker.asc + echo \ + "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] \ + https://download.docker.com/linux/ubuntu \ + $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \ + sudo tee /etc/apt/sources.list.d/docker.list > /dev/null - sudo apt-get update + + sudo apt-get update Now you can simply install the Docker Engine: From b27166c48a7aae60223a074a8289407b4cb35504 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Mon, 20 May 2024 19:53:23 -0400 Subject: [PATCH 06/36] Switch the incorrect markdown to the correct, yet horrible, version --- docs/Users_Guide/matthewjetstream.rst | 146 +++++++++++++------------- 1 file changed, 71 insertions(+), 75 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 0615573..d0aad52 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -1,15 +1,13 @@ -******************************************************* Running I-WRF On Jetstream2 with Hurricane Matthew Data -******************************************************* +**************************************************************** -======== Overview -======== +================= The following instructions can be used to run -the `I-WRF weather simulation program https://i-wrf.org title="I-WRF weather simulation program` -with data from `Hurricane Matthew ` -on the Jetstream2 cloud computing platform (https://jetstream-cloud.org/). +the `I-WRF weather simulation program `_ +with data from `Hurricane Matthew `_ +on the `Jetstream2 cloud computing platform `_. 
This exercise provides an introduction to using cloud computing platforms, running computationally complex simulations and using containerized applications. @@ -21,101 +19,104 @@ and is available to researchers and educators. This example delivers the I-WRF program as a Docker "image", simplifying the set-up for running the simulation. -========================= Prepare to Use Jetstream2 -========================= +=============================== -To [get started with Jetstream2](https://jetstream-cloud.org/get-started), +To `get started with Jetstream2`_, you will need to: - * Create an account with the [National Science Foundation (NSF)](https://www.nsf.gov/)'s [ACCESS program](https://access-ci.org/). - * Request a computational "allocation" from ACCESS. - * Log in to Jetstream2's web portal. +* Create an account with the `National Science Foundation (NSF)`_'s `ACCESS program`_. +* Request a computational "allocation" from ACCESS. +* Log in to Jetstream2's web portal. The sections below will guide you through this process ------------------------- Create an ACCESS Account ------------------------- +-------------------------------- -If you do not already have one, [register for an ACCESS account](https://operations.access-ci.org/identity/new-user). +If you do not already have one, `register for an ACCESS account`_. Note that you can either choose to use an existing University/Organizational account or create an entirely new ACCESS account when registering. ---- Get an Allocation ---- +------------------- -With your ACCESS account set up, you may [request an allocation](https://allocations.access-ci.org/get-your-first-project) +With your ACCESS account set up, you may `request an allocation`_ that will allow you to use an ACCESS-affiliated cyberinfrastructure resource. Be sure to read all of the information on that page so that you make a suitable request. 
An "Explore" project will be sufficient to work with this example, and you will want to work with the resource "Indiana Jetstream2 CPU" (*not* GPU). The typical turnaround time for allocation requests is one business day. -#### Log in to the Exosphere Web Site +Log in to the Exosphere Web Site +------------------------------------ Once you have an ACCESS account and allocation, -you can log in to their [Exosphere web dashboard](https://jetstream2.exosphere.app/). +you can log in to their `Exosphere web dashboard`_. The process of identifying your allocation and ACCESS ID to use Jetstream2 -is described on [this page](https://cvw.cac.cornell.edu/jetstream/intro/jetstream-login) of the -[Introduction to Jetstream2](https://cvw.cac.cornell.edu/jetstream) Cornell Virtual Workshop, -and on [this page](https://docs.jetstream-cloud.org/ui/exo/login/) -of the [Jetstream2 documentation](https://docs.jetstream-cloud.org/). +is described on `this page`_ of the +`Introduction to Jetstream2`_ Cornell Virtual Workshop, +and on `this page`_ +of the `Jetstream2 documentation`_. While adding an allocation to your account, it is recommended that you choose the "Indiana University" region of Jetstream2 for completing this example. -### Create a Cloud Instance and Log In +Create a Cloud Instance and Log In +==================================== After you have logged in to Jetstream2 and added your allocation to your account, you are ready to create the cloud instance where you will run the I-WRF simulation. If you are not familiar with the cloud computing terms "image" and "instance", -it is recommended that you [read about them](https://cvw.cac.cornell.edu/jetstream/intro/imagesandinstances) +it is recommended that you `read about them`_ before proceeding. -#### Create an SSH Key +Create an SSH Key +------------------- If you are not familiar with "SSH key pairs", you should -[read about them](https://cvw.cac.cornell.edu/jetstream/keys/about-keys) before continuing. 
+`read about them`_ before continuing. A key pair is needed when creating your instance so that you can log in to it, as password-based log-ins are disabled on Jetstream2. -+ First, [create an SSH Key on your computer](https://cvw.cac.cornell.edu/jetstream/keys/ssh-create) using the "ssh-keygen" command. -+ Then [upload the key to Jetstream2](https://cvw.cac.cornell.edu/jetstream/keys/ssh-upload) through the Exosphere web interface. +* First, `create an SSH Key on your computer`_ using the "ssh-keygen" command. +* Then `upload the key to Jetstream2`_ through the Exosphere web interface. -#### Create an Instance +Create an Instance +--------------------- -The Cornell Virtual Workshop topic [Creating an Instance](https://cvw.cac.cornell.edu/jetstream/create-instance) +The Cornell Virtual Workshop topic `Creating an Instance`_ provides detailed information about creating a Jetstream2 instance. While following those steps, be sure to make the following choices for this instance: -+ Choose the Featured-Ubuntu22 image as the instance source. -+ Choose the "Flavor" m3.quad (4 CPUs) to provide faster a simulation run-time. -+ Select a custom disk size of 100 GB to hold this example's data and results. -+ Select "Yes" for Enable web desktop. -+ Select the SSH public key that you uploaded previously. -+ You do not need to set any of the Advanced Options. +* Choose the Featured-Ubuntu22 image as the instance source. +* Choose the "Flavor" m3.quad (4 CPUs) to provide faster a simulation run-time. +* Select a custom disk size of 100 GB to hold this example's data and results. +* Select "Yes" for Enable web desktop. +* Select the SSH public key that you uploaded previously. +* You do not need to set any of the Advanced Options. After clicking the "Create" button, wait for the instance to enter the "Ready" state (it takes several minutes). Note that the instance will not only be created, but will be running so that you can log in right away. 
-#### Log in to the Instance
+Log in to the Instance
+-----------------------------
 
 The Exosphere web dashboard provides two easy ways to log in to Jetstream2 instances:
 Web Shell and Web Desktop.
-For this example, you can use the [Web Shell](https://cvw.cac.cornell.edu/jetstream/instance-login/webshell) option
+For this example, you can use the `Web Shell`_ option
 to open a terminal tab in your web browser.
-You may also want to read about the [features of Guacamole](https://cvw.cac.cornell.edu/jetstream/instance-login/guacamole),
+You may also want to read about the `features of Guacamole`_,
 which is the platform that supports both Web Shell and Web Desktop.
 
 Once you are logged in to the web shell, you can proceed to the
 "Install Software and Download Data" section below.
 
-#### Managing a Jetstream2 Instance
+Managing a Jetstream2 Instance
+------------------------------------
 
 An important aspect of efficient cloud computing is knowing how to
-[manage your instances](https://cvw.cac.cornell.edu/jetstream/manage-instance/states-actions).
+`manage your instances`_.
 Instances incur costs whenever they are running (on Jetstream2, this is when they are "Ready").
 "Shelving" an instance stops it from using the cloud's CPUs and memory,
 and therefore stops it from incurring any charges on your allocation.
@@ -133,7 +134,8 @@ Increasing the number of CPUs (say to flavor "m3.8") can make your computations
 But of course, doubling the number of CPUs doubles the cost per hour to run the instance,
 so shelving as soon as you are done becomes even more important.
 
-### Install Software and Download Data
+Install Software and Download Data
+=====================================
 
 With your instance created and running, and with you logged in to it through a Web Shell,
 you can now install the necessary software and download the data to run the simulation.
@@ -141,89 +143,83 @@ You will only need to perform these steps once, as they essentially change the contents of the instance's disk and those changes will remain even after the instance is shelved and unshelved. -#### Install Docker and Get the I-WRF Image +Install Docker and Get the I-WRF Image +----------------------------------------- As mentioned above, the I-WRF simulation application is available as an image that will run as a -[Docker "container"](https://docs.docker.com/guides/docker-concepts/the-basics/what-is-a-container/) +`Docker "container"`_ on your instance. To do so, you must first install the Docker Engine on the instance and then download, or "pull" the I-WRF image that will be run as a container in Docker. -The [instructions for installing Docker Engine on Ubuntu](https://docs.docker.com/engine/install/ubuntu/) +The `instructions for installing Docker Engine on Ubuntu`_ are very thorough and make a good reference, but we only need to perform a subset of those steps. The following commands can be copied and pasted into your shell. -This first, complicated sequence sets up the Docker repository on your instance: +This first, complicated sequence sets up the Docker repository on your instance:: sudo apt-get install ca-certificates curl - sudo install -m 0755 -d /etc/apt/keyrings - sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg \ - -o /etc/apt/keyrings/docker.asc - sudo chmod a+r /etc/apt/keyrings/docker.asc - echo \ - "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] \ - https://download.docker.com/linux/ubuntu \ - $(. 
/etc/os-release && echo "$VERSION_CODENAME") stable" | \
-
     sudo tee /etc/apt/sources.list.d/docker.list > /dev/null
+    sudo apt-get update
-
-    sudo apt-get update
 
-Now you can simply install the Docker Engine:
+Now you can simply install the Docker Engine::
 
     sudo apt-get install docker-ce docker-ce-cli
 
-And finally, you pull the latest version of the I-WRF image onto your instance:
+And finally, you pull the latest version of the I-WRF image onto your instance::
 
     docker pull ncar/iwrf
 
-#### Get the Geographic Data
+Get the Geographic Data
+----------------------------
 
 To run I-WRF on the Hurricane Matthew data set, you need a copy of the
 geographic data representing the terrain in the area of the simulation.
 These commands download an archive file containing that data,
-uncompress the archive into a folder named "WPS_GEOG", and delete the archive file.
+uncompress the archive into a folder named "WPS_GEOG", and delete the archive file::
 
-    wget https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_high_res_mandatory.tar.gz
-    tar -xzf geog_high_res_mandatory.tar.gz
+    wget https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_high_res_mandatory.tar.gz
+    tar -xzf geog_high_res_mandatory.tar.gz
     rm geog_high_res_mandatory.tar.gz
 
-#### Create the Run Folder
+Create the Run Folder
+-------------------------
 
 The simulation is started by a script that must first be downloaded.
 The script expects to run in a folder where it can download data files and generate results.
 In this example, we expect this folder to be named "matthew" and to be in the user's home directory.
 The script is called "run.sh".
 The following commands create the empty folder and download the script into it,
-and they can be copied and pasted into your web shell.
+and they can be copied and pasted into your web shell.:: mkdir matthew https://gist.githubusercontent.com/Trumbore/27cef8073048cde7a8142af9bfb0b264/raw/1115ce9de4a30ad665055ed323c40a4e7aa411b2/run.sh > matthew/run.sh -### Run I-WRF +Run I-WRF +=========== With everything in place, you are now ready to run the Docker container that will perform the simulation. The downloaded script runs inside the container, prints lots of status information, and creates output files in the run folder you created. -Copy and paste this command into your web shell: +Copy and paste this command into your web shell:: time docker run --shm-size 14G -it -v ~/:/home/wrfuser/terrestrial_data \ -v ~/matthew:/tmp/hurricane_matthew ncar/iwrf:latest /tmp/hurricane_matthew/run.sh The command has numerous arguments and options, which do the following: -+ `time docker run` prints the runtime of the "docker run" command. -+ `--shm-size 14G -it` tells the command how much shared memory to use, and to run interactively in the shell. -+ The `-v` options map folders in the instance to paths within the contianer. -+ `ncar/iwrf:latest` is the Docker image to use when creating the container. -+ `/tmp/hurricane_matthew/run.sh` is the location within the container of the script that it runs. +* ``time docker run`` prints the runtime of the "docker run" command. +* ``--shm-size 14G -it`` tells the command how much shared memory to use, and to run interactively in the shell. +* The ``-v`` options map folders in the instance to paths within the contianer. +* ``ncar/iwrf:latest`` is the Docker image to use when creating the container. +* ``/tmp/hurricane_matthew/run.sh`` is the location within the container of the script that it runs. It takes about 12 minutes for the simulation to finish on an m3.quad Jetstream instance. 
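The long ``docker run`` command is easier to audit when assembled from named variables that make the two volume mappings explicit. The variable names below (GEOG_DIR, RUN_DIR, IMAGE, SCRIPT) are introduced here purely for illustration; they are not part of the tutorial or the I-WRF image:

```shell
# A sketch (not part of the tutorial) that builds the "docker run" invocation
# from named variables so each mount is visible at a glance.
GEOG_DIR="$HOME"                        # host folder that contains WPS_GEOG
RUN_DIR="$HOME/matthew"                 # host run folder holding run.sh and outputs
IMAGE="ncar/iwrf:latest"                # I-WRF Docker image
SCRIPT="/tmp/hurricane_matthew/run.sh"  # script path as seen inside the container

CMD="docker run --shm-size 14G -it -v $GEOG_DIR:/home/wrfuser/terrestrial_data -v $RUN_DIR:/tmp/hurricane_matthew $IMAGE $SCRIPT"
echo "$CMD"
```

Echoing the assembled command before running it is a cheap way to confirm the mount paths before committing to a multi-minute simulation.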
From 91085597a53b2c49b86aa02408acae47198bb5dc Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Tue, 21 May 2024 11:24:44 -0400 Subject: [PATCH 07/36] Edit text, fix typos and links --- docs/Users_Guide/matthewjetstream.rst | 80 ++++++++++++++------------- 1 file changed, 42 insertions(+), 38 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index d0aad52..e1d2c42 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -6,6 +6,7 @@ Overview The following instructions can be used to run the `I-WRF weather simulation program `_ +from the `National Center for Atmospheric Research (NCAR) `_ with data from `Hurricane Matthew `_ on the `Jetstream2 cloud computing platform `_. This exercise provides an introduction to using cloud computing platforms, @@ -16,8 +17,13 @@ than you may have on your personal computer, but a cloud computing platform can provided the needed computational power. Jetstream2 is a national cyberinfrastructure resource that is easy to use and is available to researchers and educators. -This example delivers the I-WRF program as a Docker "image", -simplifying the set-up for running the simulation. +This example runs the I-WRF program as a Docker "container", +which simplifyies the set-up work needed to run the simulation. + +It is recommended that you follow the instructions in each section in the order presented +to avoid encountering issues during the process. +Most sections refer to external documentation to provide details about the necessary steps +and to offer additional background information. Prepare to Use Jetstream2 =============================== @@ -29,13 +35,13 @@ you will need to: * Request a computational "allocation" from ACCESS. * Log in to Jetstream2's web portal. -The sections below will guide you through this process +The sections below will guide you through this process. 
Create an ACCESS Account -------------------------------- If you do not already have one, `register for an ACCESS account`_. -Note that you can either choose to use an existing University/Organizational account or +Note that you can either choose to associate your existing University/Organizational account or create an entirely new ACCESS account when registering. Get an Allocation @@ -52,12 +58,12 @@ Log in to the Exosphere Web Site ------------------------------------ Once you have an ACCESS account and allocation, -you can log in to their `Exosphere web dashboard`_. +you can log in to their `Exosphere web dashboard`_. The process of identifying your allocation and ACCESS ID to use Jetstream2 is described on `this page`_ of the `Introduction to Jetstream2`_ Cornell Virtual Workshop, -and on `this page`_ -of the `Jetstream2 documentation`_. +and on `this page`_ +of the `Jetstream2 documentation`_. While adding an allocation to your account, it is recommended that you choose the "Indiana University" region of Jetstream2 for completing this example. @@ -68,19 +74,19 @@ Create a Cloud Instance and Log In After you have logged in to Jetstream2 and added your allocation to your account, you are ready to create the cloud instance where you will run the I-WRF simulation. If you are not familiar with the cloud computing terms "image" and "instance", -it is recommended that you `read about them`_ +it is recommended that you `read about them`_ before proceeding. Create an SSH Key ------------------- If you are not familiar with "SSH key pairs", you should -`read about them`_ before continuing. +`read about them`_ before continuing. A key pair is needed when creating your instance so that you can log in to it, -as password-based log-ins are disabled on Jetstream2. +as password-based logins are disabled on Jetstream2. * First, `create an SSH Key on your computer`_ using the "ssh-keygen" command. -* Then `upload the key to Jetstream2`_ through the Exosphere web interface. 
+* Then `upload the key to Jetstream2`_ through the Exosphere web interface. Create an Instance --------------------- @@ -89,10 +95,9 @@ The Cornell Virtual Workshop topic `Creating an Instance`_ option to open a terminal tab in your web browser. You may also want to read about the `features of Guacamole`_, @@ -115,24 +119,24 @@ Once you are logged in to the web shell you can proceed to the Managing a Jetstream2 Instance ------------------------------------ -An appropriate aspect of efficient cloud computing is knowing how to +In order to use cloud computing resources efficiently, you must know how to `manage your instances`_. -Instances incur costs whenever they are running (on Jetstream, this is when they are "Ready"). +Instances incur costs whenever they are running (on Jetstream2, this is when they are "Ready"). "Shelving" an instance stops it from using the cloud's CPUs and memory, -and therefore stops it from incurring any charges on your allocation. +and therefore stops it from incurring any charges against your allocation. When you are through working on this example, be sure to use the instance's "Actions" menu in the web dashboard to "Shelve" the instance so that it is no longer spending your credits. -If you alter return to the dashboard and want to use the instance again, +If you later return to the dashboard and want to use the instance again, Use the Action menu's "Unshelve" option to start the instance up again. Note that any programs that were running when you shelve the instance will be lost, but the contents of the disk are preserved when shelving. You may also want to try the "Resize" action to change the number of CPUs of the instance. -Increasing the number of CPUs (say to flavor "m3.8") can make your computations finish more quickly. +Increasing the number of CPUs (say, to flavor "m3.8") can make your computations finish more quickly. 
But of course, doubling the number of CPUs doubles the cost per hour to run the instance, -so Shelving as soon as you are done becomes even more important. +so Shelving as soon as you are done becomes even more important! Install Software and Download Data ===================================== @@ -146,11 +150,11 @@ and those changes will remain even after the instance is shelved and unshelved. Install Docker and Get the I-WRF Image ----------------------------------------- -As mentioned above, the I-WRF simulation application is available as an image that will run as a -`Docker "container"`_ -on your instance. -To do so, you must first install the Docker Engine on the instance -and then download, or "pull" the I-WRF image that will be run as a container in Docker. +As mentioned above, the I-WRF simulation application is provided as a Docker image that will run as a +`"container"`_ +on your cloud instance. +To run a Docker container, you must first install the Docker Engine on your instance. +You can then "pull" (download) the I-WRF image that will be run as a container. The `instructions for installing Docker Engine on Ubuntu`_ are very thorough and make a good reference, but we only need to perform a subset of those steps. @@ -169,11 +173,11 @@ This first, complicated sequence sets up the Docker repository on your instance: sudo tee /etc/apt/sources.list.d/docker.list > /dev/null sudo apt-get update -Now you can simply install the Docker Engine:: +Then you will install the Docker Engine:: sudo apt-get install docker-ce docker-ce-cli -And finally, you pull the latest version of the I-WRF image onto your instance:: +And finally, pull the latest version of the I-WRF image onto your instance:: docker pull ncar/iwrf @@ -183,7 +187,7 @@ Get the Geographic Data To run I-WRF on the Hurricane Matthew data set, you need a copy of the geographic data representing the terrain in the area of the simulation. 
These commands download an archive file containing that data, -uncompress the archive into a folder named "WPS_GEOG", and delete the archive file.:: +uncompress the archive into a folder named "WPS_GEOG", and delete the archive file:: wget https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_high_res_mandatory.tar.gz tar -xzf geog_high_res_mandatory.tar.gz @@ -192,12 +196,12 @@ uncompress the archive into a folder named "WPS_GEOG", and delete the archive fi Create the Run Folder ------------------------- -The simulation is started by a script that must first be downloaded. -The script expects to run in a folder where it can download data files and generate results. -In this example, we expect this folder to be named "matthew" and to be in the user's home directory. -The script is called "run.sh". +The simulation is performed using a script that must first be downloaded. +The script expects to run in a folder where it can download data files and create result files. +The instructions in this exercise create that folder in the user's home directory and name it "matthew". +The simulation script is called "run.sh". The following commands create the empty folder and download the script into it, -and they can be copied and pasted into your web shell.:: +and they can be copied and pasted into your web shell:: mkdir matthew https://gist.githubusercontent.com/Trumbore/27cef8073048cde7a8142af9bfb0b264/raw/1115ce9de4a30ad665055ed323c40a4e7aa411b2/run.sh > matthew/run.sh @@ -208,7 +212,7 @@ Run I-WRF With everything in place, you are now ready to run the Docker container that will perform the simulation. The downloaded script runs inside the container, prints lots of status information, and creates output files in the run folder you created. 
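The status information the script prints includes per-timestep timing lines in the WRF log. Assuming the usual "Timing for main: ... elapsed seconds" format that WRF writes to rsl.out.0000, a rough total compute time can be extracted with awk; a tiny stand-in log file is used here instead of a real run:

```shell
# Build a small stand-in log (the real file would be matthew/rsl.out.0000).
cat > sample_rsl.out <<'EOF'
Timing for main: time 2016-10-06_11:55:00 on domain 1: 0.23354 elapsed seconds
Timing for main: time 2016-10-06_11:57:30 on domain 1: 0.23345 elapsed seconds
Timing for main: time 2016-10-06_12:00:00 on domain 1: 0.23407 elapsed seconds
EOF

# Sum the elapsed-seconds field (third from last) of each timing line.
awk '/Timing for main/ {sum += $(NF-2)} END {printf "%.5f\n", sum}' sample_rsl.out
# prints 0.70106 for the three sample lines above
```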
-Copy and paste this command into your web shell::
+To run the simulation, copy and paste this command into your web shell::
 
     time docker run --shm-size 14G -it -v ~/:/home/wrfuser/terrestrial_data \
          -v ~/matthew:/tmp/hurricane_matthew ncar/iwrf:latest /tmp/hurricane_matthew/run.sh
 
@@ -217,7 +221,7 @@ The command has numerous arguments and options, which do the following:
 
 * ``time docker run`` prints the runtime of the "docker run" command.
 * ``--shm-size 14G -it`` tells the command how much shared memory to use, and to run interactively in the shell.
-* The ``-v`` options map folders in the instance to paths within the contianer.
+* The ``-v`` options map folders in your cloud instance to paths within the container.
 * ``ncar/iwrf:latest`` is the Docker image to use when creating the container.
 * ``/tmp/hurricane_matthew/run.sh`` is the location within the container of the script that it runs.

From aa27524b84d5546460b5eee1958f2ef391f58cf5 Mon Sep 17 00:00:00 2001
From: Ben Trumbore
Date: Wed, 22 May 2024 09:59:45 -0400
Subject: [PATCH 08/36] Make sure all commands work properly, add more info.

---
 docs/Users_Guide/matthewjetstream.rst | 27 +++++++++++++--------------
 1 file changed, 13 insertions(+), 14 deletions(-)

diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst
index e1d2c42..0a1c298 100644
--- a/docs/Users_Guide/matthewjetstream.rst
+++ b/docs/Users_Guide/matthewjetstream.rst
@@ -147,6 +147,10 @@ You will only need to perform these steps once, as they essentially change
 the contents of the instance's disk
 and those changes will remain even after the instance is shelved and unshelved.
 
+The following sections instruct you to issue numerous Linux commands in your web shell.
+If you are not familiar with Linux, you may want to refer to
+`An Introduction to Linux `_ when working through these steps.

+ Install Docker and Get the I-WRF Image ----------------------------------------- @@ -161,21 +165,15 @@ are very thorough and make a good reference, but we only need to perform a subse The following commands can be copied and pasted into your shell. This first, complicated sequence sets up the Docker repository on your instance:: - sudo apt-get install ca-certificates curl sudo install -m 0755 -d /etc/apt/keyrings - sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg \ - -o /etc/apt/keyrings/docker.asc + sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc sudo chmod a+r /etc/apt/keyrings/docker.asc - echo \ - "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] \ - https://download.docker.com/linux/ubuntu \ - $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | \ - sudo tee /etc/apt/sources.list.d/docker.list > /dev/null - sudo apt-get update + echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null + sudo apt-get -y update Then you will install the Docker Engine:: - sudo apt-get install docker-ce docker-ce-cli + sudo apt-get -y install docker-ce docker-ce-cli And finally, pull the latest version of the I-WRF image onto your instance:: @@ -187,7 +185,8 @@ Get the Geographic Data To run I-WRF on the Hurricane Matthew data set, you need a copy of the geographic data representing the terrain in the area of the simulation. These commands download an archive file containing that data, -uncompress the archive into a folder named "WPS_GEOG", and delete the archive file:: +uncompress the archive into a folder named "WPS_GEOG", and delete the archive file. 
+They take several minutes to complete:: wget https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_high_res_mandatory.tar.gz tar -xzf geog_high_res_mandatory.tar.gz @@ -204,7 +203,8 @@ The following commands create the empty folder and download the script into it, and they can be copied and pasted into your web shell:: mkdir matthew - https://gist.githubusercontent.com/Trumbore/27cef8073048cde7a8142af9bfb0b264/raw/1115ce9de4a30ad665055ed323c40a4e7aa411b2/run.sh > matthew/run.sh + curl https://gist.githubusercontent.com/Trumbore/27cef8073048cde7a8142af9bfb0b264/raw/1115ce9de4a30ad665055ed323c40a4e7aa411b2/run.sh > matthew/run.sh + chmod 775 matthew/run.sh Run I-WRF =========== @@ -214,8 +214,7 @@ The downloaded script runs inside the container, prints lots of status informati and creates output files in the run folder you created. To run the simulation, copy and paste this command into your web shell:: - time docker run --shm-size 14G -it -v ~/:/home/wrfuser/terrestrial_data \ - -v ~/matthew:/tmp/hurricane_matthew ncar/iwrf:latest /tmp/hurricane_matthew/run.sh + time docker run --shm-size 14G -it -v ~/:/home/wrfuser/terrestrial_data -v ~/matthew:/tmp/hurricane_matthew ncar/iwrf:latest /tmp/hurricane_matthew/run.sh The command has numerous arguments and options, which do the following: From 01b5d1a4fd157915293fd212830cca512ad92b92 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Wed, 22 May 2024 11:21:57 -0400 Subject: [PATCH 09/36] A few more tweaks before others view it --- docs/Users_Guide/matthewjetstream.rst | 16 +++++++++++----- 1 file changed, 11 insertions(+), 5 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 0a1c298..d890f39 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -150,6 +150,13 @@ and those changes will remain even after the instance is shelved and unshelved. 
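The download-extract-delete pattern used for the geographic data can be hardened so that the archive is deleted only after a successful extraction. This sketch builds a tiny local archive as a stand-in for the multi-gigabyte geog download, then applies the same check-before-delete idea:

```shell
# Stand-in archive: a small folder compressed the same way as the real data.
mkdir -p WPS_GEOG_demo/topo
touch WPS_GEOG_demo/topo/index
tar -czf geog_demo.tar.gz WPS_GEOG_demo
rm -r WPS_GEOG_demo

# Extract, and remove the archive only if extraction succeeded and the
# expected folder exists; otherwise the archive is kept for a retry.
if tar -xzf geog_demo.tar.gz && [ -d WPS_GEOG_demo ]; then
    rm geog_demo.tar.gz
    echo "extracted"
fi
```

For the real download, `wget -c` can also resume a partially transferred archive instead of starting over.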
The following sections instruct you to issue numerous Linux commands in your web shell. If you are not familiar with Linux, you may want to want to refer to `An Introduction to Linux `_ when working through these steps. +The commands in each section can be copied using the button in the upper right corner +and then pasted into your web shell by right-clicking. + +If your web shell ever becomes unresponsive or disconnected from the instance, +you can recover from that situation by rebooting the instance. +In the Exosphere dashboard page for your instance, in the Actions menu, select "Reboot". +The process takes several minutes, after which the instance status will return to "Ready". Install Docker and Get the I-WRF Image ----------------------------------------- @@ -162,8 +169,7 @@ You can then "pull" (download) the I-WRF image that will be run as a container. The `instructions for installing Docker Engine on Ubuntu`_ are very thorough and make a good reference, but we only need to perform a subset of those steps. -The following commands can be copied and pasted into your shell. -This first, complicated sequence sets up the Docker repository on your instance:: +This first sequence sets up the Docker software repository on your instance:: sudo install -m 0755 -d /etc/apt/keyrings sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc @@ -171,7 +177,7 @@ This first, complicated sequence sets up the Docker repository on your instance: echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu $(. 
/etc/os-release && echo "$VERSION_CODENAME") stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null sudo apt-get -y update -Then you will install the Docker Engine:: +Then you will install the Docker Engine from that repository:: sudo apt-get -y install docker-ce docker-ce-cli @@ -200,7 +206,7 @@ The script expects to run in a folder where it can download data files and creat The instructions in this exercise create that folder in the user's home directory and name it "matthew". The simulation script is called "run.sh". The following commands create the empty folder and download the script into it, -and they can be copied and pasted into your web shell:: +then change its permissions so it can be run:: mkdir matthew curl https://gist.githubusercontent.com/Trumbore/27cef8073048cde7a8142af9bfb0b264/raw/1115ce9de4a30ad665055ed323c40a4e7aa411b2/run.sh > matthew/run.sh @@ -212,7 +218,7 @@ Run I-WRF With everything in place, you are now ready to run the Docker container that will perform the simulation. The downloaded script runs inside the container, prints lots of status information, and creates output files in the run folder you created. 
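The run-folder setup above (create folder, download script, make it executable) can be made repeat-safe and verified immediately. In this sketch the folder name is a placeholder and a tiny generated script stands in for the curl download, so the pattern can be shown end to end:

```shell
# Placeholder folder; "mkdir -p" tolerates re-running the setup.
RUN_DIR="$PWD/matthew_demo"
mkdir -p "$RUN_DIR"

# Stand-in for: curl <script-url> > "$RUN_DIR/run.sh"
printf '#!/bin/sh\necho running\n' > "$RUN_DIR/run.sh"

# Same permission change as the tutorial, then a quick smoke test.
chmod 775 "$RUN_DIR/run.sh"
"$RUN_DIR/run.sh"   # prints "running"
```

Executing the script immediately after downloading it is a quick way to catch a failed download (for example, an HTML error page saved as run.sh) before starting the container.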
-To run the simulation, copy and paste this command into your web shell::
+Execute this command to run the simulation in your web shell::
 
     time docker run --shm-size 14G -it -v ~/:/home/wrfuser/terrestrial_data -v ~/matthew:/tmp/hurricane_matthew ncar/iwrf:latest /tmp/hurricane_matthew/run.sh
 

From 3d47fc27a3836927b30141f875847f3246850870 Mon Sep 17 00:00:00 2001
From: Ben Trumbore
Date: Tue, 28 May 2024 14:44:07 -0400
Subject: [PATCH 10/36] Revisions based on feedback and some procedural changes

---
 docs/Users_Guide/matthewjetstream.rst | 59 +++++++++++++++------------
 1 file changed, 33 insertions(+), 26 deletions(-)

diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst
index d890f39..570125c 100644
--- a/docs/Users_Guide/matthewjetstream.rst
+++ b/docs/Users_Guide/matthewjetstream.rst
@@ -17,7 +17,7 @@ than you may have on your personal computer,
 but a cloud computing platform can provide the needed computational power.
 Jetstream2 is a national cyberinfrastructure resource that is easy to use
 and is available to researchers and educators.
-This example runs the I-WRF program as a Docker "container",
+This exercise runs the I-WRF program as a Docker "container",
 which simplifies the set-up work needed to run the simulation.
 
@@ -50,7 +50,7 @@ Get an Allocation
 
 With your ACCESS account set up, you may `request an allocation`_
 that will allow you to use an ACCESS-affiliated cyberinfrastructure resource.
 Be sure to read all of the information on that page so that you make a suitable request.
-An "Explore" project will be sufficient to work with this example,
+An "Explore" project will be sufficient to work with this exercise,
 and you will want to work with the resource "Indiana Jetstream2 CPU" (*not* GPU).
 The typical turnaround time for allocation requests is one business day.
@@ -66,7 +66,7 @@ and on `this page`_ of the `Jetstream2 documentation`_. While adding an allocation to your account, it is recommended that you choose -the "Indiana University" region of Jetstream2 for completing this example. +the "Indiana University" region of Jetstream2 for completing this exercise. Create a Cloud Instance and Log In ==================================== @@ -80,13 +80,13 @@ before proceeding. Create an SSH Key ------------------- +You must upload a key pair to Jetstream before creating your instance so that you can log in to it, +as password-based logins are disabled on Jetstream2. If you are not familiar with "SSH key pairs", you should `read about them`_ before continuing. -A key pair is needed when creating your instance so that you can log in to it, -as password-based logins are disabled on Jetstream2. -* First, `create an SSH Key on your computer`_ using the "ssh-keygen" command. -* Then `upload the key to Jetstream2`_ through the Exosphere web interface. +* First, `create an SSH Key on your computer`_ using the "ssh-keygen" command. That command allows you to specify the name and location of the private key file it creates, with the default being "id_rsa". The matching public key file is saved to the same location and name with ".pub" appended to the filename. Later instructions will assume that your private key file is named "id_rsa", but you may choose a different name now and use that name in those later instructions. +* Then, `upload the public key to Jetstream2`_ through the Exosphere web interface. Create an Instance --------------------- @@ -95,7 +95,7 @@ The Cornell Virtual Workshop topic `Creating an Instance`_ option -to open a terminal tab in your web browser. -You may also want to read about the `features of Guacamole`_, -which is the platform that supports both Web Shell and Web Desktop. 
+The Exosphere web dashboard provides the easy-to-use Web Shell for accessing your Jetstream2 instances,
+but after encountering some issues with this exercise when using Web Shell,
+we are recommending that you use the SSH command to access your instance from a shell on your computer.
+The instructions for `connecting to Jetstream2 using SSH`_
+can be executed in the Command Prompt on Windows (from the Start menu, type "cmd" and select Command Prompt)
+or from the Terminal application on a Mac.
+
+In either case you will need to know the location and name of the private SSH key created on your computer (see above),
+the IP address of your instance (found in the Exosphere web dashboard)
+and the default username on your instance, which is "exouser".
 
 Once you are logged in to the web shell you can proceed to the
 "Install Software and Download Data" section below.
+You will know that your login has been successful when the prompt has the form ``exouser@instance-name:~$``,
+which indicates your username, the instance name, and your current working directory, followed by "$".
 
 Managing a Jetstream2 Instance
 ------------------------------------
@@ -125,7 +132,7 @@ Instances incur costs whenever they are running (on Jetstream2, this is when the
 "Shelving" an instance stops it from using the cloud's CPUs and memory,
 and therefore stops it from incurring any charges against your allocation.
 
-When you are through working on this example,
+When you are through working on this exercise,
 be sure to use the instance's "Actions" menu in the web dashboard
 to "Shelve" the instance so that it is no longer spending your credits.
@@ -169,19 +176,19 @@ You can then "pull" (download) the I-WRF image that will be run as a container.
 
 The `instructions for installing Docker Engine on Ubuntu`_
-This first sequence sets up the Docker software repository on your instance:: +These commands run a script that sets up the Docker software repository on your instance, +then installs Docker and starts its daemon:: - sudo install -m 0755 -d /etc/apt/keyrings - sudo curl -fsSL https://download.docker.com/linux/ubuntu/gpg -o /etc/apt/keyrings/docker.asc - sudo chmod a+r /etc/apt/keyrings/docker.asc - echo "deb [arch=$(dpkg --print-architecture) signed-by=/etc/apt/keyrings/docker.asc] https://download.docker.com/linux/ubuntu $(. /etc/os-release && echo "$VERSION_CODENAME") stable" | sudo tee /etc/apt/sources.list.d/docker.list > /dev/null - sudo apt-get -y update + wget https://bit.ly/iwrf-docker > install-docker.sh + source install-docker.sh -Then you will install the Docker Engine from that repository:: +You can test whether the Docker command line tool was installed correctly by asking for its version, +and ask for the status of the Docker daemon to make sure it is running:: - sudo apt-get -y install docker-ce docker-ce-cli + docker --version + sudo systemctl --no-pager status docker -And finally, pull the latest version of the I-WRF image onto your instance:: +Once all of that is in order, pull the latest version of the I-WRF image onto your instance:: docker pull ncar/iwrf @@ -196,7 +203,7 @@ They take several minutes to complete:: wget https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_high_res_mandatory.tar.gz tar -xzf geog_high_res_mandatory.tar.gz - rm geog_high_res_mandatory.tar.tz + rm geog_high_res_mandatory.tar.gz Create the Run Folder ------------------------- @@ -209,7 +216,7 @@ The following commands create the empty folder and download the script into it, then change its permissions so it can be run:: mkdir matthew - curl https://gist.githubusercontent.com/Trumbore/27cef8073048cde7a8142af9bfb0b264/raw/1115ce9de4a30ad665055ed323c40a4e7aa411b2/run.sh > matthew/run.sh + curl https://bit.ly/run-iwrf > matthew/run.sh chmod 775 matthew/run.sh Run I-WRF @@ 
-230,5 +237,5 @@ The command has numerous arguments and options, which do the following: * ``ncar/iwrf:latest`` is the Docker image to use when creating the container. * ``/tmp/hurricane_matthew/run.sh`` is the location within the container of the script that it runs. -It takes about 12 minutes for the simulation to finish on an m3.quad Jetstream instance. +It takes about three minutes for the simulation to finish on an m3.quad Jetstream instance. From 099d4627de875dc9da62ab4a32cc952b965fe896 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Wed, 29 May 2024 14:34:22 -0400 Subject: [PATCH 11/36] Final edits before initial publication --- docs/Users_Guide/matthewjetstream.rst | 42 +++++++++++++++++++++------ 1 file changed, 33 insertions(+), 9 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 570125c..713c56d 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -80,8 +80,9 @@ before proceeding. Create an SSH Key ------------------- -You must upload a key pair to Jetstream before creating your instance so that you can log in to it, -as password-based logins are disabled on Jetstream2. +You must upload a public SSH key to Jetstream2 before creating your instance. +Jetstream2 injects that public key into the instance's default user account, +and you will need to provide the matching private SSH key to log in to the instance. If you are not familiar with "SSH key pairs", you should `read about them`_ before continuing. @@ -177,18 +178,25 @@ You can then "pull" (download) the I-WRF image that will be run as a container. The `instructions for installing Docker Engine on Ubuntu`_ are very thorough and make a good reference, but we only need to perform a subset of those steps. 
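Before running an install script like the one this patch introduces, a small guard can skip the work when the docker CLI is already present. The `has_cmd` helper below is defined here for illustration; it is not part of the tutorial or the install script:

```shell
# Guard sketch: only run the installer when docker is not already on the PATH.
# has_cmd is a local helper, not part of the tutorial.
has_cmd() { command -v "$1" >/dev/null 2>&1; }

if has_cmd docker; then
    echo "docker already installed; skipping install script"
else
    echo "docker not found; proceed with the install script"
fi
```

`command -v` is the POSIX-portable way to test for a program, and is preferable to parsing the output of `which`.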
These commands run a script that sets up the Docker software repository on your instance, -then installs Docker and starts its daemon:: +then installs Docker:: - wget https://bit.ly/iwrf-docker > install-docker.sh + curl --location https://bit.ly/3R3lqMU > install-docker.sh source install-docker.sh -You can test whether the Docker command line tool was installed correctly by asking for its version, -and ask for the status of the Docker daemon to make sure it is running:: +If a text dialog is displayed asking which services should be restarted, type ``Enter``. +When the installation is complete, you can verify that the Docker command line tool works by asking for its version:: docker --version + +Next, you must start the Docker daemon, which runs in the background and processes commands:: + + sudo service docker start + +If that command appeared to succeed, you can confirm its status with this command:: + sudo systemctl --no-pager status docker -Once all of that is in order, pull the latest version of the I-WRF image onto your instance:: +Once all of that is in order, you must pull the latest version of the I-WRF image onto your instance:: docker pull ncar/iwrf @@ -216,7 +224,7 @@ The following commands create the empty folder and download the script into it, then change its permissions so it can be run:: mkdir matthew - curl https://bit.ly/run-iwrf > matthew/run.sh + curl --location https://bit.ly/3KoBtRK > matthew/run.sh chmod 775 matthew/run.sh Run I-WRF @@ -237,5 +245,21 @@ The command has numerous arguments and options, which do the following: * ``ncar/iwrf:latest`` is the Docker image to use when creating the container. * ``/tmp/hurricane_matthew/run.sh`` is the location within the container of the script that it runs. -It takes about three minutes for the simulation to finish on an m3.quad Jetstream instance. +The simulation initially prints lots of information while initializing things, then settles in to the computation. 
+The provided configuration simulates 12 hours of weather and takes under three minutes to finish on an m3.quad Jetstream2 instance.
+Once completed, you can view the end of any of the output files to confirm that it succeeded::
+
+    tail matthew/rsl.out.0000
+
+The output should look something like this::
+
+    Timing for main: time 2016-10-06_11:42:30 on domain 1: 0.23300 elapsed seconds
+    Timing for main: time 2016-10-06_11:45:00 on domain 1: 0.23366 elapsed seconds
+    Timing for main: time 2016-10-06_11:47:30 on domain 1: 2.77688 elapsed seconds
+    Timing for main: time 2016-10-06_11:50:00 on domain 1: 0.23415 elapsed seconds
+    Timing for main: time 2016-10-06_11:52:30 on domain 1: 0.23260 elapsed seconds
+    Timing for main: time 2016-10-06_11:55:00 on domain 1: 0.23354 elapsed seconds
+    Timing for main: time 2016-10-06_11:57:30 on domain 1: 0.23345 elapsed seconds
+    Timing for main: time 2016-10-06_12:00:00 on domain 1: 0.23407 elapsed seconds
+    Timing for Writing wrfout_d01_2016-10-06_12:00:00 for domain 1: 0.32534 elapsed seconds
+    d01 2016-10-06_12:00:00 wrf: SUCCESS COMPLETE WRF

From 703af0481110d3c685bcaa5048dd50aa7012f371 Mon Sep 17 00:00:00 2001
From: George McCabe <23407799+georgemccabe@users.noreply.github.com>
Date: Wed, 29 May 2024 13:29:58 -0600
Subject: [PATCH 12/36] fixed URL links, fixed header underline lengths, added
 link to matthew jetstream page from use cases page

---
 docs/Users_Guide/matthewjetstream.rst | 69 ++++++++++++++------------
 docs/Users_Guide/usecases.rst         |  6 ++-
 2 files changed, 39 insertions(+), 36 deletions(-)

diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst
index 713c56d..3a1a474 100644
--- a/docs/Users_Guide/matthewjetstream.rst
+++ b/docs/Users_Guide/matthewjetstream.rst
@@ -1,8 +1,8 @@
 Running I-WRF On Jetstream2 with Hurricane Matthew Data
-****************************************************************
+*******************************************************
 
 Overview
-=================
+======== The following instructions can be used to run the `I-WRF weather simulation program `_ @@ -26,28 +26,28 @@ Most sections refer to external documentation to provide details about the neces and to offer additional background information. Prepare to Use Jetstream2 -=============================== +========================= -To `get started with Jetstream2`_, +To `get started with Jetstream2 `_, you will need to: -* Create an account with the `National Science Foundation (NSF)`_'s `ACCESS program`_. +* Create an account with the `National Science Foundation (NSF) `_'s `ACCESS program `_. * Request a computational "allocation" from ACCESS. * Log in to Jetstream2's web portal. The sections below will guide you through this process. Create an ACCESS Account --------------------------------- +------------------------ -If you do not already have one, `register for an ACCESS account`_. +If you do not already have one, `register for an ACCESS account `_. Note that you can either choose to associate your existing University/Organizational account or create an entirely new ACCESS account when registering. Get an Allocation -------------------- +----------------- -With your ACCESS account set up, you may `request an allocation`_ +With your ACCESS account set up, you may `request an allocation `_ that will allow you to use an ACCESS-affiliated cyberinfrastructure resource. Be sure to read all of the information on that page so that you make a suitable request. An "Explore" project will be sufficient to work with this exercise, @@ -55,44 +55,44 @@ and you will want to work with the resource "Indiana Jetstream2 CPU" (*not* GPU) The typical turnaround time for allocation requests is one business day. Log in to the Exosphere Web Site ------------------------------------- +-------------------------------- Once you have an ACCESS account and allocation, -you can log in to their `Exosphere web dashboard`_. +you can log in to their `Exosphere web dashboard `_. 
The process of identifying your allocation and ACCESS ID to use Jetstream2 -is described on `this page`_ of the -`Introduction to Jetstream2`_ Cornell Virtual Workshop, -and on `this page`_ -of the `Jetstream2 documentation`_. +is described on `this page `_ of the +`Introduction to Jetstream2 `_ Cornell Virtual Workshop, +and on `this page `_ +of the `Jetstream2 documentation `_. While adding an allocation to your account, it is recommended that you choose the "Indiana University" region of Jetstream2 for completing this exercise. Create a Cloud Instance and Log In -==================================== +================================== After you have logged in to Jetstream2 and added your allocation to your account, you are ready to create the cloud instance where you will run the I-WRF simulation. If you are not familiar with the cloud computing terms "image" and "instance", -it is recommended that you `read about them`_ +it is recommended that you `read about them `_ before proceeding. Create an SSH Key -------------------- +----------------- You must upload a public SSH key to Jetstream2 before creating your instance. Jetstream2 injects that public key into the instance's default user account, and you will need to provide the matching private SSH key to log in to the instance. If you are not familiar with "SSH key pairs", you should -`read about them`_ before continuing. +`read about them `_ before continuing. -* First, `create an SSH Key on your computer`_ using the "ssh-keygen" command. That command allows you to specify the name and location of the private key file it creates, with the default being "id_rsa". The matching public key file is saved to the same location and name with ".pub" appended to the filename. Later instructions will assume that your private key file is named "id_rsa", but you may choose a different name now and use that name in those later instructions. -* Then, `upload the public key to Jetstream2`_ through the Exosphere web interface. 
+* First, `create an SSH Key on your computer `_ using the "ssh-keygen" command. That command allows you to specify the name and location of the private key file it creates, with the default being "id_rsa". The matching public key file is saved to the same location and name with ".pub" appended to the filename. Later instructions will assume that your private key file is named "id_rsa", but you may choose a different name now and use that name in those later instructions.
+* Then, `upload the public key to Jetstream2 `_ through the Exosphere web interface.

 Create an Instance
----------------------
+------------------

-The Cornell Virtual Workshop topic `Creating an Instance`_
+The Cornell Virtual Workshop topic `Creating an Instance `_
 provides detailed information about creating a Jetstream2 instance.
 While following those steps, be sure to make the following choices for this instance:

@@ -106,12 +106,12 @@ After clicking the "Create" button, wait for the instance to enter the "Ready" s
 Note that the instance will not only be created, but will be running so that you can log in right away.

 Log in to the Instance
------------------------------
+----------------------

 The Exosphere web dashboard provides the easy-to-use Web Shell for accessing your Jetstream2 instances,
 but after encountering some issues with this exercise when using Web Shell,
 we are recommending that you use the SSH command to access your instance from a shell on your computer.
-The instructions for `connecting to Jetstream2 using SSH`_
+The instructions for `connecting to Jetstream2 using SSH `_
 can be executed in the Command Prompt on Windows (from the Start menu, type "cmd" and select Command Prompt)
 or from the Terminal application on a Mac.
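The key-pair step described above can be sketched as follows. This is a minimal illustration, assuming OpenSSH's ``ssh-keygen`` is available locally; the temporary directory stands in for ``~/.ssh``, and the instance IP in the final comment is a placeholder you must replace with the address shown in the Exosphere dashboard:

```shell
# Create an RSA key pair with an empty passphrase (-N ""); -f names the
# private key file, and the matching public key gets a ".pub" suffix.
KEY_DIR=$(mktemp -d)            # stand-in for ~/.ssh in this sketch
ssh-keygen -q -t rsa -f "$KEY_DIR/id_rsa" -N ""
ls "$KEY_DIR"                   # id_rsa (private key) and id_rsa.pub (public key)

# After uploading id_rsa.pub through Exosphere and creating the instance,
# the login command has this shape (placeholder IP):
#   ssh -i "$KEY_DIR/id_rsa" exouser@149.165.xxx.xxx
```

If you choose a name other than "id_rsa", use that same name in the later login command.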
@@ -125,10 +125,10 @@ You will know that your login has been successful when the prompt has the form ` which indicates your username, the instance name, and your current working directory, followed by "$" Managing a Jetstream2 Instance ------------------------------------- +------------------------------ In order to use cloud computing resources efficiently, you must know how to -`manage your instances`_. +`manage your instances `_. Instances incur costs whenever they are running (on Jetstream2, this is when they are "Ready"). "Shelving" an instance stops it from using the cloud's CPUs and memory, and therefore stops it from incurring any charges against your allocation. @@ -147,7 +147,7 @@ But of course, doubling the number of CPUs doubles the cost per hour to run the so Shelving as soon as you are done becomes even more important! Install Software and Download Data -===================================== +================================== With your instance created and running and you logged in to it through a Web Shell, you can now install the necessary software and download the data to run the simulation. @@ -167,15 +167,15 @@ In the Exosphere dashboard page for your instance, in the Actions menu, select " The process takes several minutes, after which the instance status will return to "Ready". Install Docker and Get the I-WRF Image ------------------------------------------ +-------------------------------------- As mentioned above, the I-WRF simulation application is provided as a Docker image that will run as a -`"container"`_ +`"container" `_ on your cloud instance. To run a Docker container, you must first install the Docker Engine on your instance. You can then "pull" (download) the I-WRF image that will be run as a container. -The `instructions for installing Docker Engine on Ubuntu`_ +The `instructions for installing Docker Engine on Ubuntu `_ are very thorough and make a good reference, but we only need to perform a subset of those steps. 
These commands run a script that sets up the Docker software repository on your instance, then installs Docker:: @@ -201,7 +201,7 @@ Once all of that is in order, you must pull the latest version of the I-WRF imag docker pull ncar/iwrf Get the Geographic Data ----------------------------- +----------------------- To run I-WRF on the Hurricane Matthew data set, you need a copy of the geographic data representing the terrain in the area of the simulation. @@ -214,7 +214,7 @@ They take several minutes to complete:: rm geog_high_res_mandatory.tar.gz Create the Run Folder -------------------------- +--------------------- The simulation is performed using a script that must first be downloaded. The script expects to run in a folder where it can download data files and create result files. @@ -228,7 +228,7 @@ then change its permissions so it can be run:: chmod 775 matthew/run.sh Run I-WRF -=========== +========= With everything in place, you are now ready to run the Docker container that will perform the simulation. The downloaded script runs inside the container, prints lots of status information, @@ -263,3 +263,4 @@ The output should look something like this:: Timing for main: time 2016-10-06_12:00:00 on domain 1: 0.23407 elapsed seconds Timing for Writing wrfout_d01_2016-10-06_12:00:00 for domain 1: 0.32534 elapsed seconds d01 2016-10-06_12:00:00 wrf: SUCCESS COMPLETE WRF + diff --git a/docs/Users_Guide/usecases.rst b/docs/Users_Guide/usecases.rst index 0a59d21..aea980d 100644 --- a/docs/Users_Guide/usecases.rst +++ b/docs/Users_Guide/usecases.rst @@ -5,8 +5,10 @@ Use Cases Generic CONUS “interesting weather” =================================== -`Hurricane Matthew on Jetstream2 ` -==================================================== +Hurricane Matthew on Jetstream2 +=============================== + +Navigate to :ref:`matthewjetstream` for more information. 
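The run-folder preparation above (create the folder, download the script, make it executable) can be sanity-checked with a short sketch. The folder name mirrors the tutorial's "matthew" directory, and the script body here is only a stand-in for the real ``run.sh`` fetched with curl:

```shell
# Recreate the run-folder layout and verify the permission step; RUN_DIR
# stands in for ~/matthew, and the script body is a placeholder for the
# downloaded run.sh.
RUN_DIR=$(mktemp -d)/matthew
mkdir -p "$RUN_DIR"
printf '#!/bin/sh\necho "simulation started"\n' > "$RUN_DIR/run.sh"
chmod 775 "$RUN_DIR/run.sh"
test -x "$RUN_DIR/run.sh" && echo "run.sh is executable"
```

A quick check like this catches a missed ``chmod`` before the much longer Docker run is started.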
Land Use/Land Cover Change ========================== From bdcc262bcaa387f0124219248d3f88dfe1be1ff8 Mon Sep 17 00:00:00 2001 From: George McCabe <23407799+georgemccabe@users.noreply.github.com> Date: Wed, 29 May 2024 13:35:26 -0600 Subject: [PATCH 13/36] added ID so heading can be linked using :ref: --- docs/Users_Guide/matthewjetstream.rst | 2 ++ 1 file changed, 2 insertions(+) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 3a1a474..0f651b0 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -1,3 +1,5 @@ +.. _matthewjetstream: + Running I-WRF On Jetstream2 with Hurricane Matthew Data ******************************************************* From e194514edf41fce49ad4fcca2f0fdc561595e6f7 Mon Sep 17 00:00:00 2001 From: George McCabe <23407799+georgemccabe@users.noreply.github.com> Date: Wed, 29 May 2024 13:42:18 -0600 Subject: [PATCH 14/36] use double underscores to create anonymous reference for links that have the same text --- docs/Users_Guide/matthewjetstream.rst | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 0f651b0..4942ca5 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -62,9 +62,9 @@ Log in to the Exosphere Web Site Once you have an ACCESS account and allocation, you can log in to their `Exosphere web dashboard `_. The process of identifying your allocation and ACCESS ID to use Jetstream2 -is described on `this page `_ of the +is described on `this page `__ of the `Introduction to Jetstream2 `_ Cornell Virtual Workshop, -and on `this page `_ +and on `this page `__ of the `Jetstream2 documentation `_. 
While adding an allocation to your account, it is recommended that you choose @@ -76,7 +76,7 @@ Create a Cloud Instance and Log In After you have logged in to Jetstream2 and added your allocation to your account, you are ready to create the cloud instance where you will run the I-WRF simulation. If you are not familiar with the cloud computing terms "image" and "instance", -it is recommended that you `read about them `_ +it is recommended that you `read about them `__ before proceeding. Create an SSH Key @@ -86,7 +86,7 @@ You must upload a public SSH key to Jetstream2 before creating your instance. Jetstream2 injects that public key into the instance's default user account, and you will need to provide the matching private SSH key to log in to the instance. If you are not familiar with "SSH key pairs", you should -`read about them `_ before continuing. +`read about them `__ before continuing. * First, `create an SSH Key on your computer `_ using the "ssh-keygen" command. That command allows you to specify the name and location of the private key file it creates, with the default being "id_rsa". The matching public key file is saved to the same location and name with ".pub" appended to the filename. Later instructions will assume that your private key file is named "id_rsa", but you may choose a different name now and use that name in those later instructions. * Then, `upload the public key to Jetstream2 `_ through the Exosphere web interface. 
From 54c3b295756a1069b80e84aea92a98b229fc0fe5 Mon Sep 17 00:00:00 2001 From: George McCabe <23407799+georgemccabe@users.noreply.github.com> Date: Wed, 29 May 2024 13:46:32 -0600 Subject: [PATCH 15/36] updated versions of actions to prevent node.js deprecated warnings --- .github/workflows/documentation.yml | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/.github/workflows/documentation.yml b/.github/workflows/documentation.yml index b2a40e9..d5b030f 100644 --- a/.github/workflows/documentation.yml +++ b/.github/workflows/documentation.yml @@ -17,8 +17,8 @@ jobs: name: Documentation runs-on: ubuntu-latest steps: - - uses: actions/checkout@v3 - - uses: actions/setup-python@v4 + - uses: actions/checkout@v4 + - uses: actions/setup-python@v5 with: python-version: '3.8' - name: Install dependencies @@ -27,12 +27,12 @@ jobs: python -m pip install python-dateutil requests - name: Build Documentation run: ./.github/jobs/build_documentation.sh - - uses: actions/upload-artifact@v3 + - uses: actions/upload-artifact@v4 if: always() with: name: i-wrf_documentation path: artifact/documentation - - uses: actions/upload-artifact@v3 + - uses: actions/upload-artifact@v4 if: failure() with: name: documentation_warnings.log From 26ba3de9bfb74cbe21bb5b0d2b1f142a4038dec9 Mon Sep 17 00:00:00 2001 From: George McCabe <23407799+georgemccabe@users.noreply.github.com> Date: Wed, 29 May 2024 13:55:55 -0600 Subject: [PATCH 16/36] added orphan identifier to prevent warning that this page is not included in the toctree --- docs/Users_Guide/matthewjetstream.rst | 2 ++ 1 file changed, 2 insertions(+) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 4942ca5..dc8367f 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -1,3 +1,5 @@ +:orphan: + .. 
_matthewjetstream: Running I-WRF On Jetstream2 with Hurricane Matthew Data From 1bf8a00f7a864e31df22d6a960cde97d39e32885 Mon Sep 17 00:00:00 2001 From: George McCabe <23407799+georgemccabe@users.noreply.github.com> Date: Wed, 29 May 2024 14:00:29 -0600 Subject: [PATCH 17/36] fixed typos --- docs/Users_Guide/matthewjetstream.rst | 6 +++--- 1 file changed, 3 insertions(+), 3 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index dc8367f..8f0ae79 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -22,7 +22,7 @@ but a cloud computing platform can provided the needed computational power. Jetstream2 is a national cyberinfrastructure resource that is easy to use and is available to researchers and educators. This exercise runs the I-WRF program as a Docker "container", -which simplifyies the set-up work needed to run the simulation. +which simplifies the set-up work needed to run the simulation. It is recommended that you follow the instructions in each section in the order presented to avoid encountering issues during the process. @@ -102,7 +102,7 @@ While following those steps, be sure to make the following choices for this inst * When choosing an image as the instance source, if viewing "By Type", select the "Ubuntu 22.04 (latest)" image. If viewing "By Image", choose the "Featured-Ubuntu22" image. * Choose the "Flavor" m3.quad (4 CPUs) to provide a faster simulation run-time. -* Select a custom disk size of 100 GB - larege enough to hold this exercise's data and results. +* Select a custom disk size of 100 GB - large enough to hold this exercise's data and results. * Select the SSH public key that you uploaded previously. * You do not need to set any of the Advanced Options. @@ -245,7 +245,7 @@ The command has numerous arguments and options, which do the following: * ``time docker run`` prints the runtime of the "docker run" command. 
* ``--shm-size 14G -it`` tells the command how much shared memory to use, and to run interactively in the shell. -* The ``-v`` options map folders in your cloud instance to paths within the contianer. +* The ``-v`` options map folders in your cloud instance to paths within the container. * ``ncar/iwrf:latest`` is the Docker image to use when creating the container. * ``/tmp/hurricane_matthew/run.sh`` is the location within the container of the script that it runs. From bc4e97d6c33339cfd8b944629e61380d81bf1f02 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Wed, 26 Jun 2024 13:53:46 -0400 Subject: [PATCH 18/36] Refactor the existing doc in preparation for adding METPlus doc --- docs/Users_Guide/matthewjetstream.rst | 82 ++++++++++++++++----------- 1 file changed, 50 insertions(+), 32 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 8f0ae79..681ba85 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -8,21 +8,24 @@ Running I-WRF On Jetstream2 with Hurricane Matthew Data Overview ======== -The following instructions can be used to run -the `I-WRF weather simulation program `_ +The following instructions can be used to run elements of +the `I-WRF weather simulation framework `_ from the `National Center for Atmospheric Research (NCAR) `_ +and the `Cornell Center for Advanced Computing `_. +The steps below run the `Weather Research & Forecasting (WRF) model `_ +and the `METPlus `_ verification framework with data from `Hurricane Matthew `_ on the `Jetstream2 cloud computing platform `_. This exercise provides an introduction to using cloud computing platforms, -running computationally complex simulations and using containerized applications. +running computationally complex simulations and analyses, and using containerized applications. 
-Simulations like I-WRF often require greater computing resources
+Simulations like WRF often require greater computing resources
 than you may have on your personal computer,
 but a cloud computing platform can provide the needed computational power.
 Jetstream2 is a national cyberinfrastructure resource that is easy to use
 and is available to researchers and educators.
-This exercise runs the I-WRF program as a Docker "container",
-which simplifies the set-up work needed to run the simulation.
+This exercise runs the I-WRF programs as Docker "containers",
+which simplifies the set-up work needed to run the simulation and analysis.

 It is recommended that you follow the instructions in each section
 in the order presented to avoid encountering issues during the process.
@@ -76,7 +79,7 @@ Create a Cloud Instance and Log In
 ==================================

 After you have logged in to Jetstream2 and added your allocation to your account,
-you are ready to create the cloud instance where you will run the I-WRF simulation.
+you are ready to create the cloud instance where you will run the simulation and analysis.
 If you are not familiar with the cloud computing terms "image" and "instance",
 it is recommended that you `read about them `__
 before proceeding.
@@ -123,7 +126,7 @@ In either case you will need to know the location and name of the private SSH ke
 the IP address of your instance (found in the Exosphere web dashboard)
 and the default username on your instance, which is "exouser".

-Once you are logged in to the web shell you can proceed to the
+Once you are logged in to the instance you can proceed to the
 "Install Software and Download Data" section below.
 You will know that your login has been successful when the prompt has the form ``exouser@instance-name:~$``,
 which indicates your username, the instance name, and your current working directory, followed by "$"
@@ -153,31 +156,31 @@ so Shelving as soon as you are done becomes even more important!
 Install Software and Download Data
 ==================================

-With your instance created and running and you logged in to it through a Web Shell,
-you can now install the necessary software and download the data to run the simulation.
+With your instance created and running and you logged in to it through SSH,
+you can now install the necessary software and download the data to run the simulation and analysis.
 You will only need to perform these steps once,
 as they essentially change the contents of the instance's disk and
 those changes will remain even after the instance is shelved and unshelved.

-The following sections instruct you to issue numerous Linux commands in your web shell.
+The following sections instruct you to issue numerous Linux commands in your shell.
 If you are not familiar with Linux, you may want to refer to
 `An Introduction to Linux `_ when working through these steps.
 The commands in each section can be copied using the button in the upper right corner
-and then pasted into your web shell by right-clicking.
+and then pasted into your shell by right-clicking.

-If your web shell ever becomes unresponsive or disconnected from the instance,
+If your shell ever becomes unresponsive or disconnected from the instance,
 you can recover from that situation by rebooting the instance.
 In the Exosphere dashboard page for your instance, in the Actions menu, select "Reboot".
 The process takes several minutes, after which the instance status will return to "Ready".

-Install Docker and Get the I-WRF Image
---------------------------------------
+Install Docker and Get the WRF and METPlus Docker Images
+--------------------------------------------------------

-As mentioned above, the I-WRF simulation application is provided as a Docker image that will run as a
+As mentioned above, the WRF simulation and METPlus analysis applications are provided as Docker images that will run as a
 `"container" `_
 on your cloud instance.
To run a Docker container, you must first install the Docker Engine on your instance. -You can then "pull" (download) the I-WRF image that will be run as a container. +You can then "pull" (download) the WRF and METPlus images that will be run as containers. The `instructions for installing Docker Engine on Ubuntu `_ are very thorough and make a good reference, but we only need to perform a subset of those steps. @@ -200,14 +203,18 @@ If that command appeared to succeed, you can confirm its status with this comman sudo systemctl --no-pager status docker -Once all of that is in order, you must pull the latest version of the I-WRF image onto your instance:: +Once all of that is in order, you must pull the latest versions of the WRF and METPlus images onto your instance. +We define environment variables here and elsewhere to ensure consistent IDs for containers and folders:: - docker pull ncar/iwrf + WRF_IMAGE=ncar/iwrf:latest + METPLUS_IMAGE=dtcenter/metplus-dev:develop + docker pull ${WRF_IMAGE} + docker pull ${METPLUS_IMAGE} Get the Geographic Data ----------------------- -To run I-WRF on the Hurricane Matthew data set, you need a copy of the +To run WRF on the Hurricane Matthew data set, you need a copy of the geographic data representing the terrain in the area of the simulation. These commands download an archive file containing that data, uncompress the archive into a folder named "WPS_GEOG", and delete the archive file. @@ -217,29 +224,38 @@ They take several minutes to complete:: tar -xzf geog_high_res_mandatory.tar.gz rm geog_high_res_mandatory.tar.gz -Create the Run Folder ---------------------- +Create the WRF Run Folder +------------------------- The simulation is performed using a script that must first be downloaded. The script expects to run in a folder where it can download data files and create result files. -The instructions in this exercise create that folder in the user's home directory and name it "matthew". 
+The instructions in this exercise create a folder (named "wrf") under the user's home directory,
+and a sub-folder within "wrf" to hold the output of this simulation.
+The subfolder is named "20161006_00", the beginning date and time of the simulation.
 The simulation script is called "run.sh".
-The following commands create the empty folder and download the script into it,
+The following commands create the empty folders and download the script into them,
 then change its permissions so it can be run::

-  mkdir matthew
-  curl --location https://bit.ly/3KoBtRK > matthew/run.sh
-  chmod 775 matthew/run.sh
+  WRF_DIR=/home/exouser/wrf/20161006_00
+  mkdir -p ${WRF_DIR}
+  curl --location https://bit.ly/3KoBtRK > ${WRF_DIR}/run.sh
+  chmod 775 ${WRF_DIR}/run.sh

-Run I-WRF
-=========
+Get the Observed Weather Data
+-----------------------------
+
+Set up the METPlus Directories
+------------------------------
+
+Run WRF
+=======

 With everything in place, you are now ready to run the Docker container that will perform the simulation.
 The downloaded script runs inside the container, prints lots of status information,
 and creates output files in the run folder you created.
-Execute this command to run the simulation in your web shell::
+Execute this command to run the simulation in your shell::

-    time docker run --shm-size 14G -it -v ~/:/home/wrfuser/terrestrial_data -v ~/matthew:/tmp/hurricane_matthew ncar/iwrf:latest /tmp/hurricane_matthew/run.sh
+    time docker run --shm-size 14G -it -v ${WORKING_DIR}:/home/wrfuser/terrestrial_data -v ${WRF_DIR}:/tmp/hurricane_matthew ${WRF_IMAGE} /tmp/hurricane_matthew/run.sh

 The command has numerous arguments and options, which do the following:

@@ -250,7 +266,7 @@ The simulation initially prints lots of information while initializing things, t
-The provided configuration simulates 12 hours of weather and takes under three minutes to finish on an m3.quad Jetstream2 instance. +The provided configuration simulates 48 hours of weather and takes about 12 minutes to finish on an m3.quad Jetstream2 instance. Once completed, you can view the end of any of the output files to confirm that it succeeded:: tail matthew/rsl.out.0000 @@ -268,3 +284,5 @@ The output should look something like this:: Timing for Writing wrfout_d01_2016-10-06_12:00:00 for domain 1: 0.32534 elapsed seconds d01 2016-10-06_12:00:00 wrf: SUCCESS COMPLETE WRF +Run METPlus +=========== From 14b22a43505adca60d44acfd4845e534fb121244 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Wed, 26 Jun 2024 14:38:57 -0400 Subject: [PATCH 19/36] Add initial version of METPlus instructions These steps were migrated from work by George, modified to fit with the WRF instructions already present here. --- docs/Users_Guide/matthewjetstream.rst | 55 ++++++++++++++++++++++----- 1 file changed, 46 insertions(+), 9 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 681ba85..f6a09e4 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -12,7 +12,7 @@ The following instructions can be used to run elements of the `I-WRF weather simulation framework `_ from the `National Center for Atmospheric Research (NCAR) `_ and the `Cornell Center for Advanced Computing `_. -The steps below run the `Weather Research & Forecasting (WRF) model `_ +The steps below run the `Weather Research & Forecasting (WRF) `_ model and the `METPlus `_ verification framework with data from `Hurricane Matthew `_ on the `Jetstream2 cloud computing platform `_. @@ -129,7 +129,7 @@ and the default username on your instance, which is "exouser". Once you are logged in to the instance you can proceed to the "Install Software and Download Data" section below. 
You will know that your login has been successful when the prompt has the form ``exouser@instance-name:~$``, -which indicates your username, the instance name, and your current working directory, followed by "$" +which indicates your username, the instance name, and your current working directory, followed by "$". Managing a Jetstream2 Instance ------------------------------ @@ -203,8 +203,8 @@ If that command appeared to succeed, you can confirm its status with this comman sudo systemctl --no-pager status docker -Once all of that is in order, you must pull the latest versions of the WRF and METPlus images onto your instance. -We define environment variables here and elsewhere to ensure consistent IDs for containers and folders:: +Once all of that is in order, you must pull the correct versions of the WRF and METPlus images onto your instance. +We define environment variables here and below to ensure that consistent IDs are used for containers and folders:: WRF_IMAGE=ncar/iwrf:latest METPLUS_IMAGE=dtcenter/metplus-dev:develop @@ -217,7 +217,7 @@ Get the Geographic Data To run WRF on the Hurricane Matthew data set, you need a copy of the geographic data representing the terrain in the area of the simulation. These commands download an archive file containing that data, -uncompress the archive into a folder named "WPS_GEOG", and delete the archive file. +uncompress the archive into a folder named "WPS_GEOG" in your home directory, and delete the archive file. They take several minutes to complete:: wget https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_high_res_mandatory.tar.gz @@ -231,12 +231,13 @@ The simulation is performed using a script that must first be downloaded. The script expects to run in a folder where it can download data files and create result files. The instructions in this exercise create a folder (named "wrf") under the user's home directory, and a sub-folder within "wrf" to hold the output of this simulation. 
-The subfolder is named "20161006_00", the beginning date and time of the simulation.
+The subfolder is named "20161006_00", which is the beginning date and time of the simulation.
 The simulation script is called "run.sh".
 The following commands create the empty folders and download the script into them,
 then change its permissions so it can be run::

-  WRF_DIR=/home/exouser/wrf/20161006_00
+  WORKING_DIR=/home/exouser
+  WRF_DIR=${WORKING_DIR}/wrf/20161006_00
   mkdir -p ${WRF_DIR}
   curl --location https://bit.ly/3KoBtRK > ${WRF_DIR}/run.sh
   chmod 775 ${WRF_DIR}/run.sh

 Get the Observed Weather Data
 -----------------------------

+The METPlus analysis will compare the results of the WRF simulation against
+the actual weather data that was recorded during Hurricane Matthew.
+We will download that data by pulling a Docker image that holds it and
+creating a container from that image, then referencing that container's
+data volume when we run the METPlus Docker container.
+The commands to pull the image and create the container are::
+
+  OBS_DATA_VOL=data-matthew-input-obs
+  docker pull ncar/iwrf:data-matthew-input-obs.docker
+  docker create --name ${OBS_DATA_VOL} ncar/iwrf:data-matthew-input-obs.docker
+
 Set up the METPlus Directories
 ------------------------------

+METPlus requires a folder into which it can download files and write its output, so we will create one::
+
+  METPLUS_DIR=${WORKING_DIR}/metplus
+  mkdir -p ${METPLUS_DIR}
+
+It also needs some configuration files to direct its behavior, which are contained in
+the I-WRF GitHub repository and can be downloaded with these commands::
+
+  git clone https://github.com/NCAR/i-wrf ${WORKING_DIR}/i-wrf
+  METPLUS_CONFIG_DIR=${WORKING_DIR}/i-wrf/use_cases/Hurricane_Matthew/METplus
+
 Run WRF
 =======

 With everything in place, you are now ready to run the Docker container that will perform the simulation.
 The downloaded script runs inside the container, prints lots of status information,
 and creates output files in the run folder you created.
Execute this command to run the simulation in your shell:: - time docker run --shm-size 14G -it -v ~/:/home/wrfuser/terrestrial_data -v ${WRF_DIR}:/tmp/hurricane_matthew ${WRF_IMAGE} /tmp/hurricane_matthew/run.sh + time docker run --shm-size 14G -it -v ${WORKING_DIR}:/home/wrfuser/terrestrial_data -v ${WRF_DIR}:/tmp/hurricane_matthew ${WRF_IMAGE} /tmp/hurricane_matthew/run.sh The command has numerous arguments and options, which do the following: @@ -269,7 +291,7 @@ The simulation initially prints lots of information while initializing things, t The provided configuration simulates 48 hours of weather and takes about 12 minutes to finish on an m3.quad Jetstream2 instance. Once completed, you can view the end of any of the output files to confirm that it succeeded:: - tail matthew/rsl.out.0000 + tail ${WRF_DIR}/rsl.out.0000 The output should look something like this:: @@ -286,3 +308,18 @@ The output should look something like this:: Run METPlus =========== + +After the WRF simulation has finished, you can run the METPlus analysis to compare the simulated results +to the actual weather observations during the hurricane. +We use command line options to tell the METPlus container several things, including where the observed data is located, +where the METPlus configuration can be found, where the WRF output data is located, and where it should create its output files:: + + docker run --rm -it --volumes-from ${OBS_DATA_VOL} -v $METPLUS_CONFIG_DIR:/config -v ${WRF_DIR}:/data/input/wrf -v ${METPLUS_DIR}:/data/output ${METPLUS_IMAGE} /metplus/METplus/ush/run_metplus.py /config/PointStat_matthew.conf + +As the analysis is performed, progress information is displayed. It is not uncommon to see "WARNING" messages in this output, +but you should only be alarmed if you see messages with the text "ERROR". +METPlus first makes two passes over each of the 48 hourly observation time-slices, +converting data files to a suitable format for the analysis. 
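The ``tail`` check above can also be scripted, since WRF's final log line contains a fixed success marker. A minimal sketch, assuming the log layout shown in the sample output; the ``wrf_succeeded`` helper name is our own.

```shell
# Sketch: report success if the end of an rsl.out log contains WRF's
# "SUCCESS COMPLETE WRF" marker.
wrf_succeeded() {
  logfile="$1"
  tail -n 20 "$logfile" | grep -q "SUCCESS COMPLETE WRF"
}

# usage on the instance:
#   wrf_succeeded "${WRF_DIR}/rsl.out.0000" && echo "simulation completed"
```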
+It then performs statistical analysis on the data from the earth's surface and on the pressure level data. + +TBD: If no errors reported, how to see if successful and/or view results. From 36f08fe6bd4a4bba8b0187f3efcb10767a877b4a Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Wed, 26 Jun 2024 15:57:35 -0400 Subject: [PATCH 20/36] Tweaks to documentation after full testing The final upper_air analysis still fails due to a lack of pressure level data files from the WRF simulation. --- docs/Users_Guide/matthewjetstream.rst | 20 +++++++++++++------- 1 file changed, 13 insertions(+), 7 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index f6a09e4..0badf69 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -195,13 +195,18 @@ When the installation is complete, you can verify that the Docker command line t docker --version -Next, you must start the Docker daemon, which runs in the background and processes commands:: +The Docker daemon should start automatically, but it sometimes runs into issues. +First, check to see if the daemon started successfully:: - sudo service docker start + sudo systemctl --no-pager status docker -If that command appeared to succeed, you can confirm its status with this command:: +If you see a message saying the daemon failed to start because a "Start request repeated too quickly", +wait a few minutes and issue this command to try again to start it:: - sudo systemctl --no-pager status docker + sudo systemctl start docker + +If the command seems to succeed, confirm that the daemon is running using the status command above, +and repeat these efforts as necessary until it is started. Once all of that is in order, you must pull the correct versions of the WRF and METPlus images onto your instance. 
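The check-wait-retry procedure described above is a common shell pattern that can be written as a small loop. This is a generic sketch only; the helper name, attempt count, and delay are placeholders, not an official procedure.

```shell
# Sketch: run a command until it succeeds, waiting between attempts.
retry() {
  attempts="$1"
  delay="$2"
  shift 2
  i=1
  while [ "$i" -le "$attempts" ]; do
    "$@" && return 0
    echo "attempt ${i}/${attempts} failed; retrying in ${delay}s" >&2
    sleep "$delay"
    i=$((i + 1))
  done
  return 1
}

# on the instance, something like:
#   retry 5 60 sudo systemctl start docker
```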
We define environment variables here and below to ensure that consistent IDs are used for containers and folders:: @@ -239,7 +244,7 @@ then change its permissions so it can be run:: WORKING_DIR=/home/exouser WRF_DIR=${WORKING_DIR}/wrf/20161006_00 mkdir -p ${WRF_DIR} - curl --location https://bit.ly/3KoBtRK > ${WRF_DIR}/run.sh + curl --location https://bit.ly/4ccWsTY > ${WRF_DIR}/run.sh chmod 775 ${WRF_DIR}/run.sh Get the Observed Weather Data @@ -264,7 +269,7 @@ METPlus requires a folder into which it can download files and write its output, mkdir -p ${METPLUS_DIR} It also needs some configuration files to direct its behavior, which we are contained in -the I-WRF GitHub repository and can be downloaded with these commands: +the I-WRF GitHub repository and can be downloaded with these commands:: git clone https://github.com/NCAR/i-wrf ${WORKING_DIR}/i-wrf METPLUS_CONFIG_DIR=${WORKING_DIR}/i-wrf/use_cases/Hurricane_Matthew/METplus @@ -311,10 +316,11 @@ Run METPlus After the WRF simulation has finished, you can run the METPlus analysis to compare the simulated results to the actual weather observations during the hurricane. +The analysis takes about five minutes to complete. We use command line options to tell the METPlus container several things, including where the observed data is located, where the METPlus configuration can be found, where the WRF output data is located, and where it should create its output files:: - docker run --rm -it --volumes-from ${OBS_DATA_VOL} -v $METPLUS_CONFIG_DIR:/config -v ${WRF_DIR}:/data/input/wrf -v ${METPLUS_DIR}:/data/output ${METPLUS_IMAGE} /metplus/METplus/ush/run_metplus.py /config/PointStat_matthew.conf + docker run --rm -it --volumes-from ${OBS_DATA_VOL} -v $METPLUS_CONFIG_DIR:/config -v ${WORKING_DIR}/wrf:/data/input/wrf -v ${METPLUS_DIR}:/data/output ${METPLUS_IMAGE} /metplus/METplus/ush/run_metplus.py /config/PointStat_matthew.conf As the analysis is performed, progress information is displayed. 
It is not uncommon to see "WARNING" messages in this output, but you should only be alarmed if you see messages with the text "ERROR". From 3acf6696c0421bce0236f7aefc06c776cb70b246 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Thu, 27 Jun 2024 14:07:03 -0400 Subject: [PATCH 21/36] Get METPlus working Also moved around some of the commands and updated some text. --- docs/Users_Guide/matthewjetstream.rst | 93 +++++++++++++++------------ 1 file changed, 51 insertions(+), 42 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 0badf69..72718bb 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -173,8 +173,24 @@ you can recover from that situation by rebooting the instance. In the Exosphere dashboard page for your instance, in the Actions menu, select "Reboot". The process takes several minutes, after which the instance status will return to "Ready". -Install Docker and Get the WRF and METPlus Docker Images --------------------------------------------------------- +We will be using some environment variables throughout this exercise to +make sure that we use the same resource names and file paths wherever they are used. +Copy and paste the definitions below into your shell to define the variables before proceeding:: + + WRF_IMAGE=ncar/iwrf:latest + METPLUS_IMAGE=dtcenter/metplus-dev:develop + WORKING_DIR=/home/exouser + WRF_DIR=${WORKING_DIR}/wrf/20161006_00 + METPLUS_DIR=${WORKING_DIR}/metplus + WRF_CONFIG_DIR=${WORKING_DIR}/i-wrf/use_cases/Hurricane_Matthew/WRF + METPLUS_CONFIG_DIR=${WORKING_DIR}/i-wrf/use_cases/Hurricane_Matthew/METplus + OBS_DATA_VOL=data-matthew-input-obs + +Any time you open a new shell on your instance, you will need to perform this action +to redefine the variables before executing the commands that follow. 
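As an optional convenience (not part of the official steps), the definitions above can be saved to a file once and then "sourced" in each new shell, instead of re-pasting the whole block. The filename ``iwrf_env.sh`` is our own choice.

```shell
# Sketch: write the exercise's variable definitions to a file in the home
# directory; the quoted 'EOF' keeps ${WORKING_DIR} unexpanded until the
# file is sourced.
cat > "${HOME}/iwrf_env.sh" <<'EOF'
export WRF_IMAGE=ncar/iwrf:latest
export METPLUS_IMAGE=dtcenter/metplus-dev:develop
export WORKING_DIR=/home/exouser
export WRF_DIR=${WORKING_DIR}/wrf/20161006_00
export METPLUS_DIR=${WORKING_DIR}/metplus
export WRF_CONFIG_DIR=${WORKING_DIR}/i-wrf/use_cases/Hurricane_Matthew/WRF
export METPLUS_CONFIG_DIR=${WORKING_DIR}/i-wrf/use_cases/Hurricane_Matthew/METplus
export OBS_DATA_VOL=data-matthew-input-obs
EOF

# then, in each new shell:
. "${HOME}/iwrf_env.sh"
```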
+ +Install Docker +-------------- As mentioned above, the WRF simulation and METPlus analysis applications are provided as Docker images that will run as a `"container" `_ @@ -208,14 +224,23 @@ wait a few minutes and issue this command to try again to start it:: If the command seems to succeed, confirm that the daemon is running using the status command above, and repeat these efforts as necessary until it is started. -Once all of that is in order, you must pull the correct versions of the WRF and METPlus images onto your instance. -We define environment variables here and below to ensure that consistent IDs are used for containers and folders:: +Get the WRF and METPlus Docker Images and the Observed Weather Data +------------------------------------------------------------------- + +Once Docker is running, you must pull the correct versions of the WRF and METPlus images onto your instance:: - WRF_IMAGE=ncar/iwrf:latest - METPLUS_IMAGE=dtcenter/metplus-dev:develop docker pull ${WRF_IMAGE} docker pull ${METPLUS_IMAGE} +The METPlus analysis will be comparing the results of the WRF simulation against +the actual weather data that was recorded during Hurricane Matthew. +We download that data by pulling a Docker volume that holds it, +and then referencing that volume when we run the METPlus Docker container. +The commands to pull and create the volume are:: + + docker pull ncar/iwrf:data-matthew-input-obs.docker + docker create --name ${OBS_DATA_VOL} ncar/iwrf:data-matthew-input-obs.docker + Get the Geographic Data ----------------------- @@ -229,8 +254,8 @@ They take several minutes to complete:: tar -xzf geog_high_res_mandatory.tar.gz rm geog_high_res_mandatory.tar.gz -Create the WRF Run Folder -------------------------- +Create the WRF and METPlus Run Folders +-------------------------------------- The simulation is performed using a script that must first be downloaded. The script expects to run in a folder where it can download data files and create result files. 
@@ -238,41 +263,26 @@ The instructions in this exercise create a folder (named "wrf") under the user's
 and a sub-folder within "wrf" to hold the output of this simulation.
 The subfolder is named "20161006_00", which is the beginning date and time of the simulation.
 The simulation script is called "run.sh".
-The following commands create the empty folders and download the script into them,
-then change its permissions so it can be run::
+Similarly, a run folder named "metplus" must be created for the METPlus process to use.
+The following commands create the empty folders and download the script
+and change its permissions so it can be run::
 
-    WORKING_DIR=/home/exouser
-    WRF_DIR=${WORKING_DIR}/wrf/20161006_00
     mkdir -p ${WRF_DIR}
-    curl --location https://bit.ly/4ccWsTY > ${WRF_DIR}/run.sh
+    curl --location https://bit.ly/3RLmo0F > ${WRF_DIR}/run.sh
     chmod 775 ${WRF_DIR}/run.sh
-
-Get the Observed Weather Data
------------------------------
-
-The METPlus analysis will be comparing the results of the WRF simulation against
-the actual weather data that was recorded during Hurricane Matthew.
-We will download that data by pulling a Docker volume that holds it,
-and then referencing that volume when we run the METPlus Docker container.
-The commands to pull and create the volume are:: - - OBS_DATA_VOL=data-matthew-input-obs - docker pull ncar/iwrf:data-matthew-input-obs.docker - docker create --name ${OBS_DATA_VOL} ncar/iwrf:data-matthew-input-obs.docker - -Set up the METPlus Directories ------------------------------- - -METPlus requires a folder into which it can download files and write its output, so we will create one:: - - METPLUS_DIR=${WORKING_DIR}/metplus mkdir -p ${METPLUS_DIR} -It also needs some configuration files to direct its behavior, which we are contained in -the I-WRF GitHub repository and can be downloaded with these commands:: +Download Configuration Files +---------------------------- + +Both WRF and METPlus require some configuration files to direct their behavior, +and those are downloaded from the I-WRF GitHub repository. +Some of those configuration files must also be copied into run folders. +These commands perform the necessary operations:: git clone https://github.com/NCAR/i-wrf ${WORKING_DIR}/i-wrf - METPLUS_CONFIG_DIR=${WORKING_DIR}/i-wrf/use_cases/Hurricane_Matthew/METplus + cp ${WRF_CONFIG_DIR}/var_io.txt ${WRF_DIR} + curl --location https://bit.ly/4eKpb47 > ${WRF_DIR}/namelist.input.template Run WRF ======= @@ -282,11 +292,11 @@ The downloaded script runs inside the container, prints lots of status informati and creates output files in the run folder you created. Execute this command to run the simulation in your shell:: - time docker run --shm-size 14G -it -v ${WORKING_DIR}:/home/wrfuser/terrestrial_data -v ${WRF_DIR}:/tmp/hurricane_matthew ${WRF_IMAGE} /tmp/hurricane_matthew/run.sh + docker run --shm-size 14G -it -v ${WORKING_DIR}:/home/wrfuser/terrestrial_data -v ${WRF_DIR}:/tmp/hurricane_matthew ${WRF_IMAGE} /tmp/hurricane_matthew/run.sh The command has numerous arguments and options, which do the following: -* ``time docker run`` prints the runtime of the "docker run" command. +* ``docker run`` creates the container if needed and then runs it. 
* ``--shm-size 14G -it`` tells the command how much shared memory to use, and to run interactively in the shell. * The ``-v`` options map folders in your cloud instance to paths within the container. * ``ncar/iwrf:latest`` is the Docker image to use when creating the container. @@ -323,9 +333,8 @@ where the METPlus configuration can be found, where the WRF output data is locat docker run --rm -it --volumes-from ${OBS_DATA_VOL} -v $METPLUS_CONFIG_DIR:/config -v ${WORKING_DIR}/wrf:/data/input/wrf -v ${METPLUS_DIR}:/data/output ${METPLUS_IMAGE} /metplus/METplus/ush/run_metplus.py /config/PointStat_matthew.conf As the analysis is performed, progress information is displayed. It is not uncommon to see "WARNING" messages in this output, -but you should only be alarmed if you see messages with the text "ERROR". +and you should only be alarmed if you see messages with the text "ERROR". METPlus first makes two passes over each of the 48 hourly observation time-slices, converting data files to a suitable format for the analysis. -It then performs statistical analysis on the data from the earth's surface and on the pressure level data. - -TBD: If no errors reported, how to see if successful and/or view results. +It then performs statistical analysis on the data from the earth's surface and from the "upper air". +METPlus will print its completion status when the processing finishes. 
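Because each ``-v`` option maps a host folder into the container, a quick pre-flight check of the host-side paths can save debugging time: if a bind-mount source is missing, Docker will typically create it as an empty, root-owned directory rather than reporting an error. A hedged sketch follows; ``check_mounts`` is an illustrative helper, not part of the official instructions.

```shell
# Sketch: confirm that every host-side mount source exists before launching
# a container that expects data in those folders.
check_mounts() {
  for dir in "$@"; do
    if [ ! -d "$dir" ]; then
      echo "missing mount source: ${dir}" >&2
      return 1
    fi
  done
}

# usage: check_mounts "${WORKING_DIR}" "${WRF_DIR}" && docker run ...
```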
From c409ad68e722fd3d2f6ecead271ac5d73841a2b1 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Thu, 27 Jun 2024 14:33:21 -0400 Subject: [PATCH 22/36] Edits from final testing pass on new and revised content --- docs/Users_Guide/matthewjetstream.rst | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 72718bb..f0a2f05 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -221,8 +221,8 @@ wait a few minutes and issue this command to try again to start it:: sudo systemctl start docker -If the command seems to succeed, confirm that the daemon is running using the status command above, -and repeat these efforts as necessary until it is started. +If the command seems to succeed, confirm that the daemon is running using the status command above. +Repeat these efforts as necessary until it is started. Get the WRF and METPlus Docker Images and the Observed Weather Data ------------------------------------------------------------------- @@ -281,7 +281,7 @@ Some of those configuration files must also be copied into run folders. These commands perform the necessary operations:: git clone https://github.com/NCAR/i-wrf ${WORKING_DIR}/i-wrf - cp ${WRF_CONFIG_DIR}/var_io.txt ${WRF_DIR} + cp ${WRF_CONFIG_DIR}/vars_io.txt ${WRF_DIR} curl --location https://bit.ly/4eKpb47 > ${WRF_DIR}/namelist.input.template Run WRF @@ -330,7 +330,7 @@ The analysis takes about five minutes to complete. 
We use command line options to tell the METPlus container several things, including where the observed data is located, where the METPlus configuration can be found, where the WRF output data is located, and where it should create its output files:: - docker run --rm -it --volumes-from ${OBS_DATA_VOL} -v $METPLUS_CONFIG_DIR:/config -v ${WORKING_DIR}/wrf:/data/input/wrf -v ${METPLUS_DIR}:/data/output ${METPLUS_IMAGE} /metplus/METplus/ush/run_metplus.py /config/PointStat_matthew.conf + docker run --rm -it --volumes-from ${OBS_DATA_VOL} -v ${METPLUS_CONFIG_DIR}:/config -v ${WORKING_DIR}/wrf:/data/input/wrf -v ${METPLUS_DIR}:/data/output ${METPLUS_IMAGE} /metplus/METplus/ush/run_metplus.py /config/PointStat_matthew.conf As the analysis is performed, progress information is displayed. It is not uncommon to see "WARNING" messages in this output, and you should only be alarmed if you see messages with the text "ERROR". From 88504c039ef46032880aadff252730d5ec7e880e Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Thu, 27 Jun 2024 15:04:11 -0400 Subject: [PATCH 23/36] Final tweaks before creating a pull request --- docs/Users_Guide/matthewjetstream.rst | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index f0a2f05..6884afe 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -268,7 +268,7 @@ The following commands create the empty folders and download the script and change its permissions so it can be run:: mkdir -p ${WRF_DIR} - curl --location https://bit.ly/3RLmo0F > ${WRF_DIR}/run.sh + curl --location https://bit.ly/3xzm9z6 > ${WRF_DIR}/run.sh chmod 775 ${WRF_DIR}/run.sh mkdir -p ${METPLUS_DIR} From 993df6dc2853dd498113e89466b430c968e59274 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Fri, 28 Jun 2024 09:59:09 -0400 Subject: [PATCH 24/36] Add text about viewing the output of METPlus --- docs/Users_Guide/matthewjetstream.rst | 7 
+++++++ 1 file changed, 7 insertions(+) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 6884afe..3f2bc89 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -338,3 +338,10 @@ METPlus first makes two passes over each of the 48 hourly observation time-slice converting data files to a suitable format for the analysis. It then performs statistical analysis on the data from the earth's surface and from the "upper air". METPlus will print its completion status when the processing finishes. + +The results of the METPlus analysis are stored in the subfolders of ~/metplus. +Most of these files are not human readable, but those in the point_stat subfolder +contain tabular output that can be viewed in a text editor +(the rows are very long, so you may want to turn word wrapping off for better viewing). +In the near future, this exercise will be extended to include +a friendlier way to view the results from the simulation and analysis. 
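For a quick look at that tabular output without opening an editor, the very long rows can be trimmed to their leading columns. This is a hedged sketch; the ``stat_head`` helper name and default column count are our own, and the exact output file names are not listed here.

```shell
# Sketch: print only the first few whitespace-separated columns of a
# point_stat output table so the rows fit in a terminal.
stat_head() {
  file="$1"
  ncols="${2:-6}"
  awk -v n="$ncols" '{
    for (i = 1; i <= n && i <= NF; i++) printf "%s ", $i
    print ""
  }' "$file"
}

# usage: stat_head ~/metplus/point_stat/<output file> 8
```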
From 62c6966ed48bd896061fd1c9b58803aafd5d8e41 Mon Sep 17 00:00:00 2001 From: George McCabe <23407799+georgemccabe@users.noreply.github.com> Date: Fri, 28 Jun 2024 10:49:12 -0600 Subject: [PATCH 25/36] changed METPlus to METplus --- docs/Users_Guide/configuration.rst | 2 +- docs/Users_Guide/matthewjetstream.rst | 34 +++++++++++++-------------- 2 files changed, 18 insertions(+), 18 deletions(-) diff --git a/docs/Users_Guide/configuration.rst b/docs/Users_Guide/configuration.rst index d7022e6..95a0225 100644 --- a/docs/Users_Guide/configuration.rst +++ b/docs/Users_Guide/configuration.rst @@ -64,7 +64,7 @@ Use this for the vars_io.txt file for the Hurricane Matthew case, from the Githu https://github.com/NCAR/i-wrf/blob/main/use_cases/Hurricane_Matthew/WRF/vars_io.txt ^^^^^^^^^^^^^^^^^^^ -METPlus Config File +METplus Config File ^^^^^^^^^^^^^^^^^^^ For the METplus configuration file for the Hurricane Matthew case, please use this file on the Github repository: diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 99783ac..898dcb8 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -13,7 +13,7 @@ the `I-WRF weather simulation framework `_ from the `National Center for Atmospheric Research (NCAR) `_ and the `Cornell Center for Advanced Computing `_. The steps below run the `Weather Research & Forecasting (WRF) `_ model -and the `METPlus `_ verification framework +and the `METplus `_ verification framework with data from `Hurricane Matthew `_ on the `Jetstream2 cloud computing platform `_. This exercise provides an introduction to using cloud computing platforms, @@ -192,11 +192,11 @@ to redefine the variables before executing the commands that follow. 
Install Docker -------------- -As mentioned above, the WRF simulation and METPlus analysis applications are provided as Docker images that will run as a +As mentioned above, the WRF simulation and METplus analysis applications are provided as Docker images that will run as a `"container" `_ on your cloud instance. To run a Docker container, you must first install the Docker Engine on your instance. -You can then "pull" (download) the WRF and METPlus images that will be run as containers. +You can then "pull" (download) the WRF and METplus images that will be run as containers. The `instructions for installing Docker Engine on Ubuntu `_ are very thorough and make a good reference, but we only need to perform a subset of those steps. @@ -224,18 +224,18 @@ wait a few minutes and issue this command to try again to start it:: If the command seems to succeed, confirm that the daemon is running using the status command above. Repeat these efforts as necessary until it is started. -Get the WRF and METPlus Docker Images and the Observed Weather Data +Get the WRF and METplus Docker Images and the Observed Weather Data ------------------------------------------------------------------- -Once Docker is running, you must pull the correct versions of the WRF and METPlus images onto your instance:: +Once Docker is running, you must pull the correct versions of the WRF and METplus images onto your instance:: docker pull ${WRF_IMAGE} docker pull ${METPLUS_IMAGE} -The METPlus analysis will be comparing the results of the WRF simulation against +The METplus analysis will be comparing the results of the WRF simulation against the actual weather data that was recorded during Hurricane Matthew. We download that data by pulling a Docker volume that holds it, -and then referencing that volume when we run the METPlus Docker container. +and then referencing that volume when we run the METplus Docker container. 
The commands to pull and create the volume are:: docker pull ncar/iwrf:data-matthew-input-obs.docker @@ -254,7 +254,7 @@ They take several minutes to complete:: tar -xzf geog_high_res_mandatory.tar.gz rm geog_high_res_mandatory.tar.gz -Create the WRF and METPlus Run Folders +Create the WRF and METplus Run Folders -------------------------------------- The simulation is performed using a script that must first be downloaded. @@ -263,7 +263,7 @@ The instructions in this exercise create a folder (named "wrf") under the user's and a sub-folder within "wrf" to hold the output of this simulation. The subfolder is named "20161006_00", which is the beginning date and time of the simulatition. The simulation script is called "run.sh". -Similarly, a run folder named "metplus" must be created for the METPlus process to use. +Similarly, a run folder named "metplus" must be created for the METplus process to use. The following commands create the empty folders and download the script and change its permissions so it can be run:: @@ -275,7 +275,7 @@ and change its permissions so it can be run:: Download Configuration Files ---------------------------- -Both WRF and METPlus require some configuration files to direct their behavior, +Both WRF and METplus require some configuration files to direct their behavior, and those are downloaded from the I-WRF GitHub repository. Some of those configuration files must also be copied into run folders. 
These commands perform the necessary operations:: @@ -320,25 +320,25 @@ The output should look something like this:: Timing for Writing wrfout_d01_2016-10-06_12:00:00 for domain 1: 0.32534 elapsed seconds d01 2016-10-06_12:00:00 wrf: SUCCESS COMPLETE WRF -Run METPlus +Run METplus =========== -After the WRF simulation has finished, you can run the METPlus analysis to compare the simulated results +After the WRF simulation has finished, you can run the METplus analysis to compare the simulated results to the actual weather observations during the hurricane. The analysis takes about five minutes to complete. -We use command line options to tell the METPlus container several things, including where the observed data is located, -where the METPlus configuration can be found, where the WRF output data is located, and where it should create its output files:: +We use command line options to tell the METplus container several things, including where the observed data is located, +where the METplus configuration can be found, where the WRF output data is located, and where it should create its output files:: docker run --rm -it --volumes-from ${OBS_DATA_VOL} -v ${METPLUS_CONFIG_DIR}:/config -v ${WORKING_DIR}/wrf:/data/input/wrf -v ${METPLUS_DIR}:/data/output ${METPLUS_IMAGE} /metplus/METplus/ush/run_metplus.py /config/PointStat_matthew.conf As the analysis is performed, progress information is displayed. It is not uncommon to see "WARNING" messages in this output, and you should only be alarmed if you see messages with the text "ERROR". -METPlus first makes two passes over each of the 48 hourly observation time-slices, +METplus first makes two passes over each of the 48 hourly observation time-slices, converting data files to a suitable format for the analysis. It then performs statistical analysis on the data from the earth's surface and from the "upper air". -METPlus will print its completion status when the processing finishes. 
+METplus will print its completion status when the processing finishes. -The results of the METPlus analysis are stored in the subfolders of ~/metplus. +The results of the METplus analysis are stored in the subfolders of ~/metplus. Most of these files are not human readable, but those in the point_stat subfolder contain tabular output that can be viewed in a text editor (the rows are very long, so you may want to turn word wrapping off for better viewing). From 1c3a23c82a6a02bbe4ae08728df4e9f2b1d0bce7 Mon Sep 17 00:00:00 2001 From: George McCabe <23407799+georgemccabe@users.noreply.github.com> Date: Fri, 28 Jun 2024 11:24:55 -0600 Subject: [PATCH 26/36] split long docker run commands into multiple lines for better readability --- docs/.DS_Store | Bin 0 -> 6148 bytes docs/Users_Guide/matthewjetstream.rst | 12 ++++++++++-- 2 files changed, 10 insertions(+), 2 deletions(-) create mode 100644 docs/.DS_Store diff --git a/docs/.DS_Store b/docs/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..e3e0d17baaa8ac1241f0e56d180c32dc045a0347 GIT binary patch literal 6148 zcmeHK%}T>S5T0#on^J@x6g@6@E!ZC|h?h_+-i<#8m6(vA!I&*gYYwH5tGBIn^WX2)AH-2QYBt|kp;%fgSFDOvvu>PwkvSu8G){Zo@QPMv zLd5=T?)eu{n7Q?hBaw`}C<%u$APPcsxxS2&Kx92JPJ&d%I%>nJT2;5cHJP+q4Zh#) zOdEW1&~7&PUZ*>qR;|sQ-NRG&A%0B6v*MBAk(08naRx6qnM>)(878qv?$Ay#H8_V9 zLKx!c!ih{gfbi47{lkebN$v{;GpY<41`GqM%7DH=t=g&*VDdH$7zUOyK=%VjCA2l> z3gy*-gKPm1={Hge&Z(E67_QOQm@C8`6sAHERVdRf22KA%f>V75Xt8U_pl^9&T#vQG8?czORnA7oaB0mHz*Vt|$UPQQmK>Ds!G9MxKm tdW%Xze!0Sr5S++TjJ{Ngw^5~_ozn!-)|e|q3yS#>kTjUVFz}-cyaQgHjO+ja literal 0 HcmV?d00001 diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 898dcb8..a83395c 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -291,7 +291,10 @@ The downloaded script runs inside the container, prints lots of status informati and creates output files in the run folder you created. 
Execute this command to run the simulation in your shell:: - docker run --shm-size 14G -it -v ${WORKING_DIR}:/home/wrfuser/terrestrial_data -v ${WRF_DIR}:/tmp/hurricane_matthew ${WRF_IMAGE} /tmp/hurricane_matthew/run.sh + docker run --shm-size 14G -it \ + -v ${WORKING_DIR}:/home/wrfuser/terrestrial_data \ + -v ${WRF_DIR}:/tmp/hurricane_matthew \ + ${WRF_IMAGE} /tmp/hurricane_matthew/run.sh The command has numerous arguments and options, which do the following: @@ -329,7 +332,12 @@ The analysis takes about five minutes to complete. We use command line options to tell the METplus container several things, including where the observed data is located, where the METplus configuration can be found, where the WRF output data is located, and where it should create its output files:: - docker run --rm -it --volumes-from ${OBS_DATA_VOL} -v ${METPLUS_CONFIG_DIR}:/config -v ${WORKING_DIR}/wrf:/data/input/wrf -v ${METPLUS_DIR}:/data/output ${METPLUS_IMAGE} /metplus/METplus/ush/run_metplus.py /config/PointStat_matthew.conf + docker run --rm -it \ + --volumes-from ${OBS_DATA_VOL} \ + -v ${METPLUS_CONFIG_DIR}:/config \ + -v ${WORKING_DIR}/wrf:/data/input/wrf \ + -v ${METPLUS_DIR}:/data/output ${METPLUS_IMAGE} \ + /metplus/METplus/ush/run_metplus.py /config/PointStat_matthew.conf As the analysis is performed, progress information is displayed. It is not uncommon to see "WARNING" messages in this output, and you should only be alarmed if you see messages with the text "ERROR". 
From 7aa55a2475ebdf00ba5de8b6f63dbfd78724c034 Mon Sep 17 00:00:00 2001 From: George McCabe <23407799+georgemccabe@users.noreply.github.com> Date: Fri, 28 Jun 2024 12:29:35 -0600 Subject: [PATCH 27/36] ignore auto-generated file --- .gitignore | 1 + 1 file changed, 1 insertion(+) diff --git a/.gitignore b/.gitignore index a271a97..7564616 100644 --- a/.gitignore +++ b/.gitignore @@ -1,2 +1,3 @@ *~ .vs +/.DS_Store From 601d40a53a9c8911854bb583fac2d8a21fdf0c2a Mon Sep 17 00:00:00 2001 From: George McCabe <23407799+georgemccabe@users.noreply.github.com> Date: Fri, 28 Jun 2024 12:31:08 -0600 Subject: [PATCH 28/36] use env var to reference obs data volume to more easily adapt to other use cases --- docs/Users_Guide/matthewjetstream.rst | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index a83395c..1546d74 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -232,14 +232,14 @@ Once Docker is running, you must pull the correct versions of the WRF and METplu docker pull ${WRF_IMAGE} docker pull ${METPLUS_IMAGE} -The METplus analysis will be comparing the results of the WRF simulation against -the actual weather data that was recorded during Hurricane Matthew. +METplus is run to perform verification of the results of the WRF simulation using +observations gathered during Hurricane Matthew. We download that data by pulling a Docker volume that holds it, and then referencing that volume when we run the METplus Docker container. 
The commands to pull and create the volume are:: - docker pull ncar/iwrf:data-matthew-input-obs.docker - docker create --name ${OBS_DATA_VOL} ncar/iwrf:data-matthew-input-obs.docker + docker pull ncar/iwrf:${OBS_DATA_VOL}.docker + docker create --name ${OBS_DATA_VOL} ncar/iwrf:${OBS_DATA_VOL}.docker Get the Geographic Data ----------------------- From 76dbd969a8baa42d795c93fc4078077372fab8ef Mon Sep 17 00:00:00 2001 From: George McCabe <23407799+georgemccabe@users.noreply.github.com> Date: Fri, 28 Jun 2024 12:31:58 -0600 Subject: [PATCH 29/36] rewording and avoid using analysis to describe the METplus verification step to avoid confusion withe the METplus Analysis tools that will be added later to generate images --- docs/Users_Guide/matthewjetstream.rst | 34 +++++++++++++-------------- 1 file changed, 17 insertions(+), 17 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 1546d74..3422006 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -25,7 +25,7 @@ but a cloud computing platform can provided the needed computational power. Jetstream2 is a national cyberinfrastructure resource that is easy to use and is available to researchers and educators. This exercise runs the I-WRF programs as Docker "containers", -which simplifies the set-up work needed to run the simulation and analysis. +which simplifies the set-up work needed to run the simulation and verification. It is recommended that you follow the instructions in each section in the order presented to avoid encountering issues during the process. @@ -79,7 +79,7 @@ Create a Cloud Instance and Log In ================================== After you have logged in to Jetstream2 and added your allocation to your account, -you are ready to create the cloud instance where you will run the simulation and analysis. +you are ready to create the cloud instance where you will run the simulation and verification. 
If you are not familiar with the cloud computing terms "image" and "instance", it is recommended that you `read about them `__ before proceeding. @@ -157,7 +157,7 @@ Install Software and Download Data ================================== With your instance created and running and you logged in to it through SSH, -you can now install the necessary software and download the data to run the simulation and analysis. +you can now install the necessary software and download the data to run the simulation and verification. You will only need to perform these steps once, as they essentially change the contents of the instance's disk and those changes will remain even after the instance is shelved and unshelved. @@ -192,7 +192,7 @@ to redefine the variables before executing the commands that follow. Install Docker -------------- -As mentioned above, the WRF simulation and METplus analysis applications are provided as Docker images that will run as a +As mentioned above, the WRF and METplus software are provided as Docker images that will run as a `"container" `_ on your cloud instance. To run a Docker container, you must first install the Docker Engine on your instance. @@ -326,9 +326,9 @@ The output should look something like this:: Run METplus =========== -After the WRF simulation has finished, you can run the METplus analysis to compare the simulated results +After the WRF simulation has finished, you can run the METplus verification to compare the simulated results to the actual weather observations during the hurricane. -The analysis takes about five minutes to complete. +The verification takes about five minutes to complete. 
We use command line options to tell the METplus container several things, including where the observed data is located, where the METplus configuration can be found, where the WRF output data is located, and where it should create its output files:: @@ -339,16 +339,16 @@ where the METplus configuration can be found, where the WRF output data is locat -v ${METPLUS_DIR}:/data/output ${METPLUS_IMAGE} \ /metplus/METplus/ush/run_metplus.py /config/PointStat_matthew.conf -As the analysis is performed, progress information is displayed. It is not uncommon to see "WARNING" messages in this output, -and you should only be alarmed if you see messages with the text "ERROR". -METplus first makes two passes over each of the 48 hourly observation time-slices, -converting data files to a suitable format for the analysis. -It then performs statistical analysis on the data from the earth's surface and from the "upper air". +Progress information is displayed while the verification is performed. +**WARNING** log messages are expected because observations files are not available for every valid time and METplus is +configured to allow some missing inputs. An **ERROR** log message indicates that something went wrong. +METplus first converts the observation data files to a format that the MET tools can read using the MADIS2NC wrapper. +Point-Stat is run to generate statistics comparing METAR observations to surface-level model fields and +RAOB observations to "upper air" fields. METplus will print its completion status when the processing finishes. -The results of the METplus analysis are stored in the subfolders of ~/metplus. -Most of these files are not human readable, but those in the point_stat subfolder -contain tabular output that can be viewed in a text editor -(the rows are very long, so you may want to turn word wrapping off for better viewing). -In the near future, this exercise will be extended to include -a friendlier way to view the results from the simulation and analysis. 
+The results of the METplus verification can be found in ${WORKING_DIR}/metplus/point_stat. +These files contain tabular output that can be viewed in a text editor. Turn off word wrapping for better viewing. +Refer to the MET User's Guide for more information about the +`Point-Stat output `. +In the near future, this exercise will be extended to include instructions to visualize the results. From e606907b237f7fa61d91d58b65e270856aef1072 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Mon, 1 Jul 2024 11:23:15 -0400 Subject: [PATCH 30/36] A few formatting tweaks after the code review --- docs/Users_Guide/matthewjetstream.rst | 5 +++-- 1 file changed, 3 insertions(+), 2 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 3422006..8c525b7 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -285,6 +285,7 @@ These commands perform the necessary operations:: curl --location https://bit.ly/4eKpb47 > ${WRF_DIR}/namelist.input.template Run WRF +======= With everything in place, you are now ready to run the Docker container that will perform the simulation. The downloaded script runs inside the container, prints lots of status information, @@ -347,8 +348,8 @@ Point-Stat is run to generate statistics comparing METAR observations to surface RAOB observations to "upper air" fields. METplus will print its completion status when the processing finishes. -The results of the METplus verification can be found in ${WORKING_DIR}/metplus/point_stat. +The results of the METplus verification can be found in ``${WORKING_DIR}/metplus/point_stat``. These files contain tabular output that can be viewed in a text editor. Turn off word wrapping for better viewing. Refer to the MET User's Guide for more information about the -`Point-Stat output `. +`Point-Stat output `_. In the near future, this exercise will be extended to include instructions to visualize the results. 
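[Editor's note, in the document's tutorial register: PATCH 28 above parameterizes the observation-data Docker volume with `${OBS_DATA_VOL}`. The resulting commands can be sketched standalone as below. The volume name is an assumption taken from the pre-patch hard-coded image tag, and `echo` stands in for actually invoking Docker so the sketch can run anywhere:]

```shell
# Assumed value, taken from the hard-coded name the patch replaces.
OBS_DATA_VOL=data-matthew-input-obs

# The image tag is derived from the volume name, as in the patched docs.
OBS_DATA_IMAGE="ncar/iwrf:${OBS_DATA_VOL}.docker"

# "echo" stands in for Docker here; drop it to run the real commands.
echo "docker pull ${OBS_DATA_IMAGE}"
echo "docker create --name ${OBS_DATA_VOL} ${OBS_DATA_IMAGE}"
```

With this indirection, adapting the tutorial to another case study only requires changing `OBS_DATA_VOL`, which is the stated motivation of the patch.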
From b441065724128ecbe0b01db12e03eaf90211b9b4 Mon Sep 17 00:00:00 2001 From: George McCabe <23407799+georgemccabe@users.noreply.github.com> Date: Mon, 1 Jul 2024 10:53:48 -0600 Subject: [PATCH 31/36] ignore auto-generated file --- docs/.DS_Store | Bin 6148 -> 0 bytes docs/.gitignore | 1 + 2 files changed, 1 insertion(+) delete mode 100644 docs/.DS_Store diff --git a/docs/.DS_Store b/docs/.DS_Store deleted file mode 100644 index e3e0d17baaa8ac1241f0e56d180c32dc045a0347..0000000000000000000000000000000000000000 GIT binary patch literal 0 HcmV?d00001 literal 6148 zcmeHK%}T>S5T0#on^J@x6g@6@E!ZC|h?h_+-i<#8m6(vA!I&*gYYwH5tGBIn^WX2)AH-2QYBt|kp;%fgSFDOvvu>PwkvSu8G){Zo@QPMv zLd5=T?)eu{n7Q?hBaw`}C<%u$APPcsxxS2&Kx92JPJ&d%I%>nJT2;5cHJP+q4Zh#) zOdEW1&~7&PUZ*>qR;|sQ-NRG&A%0B6v*MBAk(08naRx6qnM>)(878qv?$Ay#H8_V9 zLKx!c!ih{gfbi47{lkebN$v{;GpY<41`GqM%7DH=t=g&*VDdH$7zUOyK=%VjCA2l> z3gy*-gKPm1={Hge&Z(E67_QOQm@C8`6sAHERVdRf22KA%f>V75Xt8U_pl^9&T#vQG8?czORnA7oaB0mHz*Vt|$UPQQmK>Ds!G9MxKm tdW%Xze!0Sr5S++TjJ{Ngw^5~_ozn!-)|e|q3yS#>kTjUVFz}-cyaQgHjO+ja diff --git a/docs/.gitignore b/docs/.gitignore index e35d885..a1516d0 100644 --- a/docs/.gitignore +++ b/docs/.gitignore @@ -1 +1,2 @@ _build +/.DS_Store From 5264e7b2d491b5ae42c21d29f6893bd1bbb0bcf0 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Wed, 17 Jul 2024 11:26:22 -0400 Subject: [PATCH 32/36] Refactor instructions in preparation for edits related to changes in how WRF is run. 
--- docs/Users_Guide/matthewjetstream.rst | 80 ++++++++++++++++----------- 1 file changed, 49 insertions(+), 31 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 8c525b7..71b80f9 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -153,11 +153,11 @@ Increasing the number of CPUs (say, to flavor "m3.8") can make your computations But of course, doubling the number of CPUs doubles the cost per hour to run the instance, so Shelving as soon as you are done becomes even more important! -Install Software and Download Data -================================== +Preparing the Environment +========================= With your instance created and running and you logged in to it through SSH, -you can now install the necessary software and download the data to run the simulation and verification. +you can now create the run folders, install Docker software and download the data to run the simulation and verification. You will only need to perform these steps once, as they essentially change the contents of the instance's disk and those changes will remain even after the instance is shelved and unshelved. @@ -173,6 +173,9 @@ you can recover from that situation by rebooting the instance. In the Exosphere dashboard page for your instance, in the Actions menu, select "Reboot". The process takes several minutes, after which the instance status will return to "Ready". +Define Environment Variables +---------------------------- + We will be using some environment variables throughout this exercise to make sure that we use the same resource names and file paths wherever they are used. 
Copy and paste the definitions below into your shell to define the variables before proceeding:: @@ -189,6 +192,34 @@ Copy and paste the definitions below into your shell to define the variables bef Any time you open a new shell on your instance, you will need to perform this action to redefine the variables before executing the commands that follow. +Create the WRF and METplus Run Folders +-------------------------------------- + +The simulation is performed using a script that expects to run in a folder where it can create result files. +The first command below creates a folder (named "wrf") under the user's home directory, +and a sub-folder within "wrf" to hold the output of this simulation. +The subfolder is named "20161006_00", which is the beginning date and time of the simulatition. +Similarly, a run folder named "metplus" must be created for the METplus process to use:: + + mkdir -p ${WRF_DIR} + mkdir -p ${METPLUS_DIR} + +Download Configuration Files +---------------------------- + +Both WRF and METplus require some configuration files to direct their behavior, +and those are downloaded from the I-WRF GitHub repository. +Some of those configuration files must then be copied into run folders. 
+These commands perform the necessary operations:: + + git clone https://github.com/NCAR/i-wrf ${WORKING_DIR}/i-wrf + cp ${WRF_CONFIG_DIR}/namelist.* ${WRF_DIR} + cp ${WRF_CONFIG_DIR}/vars_io.txt ${WRF_DIR} + cp ${WRF_CONFIG_DIR}/run.sh ${WRF_DIR} + +Install Docker and Pull Docker Objects +====================================== + Install Docker -------------- @@ -241,8 +272,13 @@ The commands to pull and create the volume are:: docker pull ncar/iwrf:${OBS_DATA_VOL}.docker docker create --name ${OBS_DATA_VOL} ncar/iwrf:${OBS_DATA_VOL}.docker -Get the Geographic Data ------------------------ +Download Data +============= + +Text here + +Get the Data Needed by WRF +-------------------------- To run WRF on the Hurricane Matthew data set, you need a copy of the geographic data representing the terrain in the area of the simulation. @@ -254,35 +290,17 @@ They take several minutes to complete:: tar -xzf geog_high_res_mandatory.tar.gz rm geog_high_res_mandatory.tar.gz -Create the WRF and METplus Run Folders --------------------------------------- - -The simulation is performed using a script that must first be downloaded. -The script expects to run in a folder where it can download data files and create result files. -The instructions in this exercise create a folder (named "wrf") under the user's home directory, -and a sub-folder within "wrf" to hold the output of this simulation. -The subfolder is named "20161006_00", which is the beginning date and time of the simulatition. -The simulation script is called "run.sh". -Similarly, a run folder named "metplus" must be created for the METplus process to use. 
-The following commands create the empty folders and download the script -and change its permissions so it can be run:: - - mkdir -p ${WRF_DIR} - curl --location https://bit.ly/3xzm9z6 > ${WRF_DIR}/run.sh - chmod 775 ${WRF_DIR}/run.sh - mkdir -p ${METPLUS_DIR} +Get case study data:: -Download Configuration Files ----------------------------- + wget https://www2.mmm.ucar.edu/wrf/TUTORIAL_DATA/matthew_1deg.tar.gz + tar -xvzf matthew_1deg.tar.gz + rm -f matthew_1deg.tar.gz -Both WRF and METplus require some configuration files to direct their behavior, -and those are downloaded from the I-WRF GitHub repository. -Some of those configuration files must also be copied into run folders. -These commands perform the necessary operations:: +Get SST data:: - git clone https://github.com/NCAR/i-wrf ${WORKING_DIR}/i-wrf - cp ${WRF_CONFIG_DIR}/vars_io.txt ${WRF_DIR} - curl --location https://bit.ly/4eKpb47 > ${WRF_DIR}/namelist.input.template + wget https://www2.mmm.ucar.edu/wrf/TUTORIAL_DATA/matthew_sst.tar.gz + tar -xzvf matthew_sst.tar.gz + rm -f matthew_sst.tar.gz Run WRF ======= From e416156e39cf1fb94994d04ca188c6e929e1f118 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Thu, 18 Jul 2024 09:15:34 -0400 Subject: [PATCH 33/36] Add a run script and config file and update the existing config file to work with new documentation --- .../Hurricane_Matthew/WRF/namelist.input | 62 +++++++------- use_cases/Hurricane_Matthew/WRF/namelist.wps | 35 ++++++++ use_cases/Hurricane_Matthew/WRF/run.sh | 84 +++++++++++++++++++ 3 files changed, 150 insertions(+), 31 deletions(-) create mode 100644 use_cases/Hurricane_Matthew/WRF/namelist.wps create mode 100644 use_cases/Hurricane_Matthew/WRF/run.sh diff --git a/use_cases/Hurricane_Matthew/WRF/namelist.input b/use_cases/Hurricane_Matthew/WRF/namelist.input index d38b541..5f7ab52 100644 --- a/use_cases/Hurricane_Matthew/WRF/namelist.input +++ b/use_cases/Hurricane_Matthew/WRF/namelist.input @@ -3,24 +3,24 @@ run_hours = 48, run_minutes = 0, 
run_seconds = 0, - start_year = 2016, 2019, - start_month = 10, 09, - start_day = 06, 04, - start_hour = 00, 12, - end_year = 2016, 2019, - end_month = 10, 09, - end_day = 08, 06, - end_hour = 00, 00, - interval_seconds = 21600, - input_from_file = .true.,.true., - history_interval = 180, 60, - frames_per_outfile = 1, 1, + start_year = 2016, + start_month = 10, + start_day = 06, + start_hour = 00, + end_year = 2016, + end_month = 10, + end_day = 08, + end_hour = 0, + interval_seconds = 21600 + input_from_file = .true., + history_interval = 180, + frames_per_outfile = 1, restart = .false., restart_interval = 1440, - io_form_history = 2, - io_form_restart = 2, - io_form_input = 2, - io_form_boundary = 2, + io_form_history = 2 + io_form_restart = 2 + io_form_input = 2 + io_form_boundary = 2 iofields_filename = "vars_io.txt", "vars_io.txt", auxhist22_outname = "wrfout_zlev_d_", auxhist22_interval = 180, 180, @@ -33,16 +33,16 @@ / &domains - time_step = 90, + time_step = 150, time_step_fract_num = 0, time_step_fract_den = 1, max_dom = 1, - e_we = 91, 220, - e_sn = 100, 214, - e_vert = 45, 45, + e_we = 91, + e_sn = 100, + e_vert = 45, dzstretch_s = 1.1 p_top_requested = 5000, - num_metgrid_levels = 32, + num_metgrid_levels = 32 num_metgrid_soil_levels = 4, dx = 27000, dy = 27000, @@ -105,17 +105,17 @@ / &namelist_quilt - nio_tasks_per_group = 0, - nio_groups = 1, + nio_tasks_per_group = 0, + nio_groups = 1, / &diags - z_lev_diags = 1, - num_z_levels = 6, - z_levels = -80,-100,-200,-300,-400,-500 - p_lev_diags = 1, - num_press_levels = 10, - press_levels = 92500,85000,70000,50000,40000,30000,25000,20000,15000,10000 - use_tot_or_hyd_p = 1, - solar_diagnostics = 0, + z_lev_diags = 1, + num_z_levels = 6, + z_levels = -80,-100,-200,-300,-400,-500 + p_lev_diags = 1, + num_press_levels = 10, + press_levels = 92500,85000,70000,50000,40000,30000,25000,20000,15000,10000 + use_tot_or_hyd_p = 1, + solar_diagnostics = 0, / diff --git a/use_cases/Hurricane_Matthew/WRF/namelist.wps 
b/use_cases/Hurricane_Matthew/WRF/namelist.wps new file mode 100644 index 0000000..f3408a6 --- /dev/null +++ b/use_cases/Hurricane_Matthew/WRF/namelist.wps @@ -0,0 +1,35 @@ +&share + wrf_core = 'ARW', + max_dom = 1, + start_date = '2016-10-06_00:00:00', + end_date = '2016-10-08_00:00:00', + interval_seconds = 21600 +/ + +&geogrid + parent_id = 1, + parent_grid_ratio = 1, + i_parent_start = 1, + j_parent_start = 1, 25, + e_we = 91, + e_sn = 100, + geog_data_res = 'default', + dx = 27000, + dy = 27000, + map_proj = 'mercator', + ref_lat = 28.00, + ref_lon = -75.00, + truelat1 = 30.0, + truelat2 = 60.0, + stand_lon = -75.0, + geog_data_path = '/home/wrfuser/terrestrial_data/WPS_GEOG' +/ + +&ungrib + out_format = 'WPS', + prefix = 'FILE', +/ + +&metgrid + fg_name = 'FILE' +/ diff --git a/use_cases/Hurricane_Matthew/WRF/run.sh b/use_cases/Hurricane_Matthew/WRF/run.sh new file mode 100644 index 0000000..ba0d963 --- /dev/null +++ b/use_cases/Hurricane_Matthew/WRF/run.sh @@ -0,0 +1,84 @@ +#! /bin/bash + +# script adapted from instructions at https://www2.mmm.ucar.edu/wrf/OnLineTutorial/CASES/SingleDomain/ungrib.php +# docker run -it -v /home/hahn/git:/home/wrfuser/git -v /mnt/storage/terrestrial_data:/home/wrfuser/terrestrial_data iwrf:latest /bin/bash + +source /etc/bashrc + +CYCLE_DIR="/tmp/hurricane_matthew" +WPS_DIR="/home/wrfuser/WPS" +WRF_DIR="/home/wrfuser/WRF" + +function main +{ + mkdir -p "${CYCLE_DIR}" + download_case_study_data + link_gfs_vtable + run_ungrib + download_sst_data + run_geogrid + run_metgrid + run_real + run_wrf +} + + +function download_case_study_data +{ + wget https://www2.mmm.ucar.edu/wrf/TUTORIAL_DATA/matthew_1deg.tar.gz + tar -xvzf matthew_1deg.tar.gz + rm -f matthew_1deg.tar.gz +} + + +function link_gfs_vtable +{ + ln -sf "${WPS_DIR}/ungrib/Variable_Tables/Vtable.GFS" Vtable + ${WPS_DIR}/link_grib.csh "${CYCLE_DIR}/matthew/*.grib2" +} + + +function run_ungrib +{ + ln -s "${WPS_DIR}/ungrib.exe" . 
2>/dev/null + ./ungrib.exe +} + + +function download_sst_data +{ + wget https://www2.mmm.ucar.edu/wrf/TUTORIAL_DATA/matthew_sst.tar.gz + tar -xzvf matthew_sst.tar.gz + rm -f matthew_sst.tar.gz +} + + +function run_geogrid +{ + ln -s "${WPS_DIR}"/* . 2>/dev/null + ./geogrid.exe +} + + +function run_metgrid +{ + ./metgrid.exe +} + + +function run_real +{ + ln -s "${WRF_DIR}"/test/em_real/* . 2>/dev/null + ./real.exe +} + + +function run_wrf +{ + ulimit -s unlimited + ln -s "${WRF_DIR}"/test/em_real/* . 2>/dev/null + mpirun ./wrf.exe +} + + +main From b23b29710a29aea6a1c0d39bbb5fb0dc5f8ccd68 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Thu, 18 Jul 2024 09:29:47 -0400 Subject: [PATCH 34/36] change run.sh permissions --- use_cases/Hurricane_Matthew/WRF/run.sh | 0 1 file changed, 0 insertions(+), 0 deletions(-) mode change 100644 => 100755 use_cases/Hurricane_Matthew/WRF/run.sh diff --git a/use_cases/Hurricane_Matthew/WRF/run.sh b/use_cases/Hurricane_Matthew/WRF/run.sh old mode 100644 new mode 100755 From 62121c3b1a79bb49261cbaad66ee173c93f22403 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Thu, 18 Jul 2024 09:56:13 -0400 Subject: [PATCH 35/36] Remove data downloading from run.sh --- use_cases/Hurricane_Matthew/WRF/run.sh | 26 +------------------------- 1 file changed, 1 insertion(+), 25 deletions(-) diff --git a/use_cases/Hurricane_Matthew/WRF/run.sh b/use_cases/Hurricane_Matthew/WRF/run.sh index ba0d963..0e98468 100755 --- a/use_cases/Hurricane_Matthew/WRF/run.sh +++ b/use_cases/Hurricane_Matthew/WRF/run.sh @@ -12,67 +12,44 @@ WRF_DIR="/home/wrfuser/WRF" function main { mkdir -p "${CYCLE_DIR}" - download_case_study_data + cd "${CYCLE_DIR}" link_gfs_vtable run_ungrib - download_sst_data run_geogrid run_metgrid run_real run_wrf } - -function download_case_study_data -{ - wget https://www2.mmm.ucar.edu/wrf/TUTORIAL_DATA/matthew_1deg.tar.gz - tar -xvzf matthew_1deg.tar.gz - rm -f matthew_1deg.tar.gz -} - - function link_gfs_vtable { ln -sf 
"${WPS_DIR}/ungrib/Variable_Tables/Vtable.GFS" Vtable ${WPS_DIR}/link_grib.csh "${CYCLE_DIR}/matthew/*.grib2" } - function run_ungrib { ln -s "${WPS_DIR}/ungrib.exe" . 2>/dev/null ./ungrib.exe } - -function download_sst_data -{ - wget https://www2.mmm.ucar.edu/wrf/TUTORIAL_DATA/matthew_sst.tar.gz - tar -xzvf matthew_sst.tar.gz - rm -f matthew_sst.tar.gz -} - - function run_geogrid { ln -s "${WPS_DIR}"/* . 2>/dev/null ./geogrid.exe } - function run_metgrid { ./metgrid.exe } - function run_real { ln -s "${WRF_DIR}"/test/em_real/* . 2>/dev/null ./real.exe } - function run_wrf { ulimit -s unlimited @@ -80,5 +57,4 @@ function run_wrf mpirun ./wrf.exe } - main From 6e894e83c585476cc29bca941673dd7d1ba11f28 Mon Sep 17 00:00:00 2001 From: Ben Trumbore Date: Thu, 18 Jul 2024 10:17:22 -0400 Subject: [PATCH 36/36] Finalize edits of the tutorial to use config files from new location and add data download steps. --- docs/Users_Guide/matthewjetstream.rst | 49 ++++++++++++++------------- 1 file changed, 26 insertions(+), 23 deletions(-) diff --git a/docs/Users_Guide/matthewjetstream.rst b/docs/Users_Guide/matthewjetstream.rst index 71b80f9..58e868b 100644 --- a/docs/Users_Guide/matthewjetstream.rst +++ b/docs/Users_Guide/matthewjetstream.rst @@ -177,7 +177,7 @@ Define Environment Variables ---------------------------- We will be using some environment variables throughout this exercise to -make sure that we use the same resource names and file paths wherever they are used. +make sure that we refer to the same resource names and file paths wherever they are used. Copy and paste the definitions below into your shell to define the variables before proceeding:: WRF_IMAGE=ncar/iwrf:latest @@ -198,7 +198,7 @@ Create the WRF and METplus Run Folders The simulation is performed using a script that expects to run in a folder where it can create result files. 
The first command below creates a folder (named "wrf") under the user's home directory, and a sub-folder within "wrf" to hold the output of this simulation. -The subfolder is named "20161006_00", which is the beginning date and time of the simulatition. +The subfolder is named "20161006_00", which is the beginning date and time of the simulation. Similarly, a run folder named "metplus" must be created for the METplus process to use:: mkdir -p ${WRF_DIR} @@ -209,7 +209,7 @@ Download Configuration Files Both WRF and METplus require some configuration files to direct their behavior, and those are downloaded from the I-WRF GitHub repository. -Some of those configuration files must then be copied into run folders. +Some of those configuration files are then copied into the run folders. These commands perform the necessary operations:: git clone https://github.com/NCAR/i-wrf ${WORKING_DIR}/i-wrf @@ -236,6 +236,7 @@ then installs Docker:: curl --location https://bit.ly/3R3lqMU > install-docker.sh source install-docker.sh + rm install-docker.sh If a text dialog is displayed asking which services should be restarted, type ``Enter``. When the installation is complete, you can verify that the Docker command line tool works by asking for its version:: @@ -272,35 +273,37 @@ The commands to pull and create the volume are:: docker pull ncar/iwrf:${OBS_DATA_VOL}.docker docker create --name ${OBS_DATA_VOL} ncar/iwrf:${OBS_DATA_VOL}.docker -Download Data -============= +Download Data for WRF +===================== -Text here +To run WRF on the Hurricane Matthew data set, you need to have +several data sets to support the computation. +The commands in these sections download archive files containing that data, +then uncompress the archives into folders. +The geographic data is large and takes several minutes to acquire, +while the other two data sets are smaller and are downloaded directly into the WRF run folder, +rather than the user's home directory. 
-Get the Data Needed by WRF --------------------------- - -To run WRF on the Hurricane Matthew data set, you need a copy of the -geographic data representing the terrain in the area of the simulation. -These commands download an archive file containing that data, -uncompress the archive into a folder named "WPS_GEOG" in your home directory, and delete the archive file. -They take several minutes to complete:: +Get the geographic data representing the terrain in the area of the simulation:: + cd ${WORKING_DIR} wget https://www2.mmm.ucar.edu/wrf/src/wps_files/geog_high_res_mandatory.tar.gz tar -xzf geog_high_res_mandatory.tar.gz rm geog_high_res_mandatory.tar.gz -Get case study data:: +Get the case study data (GRIB2 files):: - wget https://www2.mmm.ucar.edu/wrf/TUTORIAL_DATA/matthew_1deg.tar.gz - tar -xvzf matthew_1deg.tar.gz - rm -f matthew_1deg.tar.gz + cd ${WRF_DIR} + wget https://www2.mmm.ucar.edu/wrf/TUTORIAL_DATA/matthew_1deg.tar.gz + tar -xvzf matthew_1deg.tar.gz + rm -f matthew_1deg.tar.gz -Get SST data:: +Get the SST (Sea Surface Temperature) data:: - wget https://www2.mmm.ucar.edu/wrf/TUTORIAL_DATA/matthew_sst.tar.gz - tar -xzvf matthew_sst.tar.gz - rm -f matthew_sst.tar.gz + cd ${WRF_DIR} + wget https://www2.mmm.ucar.edu/wrf/TUTORIAL_DATA/matthew_sst.tar.gz + tar -xzvf matthew_sst.tar.gz + rm -f matthew_sst.tar.gz Run WRF ======= @@ -325,7 +328,7 @@ The command has numerous arguments and options, which do the following: The simulation initially prints lots of information while initializing things, then settles in to the computation. The provided configuration simulates 48 hours of weather and takes about 12 minutes to finish on an m3.quad Jetstream2 instance. -Once completed, you can view the end of any of the output files to confirm that it succeeded:: +Once completed, you can view the end of an output file to confirm that it succeeded:: tail ${WRF_DIR}/rsl.out.0000