
# modis-download

This repository contains instructions and examples of how to batch download MODIS files.

## MODIS Basics

"MODIS (or Moderate Resolution Imaging Spectroradiometer) is a key instrument aboard the Terra (originally known as EOS AM-1) and Aqua (originally known as EOS PM-1) satellites ... It has a viewing swath width of 2,330 km and views the entire surface of the Earth every one to two days. Its detectors measure 36 spectral bands between 0.405 and 14.385 µm, and it acquires data at three spatial resolutions -- 250m, 500m, and 1,000m." (NASA)

MODIS provides numerous land, ocean, and atmosphere products. Here we will focus on MODIS land products, but similar procedures can likely be followed for other products. A list of MODIS products can be found here.

A helpful overview of MODIS products can be found here. This link is particularly helpful for understanding the naming convention for MODIS products.

## Downloading MODIS data

There are a number of different ways to access the MODIS data. This is just one method that I prefer.

### Step 1: Create/log in to NASA Earthdata

If you do not have a NASA Earthdata account, then you must first register here.

If you have already created an account, log in.

### Step 2: Find the data you want

The next step is to use the Earthdata search tool to find the data product you want.

You can get to the search tool at https://search.earthdata.nasa.gov/search, or from the Earthdata homepage by clicking "Find Data."

In the top-left corner of the page, type in the MODIS product name, and a list of matching collections will appear to the right. As an example, we will use the MOD13A3 product (monthly vegetation indices).


Once you click on one of the matching collections, you should see a full list of all the product files. In this example, we click on "MODIS/Terra Vegetation Indices Monthly L3 Global 1km SIN Grid V006" from the list of matching collections.


For this particular collection of the MOD13A3 data product there are 75,538 "granules" (i.e., data files). To narrow this down to the desired spatial/temporal scale, we need to filter the results in the next step (Step 3).

### Step 3: Filter the results

If you want the data for a particular time period, you can either click on the small calendar icon or use the "Temporal" filter tool under "Filter Granules" on the left side. For example, we can set the start and stop times such that only files for the year 2016 are selected.


Click "apply" and you should now see only 3,493 granules returned from the search results.


Next, you can apply a spatial filter if desired. One way to do this is to use the spatial extent selection tool.


You can use one of the geometric selection tools on the map to visually select the area of interest. For example, here we use the Rectangle tool to select the Los Angeles basin.


(Tip: you can collapse/expand the search results panel to make the viewable area of the map larger.)


### Step 4, Method 1: Download the data using the NASA-provided script

After applying the spatial and temporal filters in our MOD13A3 example, we are left with 12 granules. It's a good idea to check at this point whether the number of granules makes sense. Since the LA Basin is fully enclosed within a single MODIS tile and MOD13A3 is a monthly product, the final granule count of 12 makes sense. If, for example, the spatial extent of our area of interest were split between two MODIS tiles, we would have ended up with 24 granules instead.


When you are happy with the filtering, click "Download All." A new panel will appear; click "Download Data."


You will then be redirected to a Download Status page, where you have a couple of options for downloading the files. At this point, it is a good idea to check that the list of files is correct, in case a bug in the system generated an incorrect list of file URLs. In our example, there are 12 granules, and from the file names we can see that the correct product (MOD13A3) and tile area (h08v05) are being used.
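For reference, MODIS granule filenames follow the pattern `<product>.A<YYYYDDD>.h<HH>v<VV>.<collection>.<production timestamp>.hdf`, which makes the product and tile easy to spot. A hypothetical example (the production timestamp is illustrative):

```
MOD13A3.A2016001.h08v05.006.2016050000000.hdf
│       │        │      │   └─ production timestamp (YYYYDDDHHMMSS)
│       │        │      └─ collection version
│       │        └─ tile (horizontal 08, vertical 05)
│       └─ acquisition date (year 2016, day-of-year 001)
└─ product short name
```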


If you only have a few files, it could make sense to manually download each file. If you have a substantial number of files, you will likely want to use the Download Script that is provided by the website.


The website provides detailed instructions for both Linux and Windows environments. Here we show how the example files are downloaded in a Linux environment.

First click "Save" and download the shell script file to your preferred directory. You will notice that the shell script file has a funny name. Rename it to "download.sh".
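For example, assuming the downloaded script were saved as `4123816607-download.sh` (a hypothetical name; yours will differ):

```bash
# Rename the downloaded script (the original file name here is hypothetical)
mv 4123816607-download.sh download.sh
```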

You must make the script executable before you can run it. First change to the directory where the file is located, then execute:

```bash
chmod +x download.sh
```

Then run the script:

```bash
./download.sh
```

You will be prompted to enter your username and password for your Earthdata account. Once you enter the credentials correctly, the download should commence in the current directory.
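Once the script finishes, a quick check that everything arrived might look like the following (assuming the 12 monthly granules from our 2016 MOD13A3 example):

```bash
# Count the downloaded HDF granules; we expect 12 for the 2016 MOD13A3 example
ls MOD13A3*.hdf | wc -l
```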


### Step 4, Method 2: Download the data using wget, curl, or aria2c

For this method, you will need to save the list of URLs as a text file, and then use your tool of choice (wget, curl, or aria2c) to download the files.
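As with Method 1, it is worth sanity-checking the URL list before downloading. Assuming you save it as "urls.txt" (the name used in the examples below), a quick check of the product and tile might look like:

```bash
# Both counts should match the number of granules (12 in our example)
grep -c "MOD13A3" urls.txt
grep -c "h08v05" urls.txt
```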


Before downloading, you need to create (or modify) a .netrc file in your home directory that includes your NASA Earthdata login information.

Your .netrc file should contain the following:

```
machine urs.earthdata.nasa.gov
login [insert username]
password [insert password]
```
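As a minimal sketch, the file can be created from the shell as follows (replace the placeholder credentials with your own; restricting the permissions is good practice since the file stores your password in plain text):

```bash
# Create ~/.netrc with Earthdata credentials (placeholders shown) and make it
# readable only by you
cat > ~/.netrc <<'EOF'
machine urs.earthdata.nasa.gov
login YOUR_USERNAME
password YOUR_PASSWORD
EOF
chmod 600 ~/.netrc
```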

Once you have the authentication information in your .netrc file, you should be able to use the tool of your choice to download the files. Below is an example of how you might use aria2c, assuming that you named the text file "urls.txt" and placed it in the current directory:

```bash
aria2c -i urls.txt -j8
```

The "-j8" flag indicates that 8 files will be downloaded concurrently. This is helpful for downloading larger batches of files.
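If a long download is interrupted, re-running aria2c with the `-c` (continue) flag resumes partially downloaded files; for example:

```bash
# Resume partial downloads and retry failed transfers up to 5 times each
aria2c -i urls.txt -j8 -c --max-tries=5
```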

Examples of how to use curl or wget (the `-n` flag tells curl to read your credentials from .netrc, and `-L` makes it follow the redirect to the Earthdata login service; wget picks up the .netrc credentials automatically):

```bash
xargs -n 1 curl -n -L -O < urls.txt
wget -i urls.txt
```
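Authenticating separately for every file can be slow for large batches. A variant that reuses the Earthdata session cookies between requests (the `~/.urs_cookies` path is just a conventional choice) might look like:

```bash
# Create the cookie file if it does not already exist
touch ~/.urs_cookies

# curl: use .netrc (-n), follow redirects (-L), reuse session cookies (-b/-c)
xargs -n 1 curl -n -L -O -b ~/.urs_cookies -c ~/.urs_cookies < urls.txt

# wget: equivalent cookie handling
wget --load-cookies ~/.urs_cookies --save-cookies ~/.urs_cookies \
     --keep-session-cookies -i urls.txt
```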