Commit 1dd0b87: "further edits, removing reference to curl"

1 parent: ba6dc6e

2 files changed

File tree:

Downloading_ECCO_datasets_from_PODAAC/Tutorial_wget_Command_Line_HTTPS_Downloading_ECCO_Datasets_from_PODAAC.md

Lines changed: 9 additions & 7 deletions
````diff
@@ -37,24 +37,24 @@ machine urs.earthdata.nasa.gov
 $ chmod 0600 ~/.netrc
 ```
 
-3. Create an ```urs_cookies``` "cookie" file. This will be used to persist sessions across individual cURL/Wget calls, making it more efficient.
+3. Create an ```urs_cookies``` "cookie" file. This will be used to persist sessions across individual _wget_ calls, making it more efficient.
 
 ```
 > cd ~
 > touch .urs_cookies
 ```
 
-
 ## Step 3: Prepare a list of granules (files) to download
 
-Now the only step that remains is to get a list of URLs to pass to *wget* or *curl* for downloading. There's a lot of ways to do this -- even more so for ECCO datasets data because the files/datasets follow well-structured naming conventions -- but we will rely on Earthdata Search to do this from the browser for the sake of simplicity.
+Now the only step that remains is to get a list of URLs to pass to *_wget_* for downloading. There's a lot of ways to do this -- even more so for ECCO datasets data because the files/datasets follow well-structured naming conventions -- but we will rely on Earthdata Search to do this from the browser for the sake of simplicity.
 
 **1. Find the collection/dataset of interest in Earthdata Search.**
 
 Start from this [complete list of ECCO collections](https://search.earthdata.nasa.gov/portal/podaac-cloud/search?fpj=ECCO) in Earthdata Search, and refine the results until you see your dataset of interest.
 
 In this example we will download all of the granules for the collection [ECCO Version 4 Release 4 (V4r4) monthly sea surface height on a 0.5 degree lat-lon grid](https://search.earthdata.nasa.gov/portal/podaac-cloud/search/granules?p=C1990404799-POCLOUD).
 
+
 **2. Choose your collection, then click the green *Download All* button on the next page.**
 
 Click the big green button identified by the red arrow/box in the screenshot below.
@@ -63,16 +63,18 @@ Click the big green button identified by the red arrow/box in the screenshot bel
 
 That will add all the granules in the collection to your "shopping cart" and then redirect you straight there and present you with the available options for customizing the data prior to download. In this example we ignore the other download options those because they are in active development.
 
-<img src="https://raw.githubusercontent.com/ECCO-GROUP/ECCO-ACCESS/master/PODAAC/Images/edsc2.png" width="70%" />
-<center><i>The screenshot above shows the download customization interface (i.e. "shopping cart")</i></center>
-
 **3. Click *Download Data* to get your list of download urls (bottom-left, another green button)**
 
 The *Download Data* button takes you to one final page that provides the list of urls from which to download the files matching your search parameters and any customization options that you selected in the steps that followed. This page will be retained in your User History in case you need to return to it later.
 
+<img src="https://raw.githubusercontent.com/ECCO-GROUP/ECCO-ACCESS/master/PODAAC/Images/edsc2.png" width="70%" />
+<center><i>The screenshot above shows the download customization interface (i.e. "shopping cart")</i></center>
+
+There are several ways that you could get the list of urls into a text file that's accessible from Jupyter or your local shell. I simply clicked the **Save** button in my browser and downloaded them as a text file. (You could also copy them into a new notebook cell and write them to a file like we did with the ```netrc``` file above.)
+
 <img src="https://raw.githubusercontent.com/ECCO-GROUP/ECCO-ACCESS/master/PODAAC/Images/edsc3.png" width="70%" />
 
-There are several ways that you could get the list of urls into a text file that's accessible from Jupyter or your local shell. I simply clicked the **Save** button in my browser and downloaded them as a text file. (You could also copy them into a new notebook cell and write them to a file like we did with the ```netrc``` file above.). As of 2021-06-25 the option "Download Script" also produces a functioning script for batch downloading.
+> **Note!** _As of 2021-06-25 the option "Download Script" also produces a functioning script for batch downloading._
 
 ### Step 3: Download files in a batch with GNU *_wget_*
 
````
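The diff's final heading, "Download files in a batch with GNU *_wget_*", names the step where the ```~/.netrc``` credentials, the ```.urs_cookies``` session file, and the saved URL list come together; the diff cuts off before showing it. As a rough sketch only, not the tutorial's own text: a typical GNU *wget* invocation might look like the following, assuming the URL list from Earthdata Search was saved as ```urls.txt``` (a hypothetical filename).

```shell
# Sketch of a batch download with GNU wget. All flags are standard GNU
# wget options; urls.txt is an assumed name for the saved URL list.
URL_LIST=urls.txt
# wget reads Earthdata Login credentials from ~/.netrc automatically;
# the cookie options persist one session across the individual requests.
WGET_OPTS="--load-cookies $HOME/.urs_cookies --save-cookies $HOME/.urs_cookies --keep-session-cookies --no-clobber"
# Print the full command for inspection; remove 'echo' to run it.
echo wget $WGET_OPTS --input-file "$URL_LIST"
```

Here ```--no-clobber``` makes the command safe to re-run after an interrupted transfer, since files that already downloaded are skipped.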

0 commit comments
