Releases: CrunchyData/crunchy-demo-data

V0.6.5.1

27 Aug 01:51

NOTE:
This data is only the data for the fire scenario (weather and fire locations). You also have to run:
create extension postgis
as a privileged user in your DB BEFORE running the import below.
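
For example, connected to the fire database as a superuser, something along these lines should work (a minimal sketch; IF NOT EXISTS just makes it safe to re-run):

-- run as a privileged user, connected to the target database, before the import
CREATE EXTENSION IF NOT EXISTS postgis;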

Since the fire data was too big to commit, only the dump of the fire boundaries, regions, and units tables contains data. To recreate the full dataset, re-download the data from the sites mentioned in the README.md and follow the steps there.


Data was exported with:

pg_dump -h localhost -U postgres -Fp --compress=9 --no-owner --no-privileges -d fire -t 'fire19*' -t 'weather' -f crunchy-demo-fire.dump.sql.gz

and imports with:

gunzip -c crunchy-demo-fire.dump.sql.gz |psql -h localhost -U postgres -p 5432 fire


This release fixes an issue with dates in the fire19 table, where they were stored as text fields rather than date fields.
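
If you already loaded an earlier dump and would rather convert in place than re-import, a cast along these lines should work (the table and column names here are illustrative, not taken from the dump):

-- hypothetical names; assumes the text values parse cleanly as dates
ALTER TABLE fire19
  ALTER COLUMN discovery_date TYPE date
  USING discovery_date::date;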

Adding fire and weather data

20 Aug 21:33

NOTE:
This data is only the data for the fire scenario (weather and fire locations). You also have to run:
create extension postgis
as a privileged user in your DB BEFORE running the import below.

Since the fire data was too big to commit, only the dump of the fire boundaries, regions, and units tables contains data. To recreate the full dataset, re-download the data from the sites mentioned in the README.md and follow the steps there.

Data was exported with:

pg_dump -h localhost -U postgres -Fp --compress=9 --no-owner --no-privileges -d fire -t 'fire19*' -t 'weather' -f crunchy-demo-fire.dump.sql.gz

and imports with:

gunzip -c crunchy-demo-fire.dump.sql.gz |psql -h localhost -U postgres -p 5432 fire

More permission and schema improvements

15 Jul 20:05

We no longer support individual CSV files with instructions on how to load each file. This data was dumped from a PostgreSQL 11 database.

This release made some minor improvements to permissions and moved all tables into the public schema.
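
For reference, moving a table into the public schema is a single statement per table; for example (schema and table names here are illustrative):

-- illustrative names; repeat for each table that lives outside public
ALTER TABLE old_schema.storm_data SET SCHEMA public;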

Here is the command we used to export the data:

pg_dump -h localhost -U postgres -Fp --compress=9 --no-owner --no-privileges -d workshop -f crunchy-demo-data.dump.sql.gz

and here is an example of the command you can use to restore the database:

gunzip -c crunchy-demo-data.dump.sql.gz |psql -h localhost -U postgres -p 5432 workshop

If you use a non-privileged user, you may have trouble loading the data since the exported database uses PostGIS. Make sure to add PostGIS to your database before importing the data.
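
A minimal sketch of that step, assuming the target database is named workshop (as in the restore command above) and you can connect as a superuser:

-- run while connected to the workshop database, before the import
CREATE EXTENSION IF NOT EXISTS postgis;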

A better dump file

30 Jun 00:40

We no longer support individual CSV files with instructions on how to load each file. This data was dumped from a PostgreSQL 11 database.

This release got rid of the owner and grant/revoke information, making it easier to load into any database without getting stuck with errors or unwanted permissions.

Here is the command we used to export the data:

pg_dump -h localhost -U postgres -Fp --compress=9 --no-owner --no-privileges -d workshop -f crunchy-demo-data.dump.sql.gz

and here is an example of the command you can use to restore the database:

gunzip -c crunchy-demo-data.dump.sql.gz |psql -h localhost -U postgres -p 5432 workshop

Moving to PostgreSQL dump file

31 Dec 01:45

This release is the beginning of our new download format: pg_dump files. We no longer support individual CSV files with instructions on how to load each file. This data was dumped from a PostgreSQL 11 database.

Here is the command we used:

 pg_dump -h localhost -U postgres -Fp --compress=9 -d workshop -f crunchy-demo-data.dump.sql.gz

and here is an example of the command you can use to restore the database:

gunzip -c crunchy-demo-data.dump.sql.gz |psql -h localhost -U postgres -p 5432 workshop

We recommend using the postgres user as this will help avoid permission issues when trying to import the data.

Adding Natural Events data from NASA

02 Aug 23:19

Moved from geometry to geography in PostGIS

08 Apr 21:43

Also fixing minor typos in names and such.
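
For anyone curious what the geometry-to-geography move looks like in practice, the conversion is roughly of this shape (table and column names here are illustrative, not taken from the release):

-- illustrative names; assumes the geometry column is stored in SRID 4326
ALTER TABLE county_boundaries
  ALTER COLUMN geom TYPE geography
  USING geom::geography;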

Initial Release

03 Apr 02:43

This data focuses on public domain or otherwise unencumbered data, so you can use it freely for commercial or non-commercial purposes. To easily meet this requirement we took U.S.A. county-level data. There is spatial, free-text, JSON, and key-value data included.

Inside the released ZIP file there is a directory per dataset. Each directory contains:

  1. The DDL to make the tables and indices. The end of the file will have a \copy command to load the data (see the sketch after this list).
  2. A text file that works with the \copy command to populate the tables.
  3. The codebook for the data provided by the original supplier of the raw data.
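
For illustration, the \copy line at the end of a DDL file looks something like this (the table and file names here are hypothetical):

\copy county_boundaries FROM 'county_boundaries.txt'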

This release only contains the currently processed datasets:

County Boundaries
County Typology
Storm Data
Wikipedia