This simple Python script is meant to make it easier to export SRS data for debugging and analysis purposes.
The script runs a query against either the raw or the aggregated database. For the raw database it queries the single_data table by default, but with the --old (or -O) option it uses the single_data_old table instead. The current table of the aggregated database is queried by passing the --aggregate flag.
It requires the following environment variables to be set:
- SRS_EXPORTER_HOST (default is 127.0.0.1)
- SRS_EXPORTER_DB_RAW (default is srs_raw_db)
- SRS_EXPORTER_DB_AGG (default is srs_agg_db)
- SRS_EXPORTER_USER (default is postgres)
- SRS_EXPORTER_PASS (default is postgres)
- SRS_EXPORTER_PORT (default is 5432)
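As a reference, the snippet below is a minimal sketch (not taken from the script itself) of how these variables could be resolved into connection settings with the documented defaults; the db_config helper is purely illustrative:

```python
import os

def db_config():
    """Illustrative only: resolve the SRS_EXPORTER_* variables using the
    documented defaults (this is not the exporter's actual code)."""
    return {
        "host": os.environ.get("SRS_EXPORTER_HOST", "127.0.0.1"),
        "db_raw": os.environ.get("SRS_EXPORTER_DB_RAW", "srs_raw_db"),
        "db_agg": os.environ.get("SRS_EXPORTER_DB_AGG", "srs_agg_db"),
        "user": os.environ.get("SRS_EXPORTER_USER", "postgres"),
        "password": os.environ.get("SRS_EXPORTER_PASS", "postgres"),
        "port": int(os.environ.get("SRS_EXPORTER_PORT", "5432")),
    }

if __name__ == "__main__":
    print(db_config())
```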
The script's usage is quite straightforward:
Usage:
exporter.py [OPTIONS]
Usage examples:
exporter.py -l 10
exporter.py --longitude 12.92290593 --latitude 43.74830223 -d 1000
exporter.py -T 56237 --meta
exporter.py -a -l 10
exporter.py -O -A 2018-04-01 --before 2018-04-02
Options:
-d --distance <int> Distance in meters used for range queries
(Default: 100).
-t --latitude <float> Latitude used for range queries.
-g --longitude <float> Longitude used for range queries.
-A --after <datetime> Selected rows have to be created after this
specific datetime value.
-B --before <datetime> Selected rows have to be created before this
specific datetime value.
-T --track <track_id> Track id of selected rows.
-m --meta If specified, rows will be exported with their
track's metadata (NOTE: this causes a JOIN).
-o --output <filename> Filename/path where results have to be written.
-a --aggregate If specified, data will be queried from
the "current" table in the aggregate database.
-O --old If specified, data will be queried from
the "single_data_old" table in the raw database.
-l --limit <int> Limit the number of rows returned by
the query.
--debug Print debug information.
-h --help Print this help.
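Since the script is intended for debugging and analysis, the exported file can be inspected with the usual Python tooling. The following is a rough sketch (the output file name is arbitrary, the export is assumed to be CSV as in the Docker example below, and pandas is assumed to be installed) that runs the exporter and loads the result:

```python
import subprocess

import pandas as pd  # assumed to be available in the analysis environment

# Export up to 1000 rows from the raw database to a CSV file
# (the file name is arbitrary).
subprocess.run(
    ["python", "exporter.py", "-l", "1000", "-o", "export.csv"],
    check=True,
)

# Inspect the export; the available columns depend on the queried table
# and on whether --meta was passed.
df = pd.read_csv("export.csv")
print(df.head())
print(df.describe())
```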
Please consider running this script within a Docker container (see the smartroadsense/dataexporter Docker image).
When used within the Docker container mentioned above, please note that the script is located in the /tmp/
directory and should be executed as follows in order to write its results to the shared /data
Docker volume:
$ python exporter.py -o /data/data.csv [OTHER OPTIONS]
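For instance, a simplified invocation could look like the following; the host volume path, the environment values, and the assumption that the image accepts an explicit command are illustrative and have to be adapted to your setup:
$ docker run --rm \
    -e SRS_EXPORTER_HOST=<db-host> \
    -v $(pwd)/export:/data \
    smartroadsense/dataexporter \
    python /tmp/exporter.py -o /data/data.csv -l 10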