ETL job to extract data from an iSante Plus database into a CSV file ready to be uploaded to an iSante instance.

These instructions will help you run the Pentaho Data Integration (PDI) job that generates the CSV file.
- Download and install Pentaho Data Integration (PDI) version 8 (follow the PDI installation procedure; requires Java). These steps were validated on version 8.2.0;
- The Pentaho DI Community Edition can be found here.
- Download the MySQL connector (Connector/J) version 5;
- Install curl;
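Before going further, it can help to confirm the prerequisites are on your `PATH`. A minimal sketch (the `check` helper below is hypothetical, not part of this repository):

```shell
# Report whether each required tool is installed.
check() {
  command -v "$1" >/dev/null 2>&1 && echo "$1: found" || echo "$1: MISSING"
}

check java   # PDI 8.x requires a Java runtime
check curl   # used by the upload script
check unzip  # needed to extract the PDI archive
```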
- Extract the downloaded file into a directory of your choice, `${pdi-install-dir}`;
- Copy the MySQL connector jar file to `${pdi-install-dir}/data-integration/lib`;
- Open a terminal in `${pdi-install-dir}/data-integration` and execute `./spoon.sh`;
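The extraction and setup steps above can be sketched as a shell session. The install directory, archive name, and jar name below are assumptions; substitute your own download locations:

```shell
# Assumed locations -- adjust to where you downloaded the archives.
PDI_INSTALL_DIR="$HOME/pdi"                                  # your ${pdi-install-dir}
PDI_ZIP="$HOME/Downloads/pdi-ce-8.2.0.0.zip"                 # hypothetical archive name
MYSQL_JAR="$HOME/Downloads/mysql-connector-java-5.1.49.jar"  # hypothetical jar name

# Extract PDI into the chosen directory
mkdir -p "$PDI_INSTALL_DIR"
unzip -q "$PDI_ZIP" -d "$PDI_INSTALL_DIR"

# Make the MySQL driver visible to PDI
cp "$MYSQL_JAR" "$PDI_INSTALL_DIR/data-integration/lib/"

# Launch Spoon, the PDI graphical designer
cd "$PDI_INSTALL_DIR/data-integration"
./spoon.sh
```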
- Close PDI and edit the file `~/.kettle/kettle.properties`, adding the required database variables; you can take inspiration from the file https://github.com/edrisse/isante/blob/master/etc/sample_kettle.properties in this repository;
- Edit the file `isante/etc/uploadAllfiles.sh`, adding the consolidated server IP;
- Run the https://github.com/edrisse/isante/blob/master/sql/iSante-plus.sql SQL script on your iSante Plus database;
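A `kettle.properties` fragment might look like the sketch below. The variable names here are illustrative only; use the actual names defined in `etc/sample_kettle.properties` in this repository:

```properties
# ~/.kettle/kettle.properties -- illustrative variable names only;
# copy the real ones from etc/sample_kettle.properties.
DB_HOST=localhost
DB_PORT=3306
DB_NAME=isanteplus
DB_USER=etl_user
DB_PASS=changeme
```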
- Open PDI again, click Open (Ctrl+O), and open the `extract-data.kjb` job;
- Click `Run` (the button with a play icon);
- Disable the `Gather performance statistics` option to make the execution faster;
- Click `Run`;
- Once the execution of the job completes, the result will be stored in the file `~/isante/data-post-processed.csv`.
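As an alternative to the Spoon GUI, PDI jobs can also be run headless with Kitchen, PDI's command-line job runner. A sketch, assuming the same install directory as above and that the job file lives in `~/isante`:

```shell
# Run the extraction job without opening Spoon (paths are assumptions).
cd "$HOME/pdi/data-integration"
./kitchen.sh -file="$HOME/isante/extract-data.kjb" -level=Basic

# On success, inspect the first rows of the generated CSV:
head -n 5 ~/isante/data-post-processed.csv
```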