== Database backup ==

A full database backup is quite simple:

 $ mysqldump -u [username] -p[password] -h [host] [databaseName] >PVLng-$(date +"%Y-%m-%d").sql

It just dumps all data into a date-stamped SQL file.
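The dump can also be compressed on the way out and later fed back into MySQL. Both commands below are a sketch using the same placeholders as above; the file names are examples only:

 $ mysqldump -u [username] -p[password] -h [host] [databaseName] | gzip >PVLng-$(date +"%Y-%m-%d").sql.gz
 $ zcat PVLng-2014-03-06.sql.gz | mysql -u [username] -p[password] -h [host] [databaseName]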
== Backup with API ==

If you are interested in CSV/TSV data as a backup, you can request all data for a channel via the API:

 $ GUID=your_GUID_here
 <nowiki>$ wget -O PVLng-$GUID-$(date +"%Y-%m-%d").csv "http://your.domain.here/api/r3/data/$GUID.csv?start=0&full=1"</nowiki>

Note the quotes around the URL; without them the shell treats <tt>&</tt> as the background operator.

* <tt>start=0</tt> - all data since 1970...
* <tt>full=1</tt> - also extract the readable date+time column
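Since the API serves CSV/TSV data, a tab-separated dump should be available the same way by switching the file extension. This is a sketch under the assumption that the extension selects the output format, as it does for <tt>.csv</tt>:

 $ GUID=your_GUID_here
 ### Assumption: the .tsv extension selects tab-separated output, analogous to .csv
 <nowiki>$ wget -O PVLng-$GUID-$(date +"%Y-%m-%d").tsv "http://your.domain.here/api/r3/data/$GUID.tsv?start=0&full=1"</nowiki>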
+ | |||
+ | As script: | ||
+ | |||
 #!/bin/sh
 
 ### Configure these 3 parameters:
 host="<nowiki>http://your.domain.here</nowiki>"  # WITH http://
 apikey="..."                                     # required for private channels
 destination=~
 # destination=/backup/PVLng/$(date +"%Y-%m-%d")
 
 ### Let's go
 test -d "$destination" || mkdir -p "$destination"
 
 echo "$(date) Start backup ..."
 
 ### The channel list is semicolon-separated
 IFS=";"
 
 ### Read all real channels, but WITHOUT RANDOM channels
 wget -qO - "$host/api/r3/channels.csv" | grep -v "RANDOM " | \
 while read id GUID name dummy; do
     ### Remove text delimiters around the channel name
     name=$(echo "$name" | sed -e 's~"~~g')
 
     echo -n "$(date) $GUID - $name ... "
     ### Quote the URL so the shell does not treat & as the background operator
     wget -q --header="X-PVLng-Key: $apikey" -O "$destination/$GUID-$(date +"%Y-%m-%d").csv" "$host/api/r3/data/$GUID.csv?start=0&full=1"
     echo Done.
 done
 
 echo "$(date) Finished."
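To run this unattended, the script can be called from cron, for example once per night. The script path and log file below are placeholders, not part of PVLng:

 ### Hypothetical crontab entry: run the backup nightly at 03:00
 0 3 * * * /usr/local/bin/pvlng-backup.sh >>/var/log/pvlng-backup.log 2>&1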
[[Category:HowTo]]
[[Category:Example]]