Backup
Database backup
A full database backup is quite simple.
# mysqldump -u [username] -p[password] -h [host] [databaseName] >PVLng-$(date +"%Y-%m-%d").sql
It just dumps all data into a date-stamped SQL file.
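To restore such a dump later, feed it back into mysql; the target database must already exist. The compressed variant is an added suggestion, not part of the original page; both lines reuse the placeholders from above:

# mysql -u [username] -p[password] -h [host] [databaseName] <PVLng-[date].sql
# mysqldump -u [username] -p[password] -h [host] [databaseName] | gzip >PVLng-$(date +"%Y-%m-%d").sql.gz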
Backup with API
If you are interested in CSV/TSV data as a backup, you can request all data for a channel via the API:
# GUID=your_GUID_here
# wget -O PVLng-$GUID-$(date +"%Y-%m-%d").csv "http://your.domain.here/api/latest/data/$GUID.csv?start=0&full=1"
- start=0 - all data since 1970 (the Unix epoch), i.e. everything
- full=1 - also extract the readable date+time column
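Since start=0 means "since 1970", the parameter apparently takes a Unix timestamp. Assuming that holds (not confirmed here against the API documentation), an incremental backup of just the last 24 hours could look like this, using GNU date:

# GUID=your_GUID_here
# wget -O PVLng-$GUID-incr-$(date +"%Y-%m-%d").csv "http://your.domain.here/api/latest/data/$GUID.csv?start=$(date -d yesterday +%s)&full=1"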
As a script that backs up all channels in one go:
#!/bin/sh

### Configure these 3 parameters:
host="http://your.domain.here"   # WITH http://
apikey="..."                     # required for private channels
destination=~
# destination=/backup/PVLng/$(date +"%Y-%m-%d")

### Let's go
test -d "$destination" || mkdir -p "$destination"

echo "$(date) Start backup ..."

IFS=";"

### Read all real channels, but WITHOUT RANDOM channels
wget -qO - "$host/api/latest/channels.csv" | grep -v "RANDOM " | \
while read -r id GUID name dummy; do
    ### remove text delimiters
    name=$(echo "$name" | sed -e 's~"~~g')

    echo -n "$(date) $GUID - $name ... "
    ### Quote the URL, otherwise the shell treats & as "run in background"
    wget -q --header="X-PVLng-Key: $apikey" \
         -O "$destination/$GUID-$(date +"%Y-%m-%d").csv" \
         "$host/api/latest/data/$GUID.csv?start=0&full=1"
    echo Done.
done

echo "$(date) Finished."
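To run the backup unattended, the script can be scheduled via cron. A hypothetical crontab entry; path, schedule and log file are placeholders:

# Back up all channels every night at 02:30
30 2 * * * /usr/local/bin/pvlng-backup.sh >>/var/log/pvlng-backup.log 2>&1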