Backup

Database backup

A full database backup is quite simple.

# mysqldump -u [username] -p[password] -h [host] [databaseName] >PVLng-$(date +"%Y-%m-%d").sql

It simply dumps all data into a date-stamped SQL file.
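
Restoring works the same way in the opposite direction; this is only a sketch, the placeholders and the dump file name must of course match your setup:

# mysql -u [username] -p[password] -h [host] [databaseName] <PVLng-YYYY-MM-DD.sql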

Backup with API

If you prefer CSV/TSV data as a backup, you can request all data for a channel via the API:

# GUID=your_GUID_here
# wget -O PVLng-$GUID-$(date +"%Y-%m-%d").csv "http://your.domain.here/api/latest/data/$GUID.csv?start=0&full=1"
  • start=0 - all data since 1970 (timestamp 0)
  • full=1  - also extract the readable date+time column
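
For a private channel the request additionally needs the API key in the X-PVLng-Key header, as used in the script below; quoting the URL also keeps the shell from cutting it off at the &. A minimal sketch with placeholder values (the APIKEY variable name is just an example):

# GUID=your_GUID_here
# APIKEY=your_API_key_here
# wget --header="X-PVLng-Key: $APIKEY" -O PVLng-$GUID-$(date +"%Y-%m-%d").csv "http://your.domain.here/api/latest/data/$GUID.csv?start=0&full=1"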

As a script:

#!/bin/sh

### Configure these 3 parameters:
host="http://your.domain.here"   # WITH http://
apikey="..."                     # required for private channels
destination=~
# destination=/backup/PVLng/$(date +"%Y-%m-%d")

### Let's go
test -d "$destination" || mkdir -p "$destination"

echo $(date) Start backup ...

IFS=";"

### Read all real channels, but WITHOUT RANDOM channels
wget -qO - $host/api/r3/channels.csv | grep -v "RANDOM " | \
while read id GUID name dummy; do
  ### remove text delimiters
  name=$(echo "$name" | sed -e 's~"~~g')

  echo -n "$(date) $GUID - $name ... "
  wget -q --header="X-PVLng-Key: $apikey" -O "$destination/$GUID-$(date +"%Y-%m-%d").csv" "$host/api/r3/data/$GUID.csv?start=0&full=1"
  echo Done.
done

echo $(date) Finished.
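
To run the backup unattended, the script can be called from cron; the path, log file and schedule below are only examples:

# chmod +x /usr/local/bin/pvlng-backup.sh
# crontab -e

### run the backup every night at 03:00
0 3 * * * /usr/local/bin/pvlng-backup.sh >>/var/log/pvlng-backup.log 2>&1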