Backing up Files to Google Drive via the CLI

I’ve written before about backing up files using the gdrive client. The gdrive client has been updated since then (now version 2) and I’ve updated my backup scripts, so I figured I should share them.

Setting up gdrive 2

It’s not too different from before. First, pull down the binary for your operating system (in my case the gdrive-linux-x64 binary):

wget -O gdrive "https://docs.google.com/uc?id=0B3X9GlR6EmbnQ0FtZmJJUXEyRTA&export=download"

Mark it executable:

chmod +x gdrive

And put it somewhere useful for executing:

mv gdrive /bin/
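
At this point it’s worth checking the binary is actually on your PATH and runs; gdrive 2 has a version subcommand that should print the version number if everything is in place:

gdrive version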

Then, to set it up (using the account that you’re going to be running backups under), run:

gdrive about

This will prompt you to open a link in your browser, grant the application permission, and paste the resulting token back into the terminal.
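
Once the token is accepted, gdrive stores the credentials for the current user. A quick sanity check (using gdrive 2’s list subcommand, which also shows the file IDs you’ll want later for the --parent flag) is to list a few files from the account:

gdrive list --max 5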

Backing up Docker Data

I use Docker for a fair few of my homelab services now, and I’ve set up my environment to hold all configuration and persistent data within a single directory, /var/docker/.

I also now use 7zip to back up my directories instead of tar, as it’s cross-platform and compresses and encrypts in a single step. You can install it on CentOS 7 with yum install p7zip. Here’s my dockerbackup.sh file:

#!/bin/bash

cd /var/docker/compose

# stop the containers
docker-compose stop

# zip them up encrypted (yes, the password has to be after the -p switch with no space)
7za a -t7z -mx2 -pAReallyStrongPassword /tmp/docker-backup-$(date '+%Y-%m-%d').7z /var/docker/

# start the containers
docker-compose start

# upload to google drive, with the parent folder ID set
gdrive upload --parent 09KVkjJK83exTfkQ3cEJKFDSkjvadkdghJKDGKnRGJKVD932jkaknvKDVR28 --delete /tmp/docker-backup-$(date '+%Y-%m-%d').7z

exit 0

The --delete flag will clean up the file after it’s been uploaded, so we don’t have to worry about cleaning up after the backup.
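
The long string after --parent is the ID of the destination folder in Google Drive. You can grab it from the end of the folder’s URL in the web UI, or (assuming gdrive 2’s --query flag on the list command, which takes Drive search syntax) search for folders from the terminal:

gdrive list --query "mimeType = 'application/vnd.google-apps.folder' and trashed = false"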

The downtime of the containers really depends on how large the persistent data is. For me the compressed file ends up being about 2GB and the stop-compress-start cycle takes about 10 minutes, which is an acceptable downtime for me. The actual uploading happens after services are restored, so I don’t care how long that takes.
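
If you want to measure that window on your own setup, a minimal sketch (reusing the same paths and password as dockerbackup.sh above) is to record timestamps either side of the stop-compress-start section:

#!/bin/bash
# rough measurement of container downtime during the backup
start=$(date +%s)
cd /var/docker/compose
docker-compose stop
7za a -t7z -mx2 -pAReallyStrongPassword /tmp/docker-backup-$(date '+%Y-%m-%d').7z /var/docker/
docker-compose start
end=$(date +%s)
echo "containers were down for $(( (end - start) / 60 )) minutes"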

I add this line to /etc/crontab to run the backup at 6am every Monday, Wednesday and Friday (an online crontab explainer makes entries like this easier to read):

0 6 * * 1,3,5 root /scripts/dockerbackup.sh
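
Cron discards script output by default, so if you want a record of each run you can optionally redirect the script’s output to a log file of your choosing in the same entry:

0 6 * * 1,3,5 root /scripts/dockerbackup.sh >> /var/log/dockerbackup.log 2>&1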

And that’s it. Super easy.

Backing up Confluence Data

I don’t have Confluence running as a container yet, so my Confluence backup happens separately, but it’s fundamentally the same:

#!/bin/bash

# stop confluence to prevent corruption
systemctl stop confluence

# dump the confluence database
pg_dump confluence -U confluenceadmin -h localhost > /tmp/confluence.sql

# zip up the data directory and the database dump, encrypted
7za a -t7z -mx2 -pAReallyStrongPassword /tmp/confluence-backup-$(date '+%Y-%m-%d').7z /var/atlassian/application-data/confluence/ /tmp/confluence.sql

# clean up the database dump
rm -f /tmp/confluence.sql

# start confluence
systemctl start confluence

# upload to google drive, with the parent folder ID set
gdrive upload -p 09KVkjJK83exTfkQ3cEJKFDSkjvadkdghJKDGKnRGJKVD932jkaknvKDVR28 --delete /tmp/confluence-backup-$(date '+%Y-%m-%d').7z

exit 0

Just like on my Docker host, I have the backup script executed by cron. To prevent the two machines from hogging the internet connection at the same time, though, I’ve set them to run on alternating days; this one runs at 6am on Tuesday, Thursday and Saturday:

0 6 * * 2,4,6 root /scripts/confluencebackup.sh
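
To check that the backups are actually landing in the right place, you can list the contents of the destination folder by its ID (the same folder ID used in the upload commands, with Drive’s "in parents" query syntax):

gdrive list --query "'09KVkjJK83exTfkQ3cEJKFDSkjvadkdghJKDGKnRGJKVD932jkaknvKDVR28' in parents"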

Now, with these backup scripts, I get easy-to-organise archives going up to my cloud storage with minimal effort. As they’re in the 7zip format, I can use the same commands on Windows for backing up directories, and I can explore the archives for individual files on any platform that has 7zip installed.
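
Restoring is just the reverse. As a quick sketch (the file ID and archive name here are placeholders for whichever dated backup you want, and the password is the one baked into the scripts), download the archive by its ID, then list or extract it with 7za:

# download the archive from Drive, then inspect or extract it
gdrive download <file-id>
7za l -pAReallyStrongPassword docker-backup-YYYY-MM-DD.7z
7za x -pAReallyStrongPassword docker-backup-YYYY-MM-DD.7z -o/tmp/restore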