This blog post is part of a multi-part series. Before reading this blog post, make sure you take a look at my previous post, How to Backup a cPanel Website. You might also want to look at How to Backup a cPanel website to Amazon S3.
Now let’s look at how to back up a cPanel website to Google Cloud. Transferring backups to Google Cloud Storage is only possible in the latest cPanel backup configuration, not in the legacy backup system. If you haven’t configured your backups in a while, take a look at how they’ve been improved.
As I mentioned previously, you should specify daily, weekly, and monthly backups with different retention periods. When directing those backups to the cloud, each of those backups will take up space, and keeping weekly and monthly backups longer gives you a much longer retention period.
Backing up to Google Storage
Backing up to Google Cloud is not integrated into cPanel the way Amazon S3 is. But there is a way, and it’s not that hard. You probably already have one of the things you need to make this happen: a Google account. Just go to the Google Cloud Storage website and log in with your Google account. (Or, if you want, use a separate account for this purpose.)
If this is your first time logging into Google Cloud Storage, it will require you to give your full contact information and enter a valid credit card for billing. Then search the console for Bucket. Google will then ask you to enable billing.
Set Up a Google Bucket
Once you’re at the Google Cloud Storage screen, click Create Bucket. You then need to come up with a name that is unique across all buckets in Google Cloud Storage. The name doesn’t really matter; it just needs to be unique. If you pick a name someone else has already taken, you’ll get an error message telling you so.
Once you pick a valid bucket name, you need to select the region. With Google, your choice here is a bit more obvious than with Amazon’s.
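The console enforces the bucket-naming rules for you, but if you ever script this step, the basic rules (3–63 characters; lowercase letters, digits, dashes, dots, and underscores; starting and ending with a letter or digit) can be checked with a small sketch like this. The function name is my own invention:

```shell
# Simplified check of Google Cloud Storage bucket-naming rules:
# 3-63 characters, lowercase letters/digits/dashes/dots/underscores,
# beginning and ending with a letter or digit.
valid_bucket_name() {
  local name=$1
  [ "${#name}" -ge 3 ] && [ "${#name}" -le 63 ] &&
    printf '%s\n' "$name" | grep -Eq '^[a-z0-9][a-z0-9._-]*[a-z0-9]$'
}

valid_bucket_name curtistest && echo "curtistest looks valid"
valid_bucket_name 'Bad_Name' || echo "Bad_Name is rejected (uppercase)"
```

This is only a rough approximation of Google’s full rules (which also forbid names that look like IP addresses, for example), but it catches the common mistakes.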
Create Google Credentials
Now you need an access key, which allows a particular user to log in to Google Cloud Storage. To do this, you’ll need to create what Google calls a Service Account. Search for Credentials in the console and click on it. On the Credentials page, click Create credentials, then Service account key.
Now you’re presented with a dialog box where you need to give the service account a name and a role. The most obvious role here is Storage Admin. Leave JSON as the key type.
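If you’d rather script this step, the same service account can be created with the gcloud CLI. This is a sketch: the project ID and account name below are placeholders of my own, and the role is the same Storage Admin role mentioned above:

```shell
# Placeholders -- substitute your own project ID and account name.
PROJECT=my-project-id
SA=cpanel-backup

# Create the service account.
gcloud iam service-accounts create "$SA" --display-name "cPanel backups"

# Grant it the Storage Admin role on the project.
gcloud projects add-iam-policy-binding "$PROJECT" \
  --member "serviceAccount:${SA}@${PROJECT}.iam.gserviceaccount.com" \
  --role roles/storage.admin

# Create and download a JSON key, equivalent to the console download.
gcloud iam service-accounts keys create gcreds.json \
  --iam-account "${SA}@${PROJECT}.iam.gserviceaccount.com"
```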
Once you click Create, a JSON key file will be downloaded to your local PC. You then need to transfer that file to the cPanel server you want to back up.
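One way to get the key over is scp. The downloaded file has an auto-generated name; I’m assuming here that you rename it to /root/gcreds.json, the path used in the fstab entry later, and the hostname is a placeholder:

```shell
# Copy the downloaded key to the cPanel server (hostname is a placeholder).
scp downloaded-key.json root@server.example.com:/root/gcreds.json

# Lock down permissions -- this key grants full access to your storage.
ssh root@server.example.com 'chmod 600 /root/gcreds.json'
```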
Mount Google Cloud Storage as a remote filesystem
Since there is no direct connection to Google Cloud Storage in cPanel, you need to use Google Fuse (gcsfuse) to be able to mount Google Cloud as a filesystem. To do this, follow the instructions here to install and authenticate gcsfuse on your system.
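For reference, on the CentOS systems cPanel runs on, the install boils down to adding Google’s yum repository and installing the package. This is a sketch based on the gcsfuse docs; check them for the current repository details:

```shell
# Add Google's gcsfuse yum repository (CentOS/RHEL 7 shown).
cat > /etc/yum.repos.d/gcsfuse.repo <<'EOF'
[gcsfuse]
name=gcsfuse (packages.cloud.google.com)
baseurl=https://packages.cloud.google.com/yum/repos/gcsfuse-el7-x86_64
enabled=1
gpgcheck=1
repo_gpgcheck=1
gpgkey=https://packages.cloud.google.com/yum/doc/yum-key.gpg
       https://packages.cloud.google.com/yum/doc/rpm-package-key.gpg
EOF

# Install gcsfuse.
yum install -y gcsfuse
```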
Once gcsfuse is installed, you need to put an entry in /etc/fstab for it. The following entry mounts the curtistest bucket at /tbackups, using the file /root/gcreds.json for authentication.
curtistest /tbackups gcsfuse rw,noauto,user,key_file=/root/gcreds.json
Assuming you have created the /tbackups directory, you can just tell Linux to mount the drive.
# mount /tbackups
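Since a fuse mount can drop without warning, it’s worth verifying that the mount is live before each backup window, rather than letting backups silently land on the local disk. A minimal check, using the same /tbackups path:

```shell
# Returns success if the given path is a current mount point,
# by checking the second field of /proc/mounts.
is_mounted() {
  awk -v m="$1" '$2 == m { found = 1 } END { exit !found }' /proc/mounts
}

if is_mounted /tbackups; then
  echo "/tbackups is mounted"
else
  echo "/tbackups is NOT mounted -- run: mount /tbackups"
fi
```

You could run something like this from cron shortly before your scheduled backup time.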
Configure cPanel Backups to use the mounted Google Cloud Storage bucket
In the WHM console, click Backup Configuration and scroll to the bottom, looking for Additional Destinations. Select Additional Local Directory as the Destination Type and click Create new destination.
In the configuration page for this destination, give it a name and select the box that says Transfer System Backups to Destination. (It says you should only do this if your connection is encrypted. It is unclear right now whether this connection is encrypted.)
One final choice is whether or not to leave the backups on the local drive for quick restores. Feel free to do that if you have the space. To do that, just click Retain backups in the default backup directory.
What about Coldline?
I initially thought I would be transferring backups to Coldline. I would be restoring backups far less often than I would be making them (hopefully), and that sounded perfect for Coldline. However, Coldline has a minimum retention period of 90 days. So while it is less expensive per gigabyte, it would actually cost me more money, since it would force me to pay for every backup for 90 days. Therefore, all of my backups go directly to Multi-Regional Cloud Storage. If your backup configuration is different, and the 90-day minimum policy isn’t a problem for you, you can configure your bucket to store data on Coldline.
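To make the 90-day math concrete with purely illustrative prices (not Google’s actual rates): suppose standard storage costs $0.026/GB-month, Coldline costs $0.007/GB-month, and a daily backup is deleted after 7 days. The daily backup is billed for 7 days on standard storage but for the full 90-day minimum on Coldline:

```shell
# Illustrative prices only, not actual Google rates.
awk 'BEGIN {
  std  = 0.026 * 7 / 30    # billed for the 7 days it exists
  cold = 0.007 * 90 / 30   # billed for the 90-day minimum
  printf "standard: $%.4f/GB, coldline: $%.4f/GB\n", std, cold
}'
# prints: standard: $0.0061/GB, coldline: $0.0210/GB
```

Under these assumed prices, the short-lived daily backups cost over three times as much on Coldline, even though it is the cheaper class per gigabyte.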