Autobackup is an automated backup system that retrieves data from remote servers and stores it on the backup server. In our case, we can use a Vultr Storage Instance, which offers large amounts of disk space, to back up all our Compute Instances and prevent data loss.
You can easily install Autobackup on your system using git. Go ahead and clone the repository:

mkdir -p /opt/
git clone https://github.com/fbrandstetter/Autobackup.git /opt/autobackup/
Before we can start backing up data from our Compute Instances, we need to gain access to them. For that, we'll create an SSH key on our Storage Instance and grant it access to all Compute Instances. We'll start by creating the key:
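A sketch of the key generation step, assuming an RSA key at the default path (which matches the ~/.ssh/id_rsa.pub file referenced below); leave the passphrase empty so the backup script can log in unattended:

```shell
# Generate an RSA key pair on the Storage Instance.
# Accept the default file location and leave the passphrase empty
# so backups can run without manual input.
ssh-keygen -t rsa -b 4096
```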
Now we have to copy our public key to the Compute Instances. Open the file ~/.ssh/id_rsa.pub and copy its contents to each Compute Instance's authorized keys file.
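One way to do this from the Storage Instance (192.0.2.10 is a placeholder address; ssh-copy-id assumes password authentication is still enabled on the target):

```shell
# Copy the Storage Instance's public key to a Compute Instance.
# Replace 192.0.2.10 with your Compute Instance's address.
ssh-copy-id root@192.0.2.10

# Or append the key manually:
cat ~/.ssh/id_rsa.pub | ssh root@192.0.2.10 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'
```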
If you don't currently use public keys to access your Compute Instances, you have to enable the authorized keys file in the SSH server configuration first. Open the file /etc/ssh/sshd_config on the Compute Instances and uncomment the following line:
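On a default OpenSSH installation, the relevant directive is usually the following (this is an assumption; the exact line may vary between distributions):

```
AuthorizedKeysFile .ssh/authorized_keys
```

Make sure PubkeyAuthentication is also set to yes, and reload the SSH daemon afterwards (e.g. systemctl reload sshd) so the change takes effect.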
Once you have added the Storage Instance's SSH key to all Compute Instances, you can go ahead and try connecting to one of them (to avoid any issues later, make sure the connection works for all servers):
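For example (192.0.2.10 is a placeholder for one of your Compute Instances):

```shell
# Test the passwordless connection from the Storage Instance.
ssh root@192.0.2.10
```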
You should be able to log in without being asked for a password or anything else.
Autobackup also requires some configuration to function properly. Open the /opt/autobackup/backup.sh file, as all configuration is stored in the bash script itself. Take a look at the following lines and adapt them to fit your needs:
BACKUPDIR=""
PASSWORD=""
FREEUPSPACE=""
MAXUSED=""
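A hypothetical filled-in configuration might look like the following; the meanings of FREEUPSPACE and MAXUSED are assumptions based on their names, so check the comments in backup.sh for the exact semantics:

```
BACKUPDIR="/backup"                          # where the encrypted backups are stored
PASSWORD="use-a-long-random-passphrase"      # encryption password; keep it secret
FREEUPSPACE="yes"                            # assumption: remove old backups when space runs low
MAXUSED="90"                                 # assumption: disk usage threshold in percent
```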
All servers to back up are stored in the /opt/autobackup/serverlist.template file using the following format:
<SERVER_HOSTNAME OR IP>|<USERNAME FOR AUTHENTICATION>|<EXCLUDE LIST>
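For example, a hypothetical entry (placeholder address and exclude-list name):

```
203.0.113.10|root|webserver
```

This backs up 203.0.113.10 as root, using the server-specific exclude list named webserver.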
By default, Autobackup backs up the entire server, which means it tries to download / recursively. Because some people don't need the entire system backed up, you can add global excludes (which apply to every server) and server-specific excludes, which apply only to specific servers. All global excludes are stored in the file /opt/autobackup/default-excludes.template, which is prefilled with /dev. You can add new folders and file extensions there by simply adding new lines:
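For example, you might extend the file like this (the entries beyond /dev are suggestions, not defaults):

```
/dev
/proc
/sys
*.tmp
```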
Because most people run different types of servers (e.g. web servers and database servers), there are unique exclude lists for each server. The format of the server-specific exclude files is the same as the global one. You can create a new file and name it to match the exclude list you set for the server in the server list. If you don't want any exclude list for a server, set the field to empty in the server list. The file called empty was already downloaded with the repository clone; it is empty so that no extra directories or anything else are excluded, while the default excludes still take effect.
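As a hypothetical example, an exclude list for a web server, named webserver to match the exclude-list field in serverlist.template, could be created like this (the excluded paths are examples only):

```shell
# Create a server-specific exclude list for a web server.
cat > /opt/autobackup/webserver <<'EOF'
/var/cache
*.log
EOF
```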
In an ideal environment, we never even need to restore our encrypted backups. But when we face issues and need to retrieve our backed-up data, it's quite easy to restore. You can restore any backup file using the following commands:
openssl aes-256-cbc -d -salt -in BACKUP.tar.aes -out BACKUP.restored.tar
mkdir backup/
tar -xvf BACKUP.restored.tar -C backup/
Replace BACKUP.tar.aes with the filename of the backup you want to restore; BACKUP.restored.tar will be the file name of the decrypted archive. In the example above, we've already included the next steps, which are creating a target directory and extracting the restored archive into it.
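To see the restore pipeline end to end, here is a sketch that encrypts a small archive and restores it the same way. The -k flag supplies the password non-interactively; without it, openssl prompts for the password as in the command above.

```shell
# Round-trip sketch: create a tiny archive, encrypt it, then
# decrypt and extract it again.
PASSWORD="example-passphrase"   # placeholder; use your backup.sh password

echo "backup test data" > /tmp/sample.txt
tar -cf /tmp/sample.tar -C /tmp sample.txt

# Encrypt (the reverse of the restore command):
openssl aes-256-cbc -salt -k "$PASSWORD" -in /tmp/sample.tar -out /tmp/sample.tar.aes

# Decrypt and extract:
openssl aes-256-cbc -d -salt -k "$PASSWORD" -in /tmp/sample.tar.aes -out /tmp/sample.restored.tar
mkdir -p /tmp/restore
tar -xf /tmp/sample.restored.tar -C /tmp/restore

cat /tmp/restore/sample.txt
```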
Autobackup is a fully automated and quite smart backup script that handles backups for us, and the huge plus is that the data is encrypted with a password that can be nearly unlimited in length. That means, as long as you keep your password secure and it's long enough, nobody can get at your data in any reasonable amount of time. Happy Hacking!