Systems Backups

From Pumping Station: One Wiki
== Current backup situation ==
  
  
Our backup system is still in a somewhat undetermined state. However, it is somewhat better than what it was before (namely… nothing).
  
Current systems that have backups:

- Canvas: database (dump), config files. File storage is on S3, with no replication yet (but there is nothing on it)
- Bob (our Samba/LDAP): Samba database, weekly systemd log, /etc, /var/log (without journal and lastlog) and /srv
- Rt: database (dump), the full rt4 directory (that's dumb, but better than nothing), weekly systemd log
- PS1Auth: database (dump)
- Wiki: database (dump), and a single copy (not daily) of the files; not encrypted, as our wiki is public
  
=== What has to be done ===
  
- A cleaner system (we can all dream)
- A logging system that can alert when a backup fails, and so on
- Bob: we may need a proper export of the Samba database
- Wiki: we need a better way to handle the daily backup of all the files, but it is huge: 4.3 GB of many small files

== How does that work? ==

=== Everything is copied to S3 (AWS) ===
Everything is copied to S3 using this script to generate the access credentials: https://github.com/bjonnh/s3-wizard

We have policies on AWS similar to the ones in that repo: users created under that specific group are able to access the "ps1-systems-backup" bucket.

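The actual "AccessBackupsByUserNameOnly" policy lives in AWS (and the s3-wizard repo); purely as an illustration, a per-user restriction on that bucket generally has this shape (a sketch using AWS's documented ${aws:username} policy variable, not the deployed policy):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListOwnPrefixOnly",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::ps1-systems-backup",
      "Condition": {
        "StringLike": { "s3:prefix": "${aws:username}/*" }
      }
    },
    {
      "Sid": "ReadWriteOwnPrefixOnly",
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject"],
      "Resource": "arn:aws:s3:::ps1-systems-backup/${aws:username}/*"
    }
  ]
}
```

With a policy like this, each backup user can only list and touch keys under its own username prefix.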
To create a new bucket and associated credentials:

    aws_bucket_creator.py -b ps1-machine-backup -u some-backup -p ps1-s3-wizard -r us-east-2 -e -E -a -P arn:aws:iam::499897270974:policy/AccessBackupsByUserNameOnly

The use of that script is restricted to people who have an account on AWS. As this can incur charges, you will have to ask for a user account yourself. This is not meant for backing up your personal machines (but you can use your own AWS account for that, with the same scripts).

=== Every machine (ahem) is backed up daily using s3cmd ===
Ansible playbooks are being written for all the machines so that s3cmd, the systemd service (or cron.daily, depending on the Linux flavor) and the backup script are installed and run.

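The playbooks are not reproduced here; tasks of roughly this shape would cover what is described above (module names are real Ansible modules, file names and paths are assumptions):

```yaml
# Sketch only: the actual playbooks may differ.
- name: Install s3cmd
  package:
    name: s3cmd
    state: present

- name: Install the backup script
  copy:
    src: backup.sh
    dest: /usr/local/sbin/backup.sh
    mode: "0750"

- name: Install the systemd units (systemd flavors only)
  copy:
    src: "{{ item }}"
    dest: "/etc/systemd/system/{{ item }}"
  loop:
    - backup.service
    - backup.timer

- name: Enable the daily timer
  systemd:
    name: backup.timer
    enabled: yes
    state: started
```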
The backup scripts are simple bash scripts that:
- dump any database(s) into a file
- dump the config files
- dump the stored files
- dump whatever else

and everything is then encrypted with a GPG symmetric key (the passphrase is in LastPass somewhere).
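The per-machine scripts differ per service; a minimal sketch of the steps above (the "bob" prefix, file paths, and passphrase are placeholders, not the real values):

```shell
#!/bin/sh
# Minimal sketch of a nightly backup script; not the real script.
set -eu

DAY=$(date +%d)                      # day-of-month slot: overwritten ~monthly
STAMP=$(date +%Y-%m-%d)
WORK=$(mktemp -d)

# 1) dump any database(s) into a file (service-specific: mysqldump, pg_dump, ...)
echo "-- pretend database dump --" > "$WORK/db.sql"

# 2) dump the config files / stored files / whatever else
tar czf "$WORK/files.tar.gz" -C "$WORK" db.sql

# 3) encrypt with a GPG symmetric key (the real passphrase lives in LastPass)
if command -v gpg >/dev/null; then
    echo "not-the-real-passphrase" | \
        gpg --batch --pinentry-mode loopback --symmetric \
            --passphrase-fd 0 "$WORK/files.tar.gz"
fi

# 4) marker file recording when this slot was last written
touch "$WORK/last-backup-$STAMP"

# 5) upload the day's slot (bucket layout assumed)
if command -v s3cmd >/dev/null; then
    s3cmd put "$WORK/files.tar.gz"* "$WORK/last-backup-$STAMP" \
        "s3://ps1-systems-backup/bob/$DAY/"
fi

echo "backup staged in $WORK (slot $DAY)"
```

The `command -v` guards are only there so the sketch runs on a machine without gpg or s3cmd installed; the real scripts call them unconditionally.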
The files can be decrypted with:

    gpg -d nameofthefile

which will ask you for the passphrase.

The backups are saved under the day number (day of month), and each slot contains a marker file such as "last-backup-2017-11-22". That way we get, by default, a daily backup for each of the last 30 days. We could make something more clever and use snapshots and so on… Maybe one day.
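Pulling one day's slot back down should look something like this (only the bucket name comes from the setup above; the per-machine prefix and file names are assumptions):

```shell
# Sketch: fetch and decrypt one day's backup slot.
DAY=22
SRC="s3://ps1-systems-backup/bob/$DAY/"

# list what is in that slot, then fetch it
# s3cmd ls "$SRC"
# s3cmd get --recursive "$SRC" "./restore-$DAY/"

# decrypt (prompts for the passphrase from LastPass)
# gpg -d "restore-$DAY/files.tar.gz.gpg" > files.tar.gz

echo "would restore from $SRC"
```

The marker file in the slot ("last-backup-YYYY-MM-DD") tells you which actual date that day-of-month slot was last written.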

Revision as of 21:12, 22 November 2017