After the S3 account was set up, the S3 bucket was created using the /home/backup/s3-create.pl script (which uses SOAP::Amazon::S3 from CPAN).
Every night, /etc/cron.daily/backup calls /home/backup/backup, which dumps the MeatballWiki PostgreSQL database (add more databases if you need to), rotates the BibWiki logs, cleans out the mailboxes, and uses duplicity with BitBucket to incrementally back up the system. Once a month, /etc/cron.monthly/backup calls /home/backup/backup full to do a full snapshot (key frame) backup.
Edit the BACKUP-MANIFEST to include more in the backup. Currently, that file looks like:
 /usr/lib/perl5/site_perl/5.8.7
 /svn
 /home
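The script itself isn't shown on this page, so here is a minimal sketch of what /home/backup/backup might look like, assuming the S3 bucket created above is the duplicity target; the bucket URL, database name, and dump filename are hypothetical, and the log/mailbox steps are elided:

 #!/bin/sh
 # Hypothetical sketch of /home/backup/backup; names and paths are assumptions.
 # Called nightly by /etc/cron.daily/backup, and monthly as "backup full".
 BUCKET=s3+http://meatball-backup        # assumed bucket URL (duplicity's S3 scheme)
 MANIFEST=/home/backup/BACKUP-MANIFEST

 # Dump the wiki database under /home so the manifest picks it up.
 pg_dump -Fc meatballwiki > /home/backup/meatballwiki.pgdump

 # (log rotation and mailbox cleanup elided)

 # With no argument, duplicity makes an incremental backup against the last
 # chain; "backup full" passes "full" through to force a complete snapshot.
 duplicity $1 --include-filelist "$MANIFEST" --exclude '**' / "$BUCKET"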
To restore, use the /home/backup/restore <time interval> script; by default, it restores the most recent backup. To restore a specific file from the archive, use the /home/backup/restore-file <absolute-path> script. The restored files are placed in the /home/backup/restored directory.
You can get a full list of files in the archive using /home/backup/list-current-files.
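For example (the file path below is hypothetical, and the time interval presumably uses duplicity's interval syntax, e.g. 3D for three days):

 # Restore the archive as it was three days ago:
 /home/backup/restore 3D

 # Restore one file into /home/backup/restored:
 /home/backup/restore-file /svn/README

 # List everything in the current archive:
 /home/backup/list-current-files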
Refactored into BackupAndRestore.
Assuming [MeatballDatabaseRelations], one could use the revision column to determine what's changed: SELECT * FROM revisions WHERE revision > //previousBackUpRevision//. If previousBackUpRevision == 0, then get a full backup of the [VersionHistory]. -- JaredWilliams
Currently, the entire compressed binary pg_dump is around 25MB. Even if the backup diff algorithm cannot do anything with it, and assuming no further growth of the database, it will take 41 weeks before it costs me 15 cents USD (41 weeks of 25MB dumps is about 1GB, which is US$0.15 per month at S3's storage rate). I don't think it's worth optimizing. -- SunirShah
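For completeness, a sketch of the incremental dump JaredWilliams suggests above, assuming a revisions table with a monotonically increasing revision column as in his comment; the database name, state file, and output path are hypothetical:

 #!/bin/sh
 # Hypothetical incremental dump of the revisions table (per the suggestion above).
 STATE=/home/backup/last-revision
 PREV=$(cat "$STATE" 2>/dev/null || echo 0)   # missing state file == previousBackUpRevision of 0

 # Dump only the rows added since the last run (everything, when PREV is 0).
 psql -t -A -c "SELECT * FROM revisions WHERE revision > $PREV" meatballwiki \
     > /home/backup/revisions-since-$PREV.txt

 # Remember where we left off for the next run.
 psql -t -A -c "SELECT max(revision) FROM revisions" meatballwiki > "$STATE"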