Backup Routine

TheJinJ

Active Member
Licensed User
Longtime User
Lost some important work recently, so I revised my backup setup.

I went and bought a D-Link DNS-320L with 2 × 1 TB drives in RAID 1. It's a cheap NAS (~£40), and I flashed it with the Alt-F firmware in place of the stock one, which allows more control over the Linux-based system.
Installed rsync and created some read-only Samba shares pointing at the backup area.
Found a nice free rsync GUI, QtdSync, which schedules incremental backups to the 'server'. It uses hardlinks, so it's a great space saver :) It's running very nicely, and I'm using the same setup to back up the various PCs and laptops in my house. If a restore is needed, the various versions can just be taken from the share (the hardlink trick is sketched below).
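For anyone curious how the hardlink trick works: every snapshot looks like a complete copy of the tree, but files that haven't changed are hardlinked back to the previous snapshot, so only changed files consume new space. A minimal sketch of the technique using plain rsync's --link-dest (the paths and naming are my own placeholders, not QtdSync's internals):

```python
# Hardlink-based incremental snapshots via rsync --link-dest: unchanged
# files are hardlinked to the previous snapshot instead of re-copied.
import datetime
import os
import subprocess

SOURCE = "/home/me/projects/"                 # trailing slash: copy contents
BACKUP_ROOT = "/mnt/nas/backups"              # share on the NAS
LATEST = os.path.join(BACKUP_ROOT, "latest")  # symlink to newest snapshot

def make_snapshot():
    stamp = datetime.datetime.now().strftime("%Y-%m-%d_%H%M%S")
    dest = os.path.join(BACKUP_ROOT, stamp)
    cmd = ["rsync", "-a", "--delete"]
    if os.path.exists(LATEST):
        cmd.append("--link-dest=" + LATEST)  # hardlink unchanged files
    cmd += [SOURCE, dest]
    subprocess.run(cmd, check=True)
    if os.path.islink(LATEST):
        os.remove(LATEST)
    os.symlink(dest, LATEST)  # repoint 'latest' at the new snapshot

if __name__ == "__main__":
    make_snapshot()
```

Every snapshot directory is then directly browsable on the read-only share, which is what makes restores so painless.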

Another interesting find was AutoVer, a real-time backup and versioning system. I didn't use it in the end because it creates a copy every time the watched files are saved, which ultimately resulted in a LOT of copies - not so good with large files. Good app, though.

I'm also now using SVN on the NAS, so I can work from various laptops and desktops without constantly moving files around on USB sticks.
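In case it helps anyone, the day-to-day flow with a repo on the NAS is just update-before-you-start, commit-when-you-stop, from whichever machine you happen to be on. A tiny sketch, assuming svnserve is running on the NAS (the URL and paths are my own placeholders):

```python
# Sync a working copy against an SVN repo hosted on the NAS.
# Repo URL and local path are illustrative placeholders.
import subprocess

REPO = "svn://nas.local/projects"  # served by svnserve on the NAS
WC = "/home/me/projects"           # local working copy on this machine

def start_of_day():
    subprocess.run(["svn", "update", WC], check=True)

def end_of_day(message):
    subprocess.run(["svn", "commit", "-m", message, WC], check=True)
```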

Thinking of adding cloud storage in the future, too...

What setup do you have in place, if any?
 

WAZUMBi

Well-Known Member
Licensed User
Longtime User
For development purposes: two flash drives.
One kept in my safe and one on my key chain.
I back up every day I do work.

I back up everything, including my personal stuff, about weekly to an external hard drive and keep that in my safe as well.

That D-Link NAS looks pretty cool though... :cool:
 

udg

Expert
Licensed User
Longtime User
Years ago, well before SSDs and the cloud were available, my pathological attitude towards data safety led to:
1. APC UPS with batteries sized for 4 hours of operation
2. HP server with disks organized in a RAID 5 configuration
3. NAS from Buffalo for a scheduled daily backup
4. My brother to burn some DVDs once a week and store them at his home

To tell the whole truth... I had a "twin" server on the same LAN where the most critical data was "silently" recorded at the same time it went to the official server.
It was disguised as an old box gathering dust in a corner, so nobody ever paid any attention to it... Luckily, I never needed to recover data from it. :)
 

Troberg

Well-Known Member
Licensed User
Longtime User
I store the master on a NAS with two-disk redundancy (i.e. two disks can fail without losing any data), and copies on two different disks in two different computers.

In addition to that, much of my source, for various reasons (testing, developing on a laptop when away, and so on), exists on other computers as well.

No true off-site backups, though. I should see to that.
 

JakeBullet70

Well-Known Member
Licensed User
Longtime User
Git branches in the cloud, plus an automated nightly backup of my complete dev drive to a local MS Server 2008 box with RAID 5.
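If anyone wants to copy the 'branches in the cloud' part of this, the core of it is just mirroring every local branch and tag to an off-site remote. A minimal sketch, assuming a remote named offsite has already been added to each repo (the repo paths and remote name are my own placeholders):

```python
# Push all branches and tags of each local repo to an off-site remote.
# Repo paths and the remote name are illustrative placeholders.
import subprocess

REPOS = ["/home/me/dev/project-a", "/home/me/dev/project-b"]
REMOTE = "offsite"  # added beforehand with: git remote add offsite <url>

for repo in REPOS:
    # --all pushes every local branch; --tags pushes every tag.
    subprocess.run(["git", "-C", repo, "push", "--all", REMOTE], check=True)
    subprocess.run(["git", "-C", repo, "push", "--tags", REMOTE], check=True)
```

Drop that in a scheduled task and the nightly part takes care of itself.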
 

JakeBullet70

Well-Known Member
Licensed User
Longtime User
I just want to point out that RAID is not a backup solution and has nothing to do with backups. It improves fault tolerance, reliability and availability, but it won't give you a recoverable backup of an older version of your data.

Great point! That's what I really like about Git.
Also just installed AutoVer and pointed it at the RAID 5. Will report back later on how it's working.
 

eps

Expert
Licensed User
Longtime User
I just take a copy of my whole drive now and again - not ideal, I know, but all of the code (not B4A) I work with is on the clients' systems as well, in Dev, Test and Production. I just let Time Machine nag me about taking a backup.
 

thedesolatesoul

Expert
Licensed User
Longtime User
I use Auslogics BitReplica (http://www.bitreplica.com/) and it is very good. I make a full backup of my developer stuff every week, and a differential backup every day (the pattern is sketched below). I upload it to my Dropbox, usually after I hear a horror story, so not frequently enough.
I only image my system drive after a fresh install with all my apps installed. I don't see why I'd need to re-image the whole drive regularly: if I do get a virus, I just go back to the original image, install updates and re-image. All my data and programs are on separate drives.
I have nothing that would save me from drive failure, backup failure (which happens when Eclipse locks a file) or network failure.
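For anyone unfamiliar with the weekly-full/daily-differential pattern: each differential copies everything modified since the last full backup, so a restore only ever needs the full set plus the newest differential. A bare-bones sketch of the idea (paths and the marker file are my own placeholders, nothing to do with BitReplica's internals):

```python
# Weekly full / daily differential: the differential copies every file
# modified since the last FULL backup, recorded in a marker file.
import os
import shutil
import time

SOURCE = "/home/me/dev"
DIFF_DEST = "/mnt/backup/dev-diff"
MARKER = "/mnt/backup/last_full.txt"  # time of the last full backup

def full_backup():
    stamp = time.strftime("%Y-%m-%d")
    shutil.copytree(SOURCE, "/mnt/backup/dev-full-" + stamp)
    with open(MARKER, "w") as f:
        f.write(str(time.time()))

def differential_backup():
    with open(MARKER) as f:
        last_full = float(f.read().strip())
    for root, _dirs, files in os.walk(SOURCE):
        for name in files:
            src = os.path.join(root, name)
            if os.path.getmtime(src) > last_full:
                rel = os.path.relpath(src, SOURCE)
                dst = os.path.join(DIFF_DEST, rel)
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.copy2(src, dst)  # copies data and timestamps
```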

I might take some ideas from this thread.
 

TheJinJ

Active Member
Licensed User
Longtime User
I just want to point out that RAID is not a backup solution and has nothing to do with backups. It improves fault tolerance, reliability and availability, but it won't give you a recoverable backup of an older version of your data.

Important point :)
As I don't have offsite backup yet, it gives me some peace of mind that if a drive fails I can still recover. The chance of the PC that's being backed up and both RAID drives failing at the same time seems pretty slim!... There's more chance my house will burn down and I'll lose the lot...
 

TheJinJ

Active Member
Licensed User
Longtime User
Great point! That's what I really like about Git.
Also just installed AutoVer and pointed it at the RAID 5. Will report back later on how it's working.

If you're working with large files, just watch the disk space with AutoVer: it doesn't check file contents for changes, and it treats locked files as changed even when they haven't been.
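One way a versioning tool can avoid that kind of false positive is content-based change detection: hash the file and only store a new version when the hash actually differs. A minimal sketch of that idea (this is the general technique, not AutoVer's internals; all names are my own placeholders):

```python
# Only copy a file into the version store when its contents actually
# changed, so locked-but-unmodified files don't pile up as new copies.
import hashlib
import shutil

def file_hash(path, chunk_size=1 << 20):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def backup_if_changed(src, dest, known_hashes):
    """known_hashes maps source path -> hash of the last stored version."""
    digest = file_hash(src)
    if known_hashes.get(src) == digest:
        return False  # contents unchanged: skip the copy entirely
    shutil.copy2(src, dest)  # copies data and timestamps
    known_hashes[src] = digest
    return True
```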
 

JakeBullet70

Well-Known Member
Licensed User
Longtime User
Now this looks cool! Duplicati

Duplicati is a backup client that securely stores encrypted, incremental, compressed backups on cloud storage services and remote file servers. It works with Amazon S3, Windows Live SkyDrive, Google Drive (Google Docs), Rackspace Cloud Files or WebDAV, SSH, FTP (and many more). Duplicati is open source and free.

Just installed it and it's working fine.
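For the curious, the general shape of what a tool like this automates - minus the encryption and the incremental smarts - is just 'compress, then push to a remote'. A stdlib-only sketch using the FTP option from the list above (host and credentials are obviously made up):

```python
# Compress a folder and push the archive to an FTP server: the bare
# skeleton of a remote backup, without encryption or incrementals.
import datetime
import tarfile
from ftplib import FTP

SOURCE = "/home/me/dev"
HOST, USER, PASSWORD = "backup.example.com", "me", "secret"  # placeholders

archive = "dev-%s.tar.gz" % datetime.date.today().isoformat()
with tarfile.open(archive, "w:gz") as tar:
    tar.add(SOURCE, arcname="dev")

with FTP(HOST) as ftp:
    ftp.login(USER, PASSWORD)
    with open(archive, "rb") as f:
        ftp.storbinary("STOR " + archive, f)
```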
 

Troberg

Well-Known Member
Licensed User
Longtime User
I just want to point out that RAID is not a backup solution and has nothing to do with backups. It improves fault tolerance, reliability and availability, but it won't give you a recoverable backup of an older version of your data.

True. It will also not help you if you accidentally delete your entire source directory (say, your cat walks over the keyboard and hits the wrong keys).
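To make that concrete: RAID parity will happily rebuild a dead disk, but it replicates deletions and overwrites just as faithfully, so there is never an older version to go back to. A toy demo of RAID 5-style XOR parity (pure illustration, obviously not how a real controller is implemented):

```python
# RAID 5 in miniature: the parity block is the XOR of the data blocks,
# so any single lost block can be rebuilt from the survivors.
def xor_blocks(blocks):
    out = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            out[i] ^= byte
    return bytes(out)

disk1, disk2, disk3 = b"AAAA", b"BBBB", b"CCCC"
parity = xor_blocks([disk1, disk2, disk3])

# Disk 2 dies: rebuild it from the remaining disks plus parity.
rebuilt = xor_blocks([disk1, disk3, parity])
assert rebuilt == disk2

# But overwrite disk2 (the cat on the keyboard) and the parity follows
# along instantly - the old contents are gone for good.
disk2 = b"XXXX"
parity = xor_blocks([disk1, disk2, disk3])
```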

So, backups on other disks are necessary, and they should not be taken too often: you need time to notice a problem before it gets replicated to the backups.

For disaster recovery from, say, a fire, you need off-site backups. I'll probably do something like burning a bunch of DVDs or copying to a portable disk every now and then and keeping them at work or at my mother's house. Anything that can take out two sites 5 km from each other is probably on a scale that makes losing my source a small problem...

As for getting old versions of files, I've never had that need. If I feel that I'm making some risky change, I make a copy first, and that's enough for me.

Edit: By the way, if someone is shopping for a NAS, I can wholeheartedly recommend Synology. I have their DS-2413+ and am extremely happy with it. Solid quality in every detail. When that's full, I'll go with Synology for my next NAS as well. (I promise, they didn't pay me to say that; they really are that good!)
 