Backup Routine

Discussion in 'Chit Chat' started by TheJinJ, Feb 13, 2015.

  1. TheJinJ

    TheJinJ Active Member Licensed User

    Lost some important work recently, so I revised my current setup.

    I went and bought a D-Link DNS-320L plus 2 x 1TB drives in RAID 1 - a cheap NAS at ~£40. I flashed it with the Alt-F firmware in place of the stock one, which allows more control over the Linux-based system.
    Installed rsync and created some read-only Samba shares to the backup area.
    Found a nice free rsync GUI, QtdSync, which schedules incremental backups to the 'server' - it uses hardlinks, so it's a great space saver :) Running very nicely, and I'm using the same setup to back up the various PCs and laptops in my house. If a restore is needed, the various versions can be taken from the share.

    Another interesting find was AutoVer, a real-time backup and versioning system. I never used it in the end because it creates a copy every time the watched files are saved, which ultimately resulted in a LOT of copies - not so good with large files. Good app, though.

    Also now using SVN on the NAS, so I can work from various laptops and desktops without constantly moving files around on USB sticks.
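    The SVN-on-NAS workflow boils down to one repository with a working copy per machine. A minimal local sketch under made-up paths - in the real setup the repository would live on the NAS and be reached over svn:// or a mounted share, and the file name here is invented:

```shell
#!/bin/sh
# One repo, two working copies (think: desktop and laptop), all in temp dirs.
set -e
REPO=$(mktemp -d)/repo
svnadmin create "$REPO"

WC1=$(mktemp -d)/wc1
WC2=$(mktemp -d)/wc2
svn checkout -q "file://$REPO" "$WC1"
svn checkout -q "file://$REPO" "$WC2"

# Commit a file from the first machine.
echo "v1" > "$WC1/Main.bas"
svn add -q "$WC1/Main.bas"
svn commit -q -m "first commit from the desktop" "$WC1"

# The second machine picks the change up with a plain update.
svn update -q "$WC2"
cat "$WC2/Main.bas"
```

    No USB sticks involved: every machine just updates before working and commits after.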

    Thinking of also adding cloud storage in the future....

    What setup do you have in place, if any?
    KMatle likes this.
  2. WAZUMBi

    WAZUMBi Well-Known Member Licensed User

    For development purposes - Two flash drives.
    One kept in my safe and one on my key chain.
    I back up every day I do work.

    I back up everything, including my personal stuff, about weekly to an external hard drive and keep that in my safe as well.

    That Dlink NAS looks pretty cool though...:cool:
  3. KMatle

    KMatle Expert Licensed User

    Once a month I zip my "b4a-Folder" and upload it to Google Drive. Additionally, I make a disk image from time to time.
  4. udg

    udg Expert Licensed User

    Years ago, well before SSDs and the cloud were available, my pathological attitude towards data safety led to:
    1. APC UPS with batteries sized for 4 hours of operation
    2. HP server with disks organized in a RAID 5 configuration
    3. NAS from Buffalo for a scheduled daily backup
    4. My brother to burn some DVDs once a week and store them at his home

    To tell the whole truth... I had a "twin" server on that same LAN where the most critical data were "silently" recorded at the same time they went to the official server.
    It was disguised as an old box gathering dust in a corner, so nobody ever paid any attention to it... Luckily I never needed to recover data from it. :)
  5. Troberg

    Troberg Well-Known Member Licensed User

    I store the master on a NAS with two-disk redundancy (i.e. two disks can fail without losing any data), and copies on two different disks in two different computers.

    In addition to that, much of my source, for various reasons (testing, developing on a laptop when away and so on), exists on other computers as well.

    No true off-site backups, though. I should see to that.
  6. JakeBullet70

    JakeBullet70 Well-Known Member Licensed User

    Git branches in the cloud. Automated nightly backup of my complete dev drive to a local MS Server 2008 box with RAID 5.
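    The "branches in the cloud" half of that can be sketched locally: a bare repository stands in for the hosted remote, and the nightly job commits and pushes whatever changed. All paths, the branch name, user details and the commit message below are invented for the example (the dev-drive mirror to the RAID-5 server would be a separate rsync/robocopy job):

```shell
#!/bin/sh
# Nightly "commit everything and push" job against a stand-in remote.
set -e
REMOTE=$(mktemp -d)/remote.git
git init -q --bare "$REMOTE"

WORK=$(mktemp -d)/dev
git init -q "$WORK"
cd "$WORK"
git remote add origin "$REMOTE"

# Snapshot whatever the day's work left behind.
echo "day 1 work" > notes.txt
git add -A
git -c user.email=dev@example.com -c user.name=dev \
    commit -q -m "nightly snapshot $(date +%F)"

# Push to the remote; the snapshot now survives a dead dev drive.
git push -q origin HEAD:main
```

    In a cron or Task Scheduler entry this runs unattended; the hosted remote then holds every nightly snapshot as history, which is exactly what RAID alone can't give you.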
  7. thedesolatesoul

    thedesolatesoul Expert Licensed User

    I just want to point out that RAID is not a backup solution and has nothing to do with backups. It improves fault tolerance, reliability and availability. It won't give you a recoverable backup of an older version of the data you need.
    WAZUMBi and udg like this.
  8. JakeBullet70

    JakeBullet70 Well-Known Member Licensed User

    Great point! That's what I really like about Git.
    Also just installed AutoVer and pointed it at the RAID 5. Will report back later on how it's working.
    thedesolatesoul likes this.
  9. eps

    eps Well-Known Member Licensed User

    I just take a copy of my whole drive now and again - not ideal, I know, but all of the code (not B4A) I work with is on the clients' systems as well, in Dev, Test and Production. I just let Time Machine nag me about taking a backup.
  10. thedesolatesoul

    thedesolatesoul Expert Licensed User

    I use Auslogics BitReplica (it is very good). I make a full backup of developer stuff every week, and a differential backup every day. I upload it to my Dropbox, usually after I hear a horror story, so not frequently enough.
    I only image my system drive after a fresh install with all my apps installed. I don't see why I need to re-image the whole drive. If I do get a virus, I just go back to the original, install updates, and re-image. All my data and programs are on separate drives.
    I have nothing that would save me from drive failure, backup failure (which happens when Eclipse locks a file) or network failure.

    I might take some ideas from this thread.
  11. thedesolatesoul

    thedesolatesoul Expert Licensed User

    What service do you use for GIT? Or do you have your own server?
  12. TheJinJ

    TheJinJ Active Member Licensed User

    Important point :)
    As I don't have offsite backup as yet, it gives me some peace of mind that if a drive fails I can still recover. The chances of the PC being backed up and both RAID drives all failing seem pretty slim!... more chance my house will burn down and I'll lose the lot...
    thedesolatesoul likes this.
  13. TheJinJ

    TheJinJ Active Member Licensed User

    If you're working with large files, just watch the disk space with AutoVer; it doesn't check whether contents actually changed and treats locked files as changed even when they haven't been.
    JakeBullet70 and thedesolatesoul like this.
  14. JakeBullet70

    JakeBullet70 Well-Known Member Licensed User

    thedesolatesoul likes this.
  15. JakeBullet70

    JakeBullet70 Well-Known Member Licensed User

    Now this looks cool! Duplicati

    Just installed it and it's working fine.
    TheJinJ likes this.
  16. JakeBullet70

    JakeBullet70 Well-Known Member Licensed User

    AutoVer uninstalled. :( I went with Duplicati.
  17. TheJinJ

    TheJinJ Active Member Licensed User

    That does look cool :) It even backs up open files, etc...
  18. Troberg

    Troberg Well-Known Member Licensed User

    True. It will also not help you if you delete your entire source directory by mistake (say, your cat walks over the keyboard and hits the wrong keys).

    So, backups on other disks are necessary, and they should not be done too often. You need to notice the problem before it's replicated to the backups.

    For disaster recovery, say a fire, you need off-site backups. I'll probably do something like burning a bunch of DVDs or copying to a portable disk every now and then, and keep them at work or at my mother's house. Something that can take out two sites 5 km from each other is probably on a scale that makes losing my source a small problem...

    As for getting old versions of files, I've never had that need. If I feel that I'm doing some risky change, I make a copy first, and that's enough for me.

    Edit: By the way, if someone is shopping for a NAS, I can wholeheartedly recommend Synology. I have their DS-2413+, and am extremely happy with it. Solid quality in every detail. When that's full, I'll go with Synology on my next NAS as well. (I promise, they didn't pay me to say that; they are that good!)