The need for file archives
Did you ever edit a file, save it, and then say "oh S**$"? Were you ever working on your code at 3 a.m. only to find a zillion errors you did not expect? Did you ever run "rm -fr *" because you worked an 80-hour week and lived on espresso?
As a developer, there were countless times when I wished I had saved a copy of a script, .h, .c, or .java file before a fat finger hit dd or 1000dd (and, of course, :wq to save in vi). svn, git, or cvs would not have saved me, since I had not finished testing the code enough to check it in. To solve this problem, I continuously back up my sandbox to a remote server, and I can revert to previous states of my code at any time.
You are working on some super secret project, and you don't want Bob, the system administrator at a remote backup location, to take your data and sell it to Joe the competitor, do you?
So, as a loyal and trustworthy sysadmin, you will solve this problem using your own private cloud over which you have full control.
Creating unlimited archives
EDpCloud can create an unlimited number of archives on a remote site for safe backup. All you have to do is (a) configure real-time replication, or (b) create a schedule to back up data continuously, and then configure the receiver to create a new archive file each time the content of a file changes. It will save your employer money and save your reputation as well as your sanity.
For example, if file foo.c changes on the local machine, you will have foo.c and its older archive foo.c.$md5, where $md5 is the MD5 signature of foo.c before it changed.
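To make the naming scheme concrete, here is a minimal Python sketch of how such an archive name can be derived. This is an illustration of the foo.c.$md5 convention described above, not EDpCloud's actual implementation; the function name `archive_name` is my own.

```python
import hashlib

def archive_name(path: str) -> str:
    """Return an archive file name following the foo.c.$md5 scheme:
    the original name plus the MD5 digest of the file's current
    contents (i.e., the contents *before* the next change)."""
    with open(path, "rb") as f:
        digest = hashlib.md5(f.read()).hexdigest()
    return f"{path}.{digest}"
```

So if foo.c currently contains the bytes "hello", the archived copy kept before the next change would be named foo.c.5d41402abc4b2a76b9719d911017c592.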
The archive parameter controls the ability to go back and restore various versions of a file. (See the eddist.cfg man page on UNIX, or eddist.cfg.html on Windows and all other operating systems.)
<?xml version="1.0" encoding="UTF-8"?>
<config name="enduradata" password="Addoud4d4ch1n1gh4T4s4" workers="4">
  <link name="u" password="foo">
    <sender hostname="localhost" alias="*"
            storepath="/home/backup" archive="1" archivedir="/home/archives" />
  </link>
</config>
A good practice is to set up a link (replication set) for your sandbox or your entire working directory, configure file replication for it, set the archive parameter specifically for that directory, and set up a schedule to back up your files. Make sure you set up an exclusion regex to skip .o files.