Archiving files on a remote site when file content changes

aRoot DEV TIPS, Engineer2Engineer, file replication software

The need for file archives

Did you ever edit a file, save it, and then say "oh S**$"? Were you ever working on your code at 3 a.m., only to find a zillion errors you did not expect? Did you ever type "# \rm -fr *" because you worked an 80-hour week and lived on espresso?

As a developer, I have had countless moments when I wished I had saved a copy of a script, .h, .c, or .java file before a fat finger hit dd or 1000dd in vi (followed, of course, by :wq to save). svn, git, or cvs would not have saved me, since I had not finished testing the code enough to check it in. To solve this problem, I continuously back up my sandbox to a remote server, so I can revert to a previous state of my code at any time.

You are working on some super secret project, and you don't want Bob, the system administrator at a remote backup location, to take your data and sell it to Joe the competitor, do you?

So you, as a loyal and trustworthy sysadmin, will solve this problem using your own private cloud, over which you have full control.

Creating unlimited archives

EDpCloud can create an unlimited number of archives on a remote site for safe backup. All you have to do is (a) configure real-time replication or (b) create a schedule to back up data continuously, and configure the receiver to create a new archive file each time the content of a file changes. It will save your employer money, and it will save your reputation as well as your sanity.

For example, if the file foo.c changes on the local machine, the remote site will keep foo.c and its older archive foo.c.$md5, where $md5 is the md5 signature of foo.c before it changed.
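To make the naming scheme concrete, here is a minimal Python sketch of how such an archive name can be derived. The function name is hypothetical for illustration; it only mirrors the foo.c.$md5 convention described above, not EDpCloud's internal code:

```python
import hashlib

def archive_name(path, old_content):
    """Append the md5 digest of the file's previous content to its name,
    mirroring the foo.c.$md5 archive naming scheme."""
    digest = hashlib.md5(old_content).hexdigest()
    return f"{path}.{digest}"

# The archive name records what foo.c looked like *before* the change.
old_content = b"int main(void) { return 0; }\n"
print(archive_name("foo.c", old_content))  # foo.c followed by 32 hex digits
```

Because the suffix is a digest of the content, two different prior versions of the same file always archive under two different names, so nothing gets overwritten.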

Example configuration

The archive parameter controls the ability to go back and restore various versions of a file. (See the eddist.cfg man page on UNIX, or eddist.cfg.html on Windows and all other operating systems.)

<?xml version="1.0" encoding="UTF-8"?>

<config name="enduradata" password="Addoud4d4ch1n1gh4T4s4" workers="4">
  <link name="u" password="foo">
    <sender hostname="localhost" alias="*"
    />
    <receiver hostname="192.168.200.241"
        storepath="/home/backup" archive="1" archivedir="/home/archives"
    />
  </link>
</config>
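With archive="1" set, prior versions accumulate in the archivedir (here /home/archives) under md5-suffixed names. A quick way to locate the most recent archived version of a file is to sort its archives by modification time. This is a hedged sketch assuming that directory layout; the demo below runs against a throwaway directory standing in for /home/archives:

```python
import glob
import os
import tempfile

def latest_archive(archive_dir, filename):
    """Return the most recently modified archived version of `filename`
    (stored as filename.<md5>), or None if no archives exist."""
    candidates = glob.glob(os.path.join(archive_dir, filename + ".*"))
    return max(candidates, key=os.path.getmtime, default=None)

# Demo: fake two archived versions of foo.c with distinct mtimes.
d = tempfile.mkdtemp()
for i, fake_md5 in enumerate(("aaa", "bbb")):
    p = os.path.join(d, f"foo.c.{fake_md5}")
    open(p, "w").close()
    os.utime(p, (1000 + i, 1000 + i))  # force deterministic mtimes

print(latest_archive(d, "foo.c"))  # path ending in foo.c.bbb
```

To roll back, you would simply copy the chosen archive over the working file.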


Best practice

A good practice is to set up a link (replication set) for your sandbox or your entire working directory, configure file replication for it, enable the archive parameter specifically for that directory, and set up a schedule to back up your files. Make sure you also set up an exclusion regex to skip .o files.
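The exclusion idea is just pattern matching on filenames. Here is a minimal Python sketch of such a filter; the pattern and function are illustrative only, and EDpCloud's own regex syntax for its configuration may differ:

```python
import re

# Hypothetical exclusion pattern: skip compiled object files and
# editor swap files when deciding what to replicate.
EXCLUDE = re.compile(r"(\.o|\.swp)$")

def should_replicate(path):
    """Replicate everything except paths matching the exclusion pattern."""
    return EXCLUDE.search(path) is None

print(should_replicate("foo.c"))  # True
print(should_replicate("foo.o"))  # False
```

Excluding build artifacts keeps the archive directory from filling up with versions of files you can regenerate with a single make.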

–a elhaddi


Archiving files on a remote site when file content changes was last modified: April 2nd, 2018 by aRoot