Super Spread Sheet S³

Or little computing tricks and hacks

Backing up git repositories and cron

All of our important files and ongoing work are managed through git repositories, which makes sharing and updating among us much easier. Ideally, one has a bare (or central) repository, that is, a git repository without a working tree, into which users can «push» their changes. If this bare repository is treated as central for all users, depositing (pushing) to it and retrieving (pulling) from it guarantees that changes propagate to everybody and, very importantly, keeps up-to-date copies in different locations.

But sometimes one forgets to commit the latest changes or to pull on a regular basis. (For more information on git, bare repositories, commit, push and pull, go here or here.) For this purpose I created some scripts that run automatically using cron. The workflow we want is:

  1. pull from the bare repository
  2. commit your changes locally
  3. push your changes to the bare repository

Depending on the nature of your work or file structure, you might want to pull and push only on selected working trees (really, directories). I created a script for each of these two functions; both read from a list of selected directories.
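The scripts themselves are specific to my setup, but a minimal sketch of the idea, assuming a plain-text list of working trees in ~/git_repos.txt (one absolute path per line; the file name and commit message are placeholders of my own), could look like this:

#!/bin/bash
# pull script: fetch and merge the latest changes into each selected working tree
while read -r dir; do
  cd "$dir" || continue
  git pull
done < "$HOME/git_repos.txt"

#!/bin/bash
# commit-and-push script: commit any local changes in each working tree
# and push them to the bare repository
while read -r dir; do
  cd "$dir" || continue
  git add -A
  git commit -m "automatic nightly commit" || true  # "nothing to commit" is not an error here
  git push
done < "$HOME/git_repos.txt"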

After having tested the scripts, I set up the cron jobs to run in the middle of the night, so as to ensure that I am not working on a tree at that moment. This is done by running the command crontab -e and adding, at the end of the file, the command to run, following the syntax described in that file:

minute hour dayOfMonth month dayOfWeek command

To run the command every day, replace dayOfMonth, month and dayOfWeek with an asterisk. For example:


2 2 * * * /full/path/to/commit_and_pull_script 2> /full/path/to/error_log_file

will run a script that commits the changes and then pulls from the repository at 2:02 in the morning, logging any errors in error_log_file. If that file does not exist and there is an error, you might get the following in the syslog file:


(CRON) info (No MTA installed, discarding output)

This means that cron tried to mail the error, but there is no Mail Transfer Agent set up in the system, so cron could not send the email and the error log is lost.

Do not run the command crontab alone to view or change your cron table, as it will wipe it out, with no backup! The actual file is stored at /var/spool/cron/crontabs/username, so as a precaution I keep a copy of that file in my home directory.
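The safe commands are crontab -l to list the table and crontab -e to edit it; to keep a backup copy, you can simply redirect the listing to a file, for example:

crontab -l > ~/crontab.backup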

Log files

I keep logs of the output of the scripts to know what was done and whether there were any errors. To clean up the logs periodically, I use logrotate, again as a scheduled task. I simulate what is done at the Linux system level: I create the directories var and etc locally in my home directory. Under var, I create lib, where the status file of logrotate is stored (logrotate.status, created by the process itself), and log, where all the logs from the scripts are directed. The configuration file for logrotate (logrotate.conf) is stored under etc and is created by the user. This file specifies which logs are to be rotated and how often. The format is simple and you can find examples using man.
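For reference, a minimal logrotate.conf along these lines might contain something like the following (the path and rotation settings are illustrative, not the ones from my actual setup):

# rotate the script logs under ~/var/log weekly, keeping four old copies
/home/username/var/log/*.log {
  weekly
  rotate 4
  compress
  missingok
  notifempty
}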

To schedule the job:

1 1 * * * /usr/sbin/logrotate -s /full/path/logrotate.status /full/path/logrotate.conf > /full/path/logrotate.log 2> /full/path/logrotate.err

This will run logrotate at 1:01 every morning, keeping its state in logrotate.status, using the configuration in logrotate.conf, saving a log in logrotate.log and sending errors to logrotate.err. Make sure that all of the directories exist prior to running the commands.
