
Misadventures with Gitolite, Part 2: Configuring Hooks and S3 Uploading

I've been trying to set up a GitHub Pages-like workflow that lets me update my blog with a git push. This is the second post in a two-part series on how I got there. You may want to read the first post.

So. At this point, I presume you have Gitolite installed. Hopefully you're using version 3, because this won't work with any previous versions. In this post I'm going to walk you through how to configure hooks so that you can have them under version control in the gitolite-admin repository.

Note: for the next little bit, we're going to follow along with Zane Ashby's excellent blog post that got me started with this, and then the original content will start again with the S3 uploading stuff.

There are a few server-side changes we need to make. First, make the following changes to the .gitolite.rc file:

  • Change the LOCAL_CODE variable to "$ENV{HOME}/.gitolite/local-code" (this allows us to run custom code in our hooks)
  • Change the GIT_CONFIG_KEYS variable to ".*" (this allows us to specify which hooks we want to run, as well as some S3 deployment settings, in the gitolite.conf admin file)
Save the file and run gitolite setup from the command line to update Gitolite's config settings.
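For reference, both settings live in the %RC hash near the top of ~/.gitolite.rc. After editing, that section looks roughly like this (surrounding keys omitted; the exact layout varies a bit between Gitolite versions):

```perl
%RC = (
    # ...
    # where Gitolite looks for hook code managed via the admin repo
    LOCAL_CODE        =>  "$ENV{HOME}/.gitolite/local-code",
    # which `config` keys repos may set in gitolite.conf (".*" allows all)
    GIT_CONFIG_KEYS   =>  '.*',
    # ...
);
```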

Next, create the following directory structure in your gitolite-admin repository: $GITOLITE_ADMIN_ROOT/local-code/hooks/common/hooks.d
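In other words, from your local clone of the admin repo:

```shell
# Run inside your local clone of the gitolite-admin repository.
# GITOLITE_ADMIN_ROOT is a placeholder for wherever that clone lives.
GITOLITE_ADMIN_ROOT="${GITOLITE_ADMIN_ROOT:-$HOME/gitolite-admin}"
mkdir -p "$GITOLITE_ADMIN_ROOT/local-code/hooks/common/hooks.d"
```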

Next, copy this script to $GITOLITE_ADMIN_ROOT/local-code/hooks/common/post-receive, and chmod +x it. This is a modified version of Zane's script that runs hooks on all branches, not just master. You can choose which hooks run by setting the hooks.run config key (a space-delimited string of hook names) in the gitolite.conf of your gitolite-admin repository, or by committing a .hooks file containing a single line in the same format.
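I won't reproduce the linked script here, but a minimal post-receive dispatcher in the same spirit looks something like the following. This is a hypothetical sketch, not Zane's exact code; it assumes the hooks.d layout created above:

```shell
#!/bin/sh
# Sketch of a post-receive dispatcher (hypothetical, simplified).
# Runs every hook named in the repo's hooks.run git-config key,
# forwarding the stdin that git gives post-receive.
run_hooks() {
    # optional first arg overrides the hooks.d directory (useful for testing)
    hooksd="${1:-$HOME/.gitolite/local-code/hooks/common/hooks.d}"
    hooks=$(git config hooks.run 2>/dev/null)   # space-delimited hook names
    [ -n "$hooks" ] || return 0                 # nothing configured: no-op
    input=$(cat)                                # the "oldrev newrev refname" lines
    for hook in $hooks; do
        # only run hooks that exist and are executable
        [ -x "$hooksd/$hook" ] && printf '%s\n' "$input" | "$hooksd/$hook"
    done
}
run_hooks "$@"
```

Because the dispatcher captures stdin once and re-feeds it to each hook, every hook sees the full list of updated refs, which is what lets individual hooks decide which branches they care about.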

Original content starts up again here.

Now, here's the S3 deploy script. It requires the AWS command-line tools from Tim Kay (Amazon has apparently released newer tools, but it's not clear they support S3), and you have to put your access_key_id and secret_access_key in the .awssecret file; see the instructions on his website. Put this script in $GITOLITE_ADMIN_ROOT/local-code/hooks/common/hooks.d/jekyll-s3-deploy, and chmod +x it. (Disclaimer: this is a work in progress and needs some improvement, but it works OK. I am not the best at shell scripting.)
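To give a sense of the shape of such a hook, here's a hypothetical sketch (not my actual script). It assumes Tim Kay's aws tool and jekyll are on the PATH, and reads the s3-deploy.* config keys described below:

```shell
#!/bin/sh
# Hypothetical sketch of a jekyll-s3-deploy hook (not the real script).
# Reads the same stdin a post-receive hook gets, and only deploys when
# the configured branch was pushed.
deploy() {
    branch=$(git config s3-deploy.branch 2>/dev/null)
    bucket=$(git config s3-deploy.bucket-name 2>/dev/null)
    [ -n "$branch" ] && [ -n "$bucket" ] || return 0
    while read -r oldrev newrev refname; do
        # skip pushes to any other branch
        [ "$refname" = "refs/heads/$branch" ] || continue
        tmp=$(mktemp -d)
        git archive "$branch" | tar -x -C "$tmp"    # clean export of the tree
        jekyll build -s "$tmp" -d "$tmp/_site"      # older jekyll used `jekyll <src> <dst>`
        # upload everything; note this re-uploads all files on every push
        (cd "$tmp/_site" && find . -type f | sed 's|^\./||' | \
            while read -r f; do aws put "$bucket/$f" "$f"; done)
        rm -rf "$tmp"
    done
}
deploy
```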

Now you need to set the hooks.run, s3-deploy.branch, and s3-deploy.bucket-name gitolite admin config keys, which tell gitolite which hooks to run, which git branch to deploy from, and which S3 bucket to push to, respectively. Here's the full config for my blog:

repo blog.drewinglis.com
    RW+     =   drew
    config hooks.run = jekyll-s3-deploy
    config s3-deploy.branch = master
    config s3-deploy.bucket-name = blog.drewinglis.com

Now, each time I push to master on my remote server, my blog is deployed to S3! My main gripe with the current setup is that it uploads every file on every push. It would be nice to upload only the files that changed, but my blog is so small that it isn't costing me much, so I haven't gotten around to it. Anyway, I hope this has been useful. The pattern is pretty flexible, so it's easy to add your own hooks to do different things. I'll post more soon.


If you've read this far, you should probably follow me on Twitter.


Drew writes code for fun and (sometimes) profit. He's currently studying Computer Science at Carnegie Mellon University. He has previously worked at Facebook, Amazon, and a startup called Intersect.