Steve Jobs

Like Marco Arment, I’m not qualified to eulogize Steve Jobs, but I owe a lot to him, so I need to say something.

My first computer was an Apple //c. 1985. I spent a lot of time on that computer. A lot. Probably an unhealthy amount.

I was a nerdy kid to begin with, and I instantly fell in love with it. I spent endless hours on that computer. Playing games, writing programs in BASIC, and generally just exploring the new world that it opened up for me. Those hours spent in front of the computer paid off. I went on, years later, to write code professionally. It’s not an exaggeration to say that owning that Apple //c shaped who I became.

I’ve admired Steve Jobs since I was old enough to know who he was. When he founded NeXT, I desperately wanted one of those beautiful (and expensive) systems. I’ve seen every movie Pixar has put out. I’ve been inspired by his business sense, his design savvy, and his drive. He accomplished more in his abbreviated lifetime than most people could accomplish in ten. His Stanford commencement speech stands as one of the most inspiring things I’ve ever heard.

My latest computer is a MacBook Air. I spend a lot of time on that computer. A lot. Probably an unhealthy amount.

Godspeed, Steve Jobs.

Backing Up Your Data With Fog

Fog, in case you haven’t heard of it, is a fantastic cloud computing library written in Ruby. It provides a unified interface to several popular cloud computing platforms (including Amazon, Rackspace, Linode, and others), making it easy to interact with them from Ruby. It currently supports four types of cloud services: storage, compute, DNS, and CDN. Fog has become very popular lately, and serves as the backbone for Chef’s cloud computing functionality, which is how I first became aware of it.

I recently used Fog to write a backup script in Ruby to automatically send encrypted database backups from a database server running at Rackspace to Amazon’s S3 storage service. Here’s how I did it.

Overview

My script runs as the second step in a process. The first step is a shell script that calls pg_dump to dump a PostgreSQL database and then encrypts the dump with GnuPG, dropping the encrypted file in a backup directory on the database server.

My Fog-based script’s job is to make sure that all of the files in the backup directory get moved to S3.

Writing Files

Fogsync (my script) looks at all of the files in that directory and makes sure that they all exist in a bucket on S3. If they don’t, it copies them up there. Additionally, it deletes old backups from S3. For this customer, we keep backups for 14 days, so all backups older than that get deleted.

Let’s look at how it works:

    require 'fog'
    require 'date'

    # Connect to S3; the same call works for the other providers Fog
    # supports, only the parameter names change
    fog = Fog::Storage.new(
      :provider => 'AWS',
      :aws_access_key_id => MY_ACCESS_KEY,
      :aws_secret_access_key => MY_SECRET
    )
    directory = fog.directories.get("MY_DIRECTORY")

    # Upload any local backup that isn't already in the bucket
    files = Dir["/var/backup/*.gpg"]
    files.each do |file|
      name = File.basename(file)
      unless directory.files.head(name)
        directory.files.create(:key => name, :body => File.open(file))
      end
    end

Here’s what this snippet does:

  1. Creates a connection to AWS and grabs a reference to the bucket we’re backing up to. The connection syntax is basically the same for all of the cloud platforms; only the parameter names change.

  2. Uses ‘head’ to check if the file exists and, optionally, get some metadata about it (size, modification date, etc.). Think of this as the cloud equivalent of the Unix stat command. You don’t want to use the ‘get’ command, as that will return the whole file, which would take a very long time if the files are large *cough* voice of experience *cough*. (There’s a small sketch of ‘head’ in action right after this list.)

  3. Creates the file in the given directory (“bucket” in S3 terms) if it doesn’t exist already.
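To make the ‘head’ behavior concrete, here’s a quick sketch (the key name is made up): head returns a file object with metadata if the key exists in the bucket, and nil if it doesn’t.

    # Hypothetical key name, for illustration only
    backup = directory.files.head("mydb-2011-10-05.sql.gpg")
    if backup
      puts "#{backup.key}: #{backup.content_length} bytes, last modified #{backup.last_modified}"
    else
      puts "not uploaded yet"
    end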

If you’ve used S3, you’ll notice that Fog uses slightly different terms for things than S3 does. Because Fog works across a number of different storage providers, it uses more general terms. This might be confusing at first if you’re familiar with a specific provider’s nomenclature, but the tradeoff is that if you want to move from one provider to another, the only thing you have to change is the code that sets up the connection (the call to Fog::Storage.new() in this example).
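To give you an idea, here’s roughly what that connection setup looks like for Rackspace Cloud Files instead of S3 (the credentials are placeholders); everything after this call stays the same:

    fog = Fog::Storage.new(
      :provider => 'Rackspace',
      :rackspace_username => MY_USERNAME,
      :rackspace_api_key => MY_API_KEY
    )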

Deleting Files

    # Delete backups older than our retention window (14 days)
    oldest = Date.today - 14
    directory = fog.directories.get("MY_DIRECTORY")
    files = directory.files
    files.each do |f|
      file_date = Date.parse(f.last_modified.to_s)
      if file_date < oldest
        f.destroy
      end
    end

This is fairly straightforward as well. Get all the files in the directory and check their age, deleting the ones that are older than we want to keep.

So that, in a nutshell, is how to use Fog. This is a simplified example, of course; in my production code the parameters are all pulled from configuration files, and the script emails a report of what it did, in addition to having a lot more error handling.
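If you want to do something similar, here’s a minimal sketch of pulling the credentials from a YAML file instead of hardcoding them (the file name and keys are made up for illustration):

    require 'yaml'

    # Hypothetical fogsync.yml containing access_key and secret_key entries
    config = YAML.load_file("fogsync.yml")
    fog = Fog::Storage.new(
      :provider => 'AWS',
      :aws_access_key_id => config["access_key"],
      :aws_secret_access_key => config["secret_key"]
    )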

If you do any scripting with cloud computing, you owe it to yourself to check out Fog.

Handling Incoming Email With Your Web Application

This morning I was looking for a way to handle incoming email in a web application (similar to the way Highrise and Evernote let you email things to a special email address and have them pulled into their system). There are a number of ways to do this with procmail, or by connecting to your mail server over POP or IMAP and reading the messages, but I was looking for a way to do it without having to host my own email infrastructure. Ideally, I want something like Twilio: a service that receives the email and then does an HTTP POST to an endpoint of my choosing.
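To make that concrete, the receiving end of such a service might be as simple as this Sinatra sketch; the route and parameter names here are made up, since each provider defines its own POST format:

    require 'sinatra'

    # A mail-receiving service would POST each parsed message here
    post '/incoming_email' do
      from    = params[:from]     # hypothetical field names
      subject = params[:subject]
      body    = params[:plain]
      # ... create the note/ticket/record from the message here ...
      status 200
    end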

Here’s what I found.

CloudMailIn

Still in beta (and free while it is), this looks robust. It’s also available as a Heroku addon, if that’s how you roll.

(A tip of the hat to @peterc for pointing me to this one)

APInbox

Looks similar to CloudMailIn, though not in beta. There’s a free plan for up to 100 emails a day, and then it goes up from there. Their site was down when I first went to it this morning, which makes me a tad nervous, but that may well be an isolated thing.

SendGrid

SendGrid is a heavy hitter in the email space, mostly doing outbound delivery. They do, however, have a Parse API that seems to perform the same function as the other two services. I’m not sure about the pricing here; their basic plan is \$9.95 per month for 10,000 emails, but I’m not sure whether that includes incoming or not. UPDATE: I heard from SendGrid. Their plans cover both incoming and outgoing, so in the case of the \$9.95 plan it could be a mix of both, up to 10,000.

(thanks to Twilio’s @johnsheehan for the pointer to SendGrid)

I haven’t used any of these yet, so I can’t endorse one over the others, but I thought I’d post them here in case anyone else is looking for this kind of provider. If you have experience with any of these, please comment with your opinion.

The Week in Links - 12/4/2010

Full-Ack: an Emacs interface to Ack
Ack is a useful little app for searching source code. If you ever use grep for finding things in your code, switch to ack immediately - you won’t regret it. This is a handy front end to ack for Emacs users.

Information architecture: A How to
I’ve been learning about information architecture lately as it’s becoming increasingly important for my job. This is a good overview.

Hacker’s Guide To Tea
I need to drink more tea. This article taught me a lot I didn’t know about tea and its benefits.

Tasty Treats for PostgreSQL
A bunch of useful tools if you work with PostgreSQL, from the guys at OmniTI.

HTML5Rocks - Introducing Web Sockets: Bringing Sockets to the Web
An introduction to Web Sockets, which let you do lots of cool real time things with the web. One of many things I need to spend more time experimenting with.

The Week in Links - 11/11/2010

Things You Should Do Immediately After Launching a Website
Some of these are common sense, but there are quite a few non-obvious ones here. A good checklist.

Running Shells in Emacs: An Overview | Mastering Emacs
Working with shells in Emacs is very useful; I almost always have a small one running at the bottom of my window to run commands in. This explains the differences between the different kinds of shells in Emacs, how to use them, and how to change their settings.

Announcing Cloud Load Balancing Private Beta | Rackspace Cloud Computing & Hosting
Rackspace Cloud, where I host a ton of different servers for myself and for clients, has announced a beta of their load balancing service. Good load balancing is a pain to set up, so this is promising.

The 1140px CSS Grid System/Framework · Fluid down to mobile
Nice new CSS grid framework that handles multiple screen sizes with ease. It seems like a fundamental failing of CSS that we need all these frameworks to do really basic stuff like this though.

Dr Nic’s Making CI easier to do than not to with Hudson CI and Vagrant
I need to spend some time with Hudson. It’s an incredibly powerful “Continuous Integration” server, but it does a lot more as well. This article explains how to use it in conjunction with Vagrant to automatically set up your test environment.

How to Use Your Zoom Lens as a Compositional Aid
I’ve been learning photography over the last couple of years. This article does a better job than most of explaining the effects of using different kinds of zoom lenses. The pictures that accompany the article are worth 1000 words and then some.