Cut average response time by three quarters using Redis

Shortly after launching this site, built with Sinatra and hosted on Heroku, I was looking for ways to improve its performance. I looked through the add-on services Heroku offers and found that Redis-to-go had a nice-looking free tier, so I went ahead and implemented caching with Redis. Here are the results:

Before: [New Relic graph of average response time before caching]

After: [New Relic graph of average response time after caching]

You can see from these New Relic graphs that the average response time over a 24-hour period has dropped from 20ms to 5ms.

The way I went about this was, rather than storing raw data in Redis, to store the HTML of the whole rendered page. Each route in Sinatra then checks Redis for the page first, returning it if it exists, or rendering and storing it if it doesn't.

require 'rubygems'
require 'sinatra'
require 'digest/sha1'
require 'redis'
require 'uri'

configure do
  # Connect to Redis using the Redis-to-go URL Heroku stores in an
  # environment variable.
  uri = URI.parse(ENV["REDISTOGO_URL"])
  REDIS = Redis.new(:host => uri.host, :port => uri.port, :password => uri.password)
end

helpers do
  # Return the cached HTML for the current URL, or nil on a cache miss.
  def is_cached
    tag = "url:#{request.url}"
    page = REDIS.get(tag)
    if page and !logged_in?
      etag Digest::SHA1.hexdigest(page)
      # Debugging headers: the remaining TTL and a marker that the cache was hit.
      ttl = REDIS.ttl(tag)
      response.header['redis-ttl'] = ttl.to_s
      response.header['redis'] = 'HIT'
      return page
    end
  end

  # Store the rendered HTML in Redis for an hour and return it.
  def set_cache(page)
    etag Digest::SHA1.hexdigest(page)
    tag = "url:#{request.url}"
    response.header['redis'] = 'MISS'
    REDIS.setex(tag, 3600, page) if !logged_in?
    return page
  end
end

Here's the configuration and the helpers used to make this work. First the required gems are loaded. Then, in the configure block, the Redis-to-go URL stored in an environment variable on Heroku is used to instantiate the Redis object.

The helpers block defines two functions, is_cached and set_cache. The is_cached function queries Redis using a key derived from the page URL; if the page is in the cache it sets an etag based on a hash of the page HTML, sets two headers (one with the Redis TTL, another recording that Redis was a hit) and returns the page. Those headers are only there to make it easy to see whether Redis is being used; they serve no functional purpose. The set_cache function is only ever called when is_cached returns nothing. It again sets an etag for the page, sets a header noting the Redis cache was missed, stores the page in Redis with an expiry of 3600 seconds (1 hour) and returns it. One thing to note is the use of !logged_in? throughout these functions: it comes from sinatra-authentication and prevents caching pages for authenticated users.
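If you want to poke at what's actually in the cache, a short standalone script like the one below does the job. It's only a sketch: it assumes the same REDISTOGO_URL environment variable and the url: key prefix used above, and isn't part of the app itself.

require 'rubygems'
require 'redis'
require 'uri'

uri = URI.parse(ENV["REDISTOGO_URL"])
redis = Redis.new(:host => uri.host, :port => uri.port, :password => uri.password)

# List every cached page and how long it has left before it expires.
redis.keys("url:*").each do |key|
  puts "#{key} expires in #{redis.ttl(key)}s"
end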

get '/' do
  html = is_cached
  if html
    return html
  end
  
  #Put any database queries or logic here

  html = erb :home
  set_cache(html)
end

Here's how these functions are used in the route for the homepage. The is_cached function is called first; if it returns a cached page, that page is returned straight away. Otherwise the code continues: any database queries are executed and any logic is processed before the page is rendered via the erb view. The result is then stored using the set_cache function, which also returns the HTML.
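As a sketch of how this extends beyond the homepage, a hypothetical parameterised route might look like the following. The /posts/:slug path and the Post lookup are illustrative only, not taken from this site.

get '/posts/:slug' do
  html = is_cached
  if html
    return html
  end

  # Hypothetical lookup; whatever data the page needs would go here.
  @post = Post.find_by_slug(params[:slug])

  html = erb :post
  set_cache(html)
end

Because the cache key is built from request.url, each slug automatically gets its own entry in Redis.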

Most of the routes for the site are cached in exactly this way. Check the headers: are you getting a Redis HIT?
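A quick way to check is to request a page and read the response headers. The snippet below is just an example; the URL is a placeholder for any cached route on the site.

require 'net/http'
require 'uri'

# Placeholder URL; point it at any cached page.
uri = URI.parse("http://www.example.com/")
response = Net::HTTP.get_response(uri)

puts response['redis']      # "HIT" or "MISS"
puts response['redis-ttl']  # seconds left in the cache, only present on a hit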

Published on: 21 November 2012
Tags: ruby, sinatra, heroku, redis
