Three DevOps Rake Tasks I Use Daily on Heroku with MongoDB

deployment, mongoid, rake, heroku, mongodb | 2/2/2013


Heroku and MongoDB enable the kind of frictionless devops workflow every mature software organization wants and spends mountains of money on. It begins with a 1-box developer setup in which I can check out source code and be ready to start development after a bundle install, continues with a git push that deploys the application to a Heroku staging environment for last-minute integration testing, and finally to production. Each Heroku instance has its own configuration and resources. My code also needs sensible defaults for development: for example, mongoid.yml lists localhost:27017 as the development database host and ENV['MONGOHQ_HOST_LIST'] for production.

However, true devops also means being able to reach into a remote environment programmatically: to run commands remotely or to connect two environments together, as long as I have access to them. In this post I’ll propose an implementation of three tasks that have become part of the daily routine on one of my projects.

  1. Execute a Rake task with local code modifications and the configuration of a production environment.
  2. Open a shell to the primary node of the MongoDB on my development Heroku environment.
  3. Dump a single collection from a production database before doing something scary.

We’ll accomplish the above with help from two new gems: mongoid-shell and heroku-commander. Add those to your Gemfile in the :development section.
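
Something along these lines (a minimal sketch; add version constraints as you see fit):

# Gemfile
group :development do
  gem 'heroku-commander'
  gem 'mongoid-shell'
end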

And please remember that with great power (and the word “production” in much of what follows) comes a lot of responsibility.

Execute a Rake task with local code modifications and the configuration of a production environment.

This is made possible by heroku-commander. The library wraps the Heroku CLI (intro here) and runs heroku config -s, which makes it easy to reach out to a Heroku application and retrieve its configuration programmatically without worrying about API keys (by default it uses the Heroku app defined via the “heroku” git remote). We also need a bit of code to apply our application’s naming convention. This lets me switch the execution environment to that of a remote Heroku application, in Ruby.

module Heroku
  class Config < Hash
    # Copy a remote Heroku application's configuration into the local ENV.
    def self.set_env_from!(env)
      # Our naming convention maps an environment to a Heroku app name, e.g.
      # app-production; nil falls back to the app behind the "heroku" git remote.
      app = case env.to_sym
        when :heroku, :development then nil
        else "app-#{env}"
      end
      require 'heroku-commander'
      config = Heroku::Commander.new({ :app => app }).config
      config.each do |k, v|
        ENV[k] = v
      end
    end
  end
end

So how do I run a task locally, but configured as production? With the following Rake task.

namespace :heroku do
  desc "Load environment vars from Heroku config into ENV."
  task :config_from_env do
    env = ENV['RAILS_ENV'] || Rails.env
    raise "RAILS_ENV or Rails.env must be specified" unless env
    Heroku::Config.set_env_from! env
  end
end

Run RAILS_ENV=production rake heroku:config_from_env my:task.
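
For illustration, here’s what a hypothetical my:task might look like (the task body below is a placeholder; yours will read whatever configuration it needs). Because heroku:config_from_env runs first, the task sees the production application’s settings in ENV:

namespace :my do
  desc "Show which MongoDB hosts this process would talk to."
  task :task do
    # Populated from the remote Heroku app by heroku:config_from_env.
    puts ENV['MONGOHQ_HOST_LIST']
  end
end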

Open a shell to the primary node of the MongoDB on my development Heroku environment.

First, figure out the remote MongoDB configuration, then execute the mongo shell command. It’s important to know that Ruby’s built-in system command doesn’t raise an error when the process exits with a non-zero status code. Let’s add a system! function that fixes that.

def system!(cmdline)
  logger.info("[#{Time.now}] #{cmdline}")
  rc = system(cmdline)
  fail "failed with exit code #{$?.exitstatus}" if (rc.nil? || ! rc || $?.exitstatus != 0)
end
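
For example (plain Ruby semantics, assuming a logger is available wherever system! is defined):

system("exit 1")    # => false, and execution quietly continues
system!("exit 1")   # raises RuntimeError, "failed with exit code 1"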

Instead of assembling MongoDB command lines by hand, I’ve used a new gem called mongoid-shell (intro here).

namespace :db do
  [ :staging, :production, :heroku ].each do |env|
    namespace env do
      task :shell do
        require 'mongoid-shell'
        Heroku::Config.set_env_from!(env)
        config = File.join(Rails.root, "config/mongoid.yml")
        Mongoid.load! config, env
        system! Mongoid::Shell::Commands::Mongo.new.to_s
      end
    end
  end
end

Run rake db:production:shell.

Dump a single collection from a production database before doing something scary.

We all do backups and other important things, daily. But when manipulating production data I want one last safeguard: a fresh copy of the collection I am about to accidentally drop or update. So, dump a MongoDB collection locally.

namespace :db do
  [ :production, :staging, :heroku ].each do |env|
    namespace env do
      task :dump, [ :collections ] => :environment do |t, args|
        require 'mongoid-shell'
        Heroku::Config.set_env_from!(env)
        config = File.join(Rails.root, "config/mongoid.yml")
        Mongoid.load! config, env
        collections = args[:collections].split(/[\s|,]+/)
        collections.each do |collection|
          system! Mongoid::Shell::Commands::Mongodump.new({ collection: collection }).to_s
        end
      end
    end
  end
end
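
Run rake "db:production:dump[accounts]", where accounts is just an example collection name; to dump several collections at once, separate their names with spaces, e.g. rake "db:production:dump[accounts users]".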

 


Thanks to @joeyAghion and @fancyremarker, who are responsible for many of the core concepts above.