Continuous Delivery for Meteor on Elastic Beanstalk via Docker

A few months ago, a Node.js project I was working on made a switch from Express to Meteor. The project was running on AWS Elastic Beanstalk, and the continuous delivery procedure we had in place on Jenkins relied pretty heavily on Elastic Beanstalk and other AWS services. The next step I took was to look for the fastest and least painful way to accommodate Meteor in that procedure (i.e. the fewer changes the better). The first thing I discovered was that Elastic Beanstalk’s native Node.js stack doesn’t support Meteor out of the box, and there was no straightforward way to make it work.

Elastic Beanstalk is a PaaS solution from AWS which allows simple deployment for various platforms, Docker being one of them. Since Meteor is not among the natively supported platforms, Docker seemed like a good fit. The idea behind Docker is that once we have a built image, we can run a container based on that image on any server that has Docker running, independently of the underlying platform or OS. While writing our own Dockerfile to build a Docker image with Meteor wouldn’t be too complicated, the good folks at MeteorHacks provide a great set of base images for super simple image building. To build a Docker image of our Meteor project, we simply have to add a Dockerfile with this content in the project root:
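A minimal sketch of such a Dockerfile, assuming the MeteorHacks `meteord` base image in its `onbuild` variant (which bundles the app source during the image build):

```dockerfile
# Build the Meteor app on top of the MeteorHacks meteord base image.
# The onbuild variant picks up the app source from the build context
# and bundles it during `docker build`.
FROM meteorhacks/meteord:onbuild
```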

Now we can build our Docker image by simply running:
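Something along these lines, where `myorg/myapp` is a placeholder for your own Docker Hub repository:

```shell
# Run from the project root (where the Dockerfile lives);
# -t tags the resulting image with the repository name.
docker build -t myorg/myapp .
```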

Elastic Beanstalk (EB) uses a specific file for defining Docker deployments. We’re going to add a template of that file (Dockerrun.aws.json) to our project root with these contents:
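A template along these lines should work; the bucket name and image repository below are placeholders, and the Authentication block can be dropped entirely for public images:

```json
{
  "AWSEBDockerrunVersion": "1",
  "Authentication": {
    "Bucket": "my-docker-auth-bucket",
    "Key": ".dockercfg"
  },
  "Image": {
    "Name": "myorg/myapp:${IMAGE_TAG}",
    "Update": "true"
  },
  "Ports": [
    {
      "ContainerPort": "80"
    }
  ]
}
```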

Authentication – This part is only necessary if the Docker image is private; it tells EB on which S3 bucket to find the Docker auth file (~/.dockercfg on your local machine)
Image – the repository and name of the image which EB will pull and start; it also tells EB whether to attempt to pull the latest version of the image. We’ll populate ${IMAGE_TAG} later on in Jenkins.

Once we’ve added the files above to our project and made sure that docker build works, we can proceed to automating everything in Jenkins. We’ll use the AWS CLI to interact with AWS (assuming it’s configured with proper IAM credentials, or even better, that Jenkins is running on an EC2 instance with an IAM role). In the Execute Shell block of a Jenkins build we’ll add the following shell script:
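A sketch of what that script could look like. The repository, bucket, application, and environment names are placeholders; `BUILD_NUMBER` is an environment variable Jenkins provides out of the box:

```shell
#!/bin/bash
set -e

# Placeholders -- adjust to your own setup.
IMAGE_REPO="myorg/myapp"
IMAGE_TAG="build-${BUILD_NUMBER}"   # BUILD_NUMBER is set by Jenkins
S3_BUCKET="my-eb-deployments"
EB_APP="my-meteor-app"
EB_ENV="my-meteor-env"

# Build the image and push it to Docker Hub.
docker build -t "${IMAGE_REPO}:${IMAGE_TAG}" .
docker push "${IMAGE_REPO}:${IMAGE_TAG}"

# Fill in the ${IMAGE_TAG} placeholder in the Dockerrun.aws.json template.
mkdir -p deploy
sed "s/\${IMAGE_TAG}/${IMAGE_TAG}/" Dockerrun.aws.json > deploy/Dockerrun.aws.json

# Upload the deployment descriptor and register it as a new EB application version.
aws s3 cp deploy/Dockerrun.aws.json "s3://${S3_BUCKET}/${IMAGE_TAG}/Dockerrun.aws.json"
aws elasticbeanstalk create-application-version \
  --application-name "${EB_APP}" \
  --version-label "${IMAGE_TAG}" \
  --source-bundle "S3Bucket=${S3_BUCKET},S3Key=${IMAGE_TAG}/Dockerrun.aws.json"

# Deploy the new version to the environment.
aws elasticbeanstalk update-environment \
  --environment-name "${EB_ENV}" \
  --version-label "${IMAGE_TAG}"
```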

With this Jenkins job in place, each new commit to the project repository will result in a Docker image being built and pushed to Docker Hub, and that same image being deployed to our Elastic Beanstalk environment.

Since this setup will most likely result in frequent deployments on the EB environment, it’s a good idea to clean up old Docker images. We can do that by adding an EB config file at .ebextensions/01dockercleanup.config in the project root:
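A sketch of such a config file, assuming the standard post-deploy hook directory on the EB Docker AMI:

```yaml
# Creates a post-deploy hook script that removes unused (dangling)
# Docker images after each deploy.
files:
  "/opt/elasticbeanstalk/hooks/appdeploy/post/01_docker_cleanup.sh":
    mode: "000755"
    owner: root
    group: root
    content: |
      #!/bin/bash
      # Remove dangling images left over from previous deploys;
      # ignore errors when there is nothing to clean up.
      docker rmi $(docker images -q --filter "dangling=true") 2>/dev/null || true
```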

This file will create a post-deploy hook script on the Elastic Beanstalk instance which will clean up all unused images with each deploy.


Dario Duvnjak

Over the years I've worked in all parts of the software development process, on different tech stacks and with clients from all over the world. Always on the lookout for the next great tool in cutting-edge open source technology. I specialize in cloud computing (AWS), Ruby, Node.js, DevOps, Docker and Puppet.