Local Development With Docker

This is part 1/2 on development with Docker. In this article, I'll be covering how and why you should be using Docker in development. In the next one, I'll cover production deployments.


When building a typical project at work, I need a few things: Node, a bunch of node modules (npm), and maybe more down the road. When someone on the team who usually works on a different area of our tech stack needs to make a change to my codebase, they tend to get annoyed at the setup process. Maybe they're on an outdated version of Node, need to npm install because they haven't run the repo in so long, or some new dependency was introduced. This is frustrating when all you want to do is write code, not fight environment setup. It doesn't matter what your tech stack is; Docker can make these problems go away. So why should you follow this post? Because none of these things will ever be a problem again.


I'm on a Mac, and thankfully Docker support has gotten a lot better as of late. Head over to the Docker site and download the desktop version for your OS.

Dockerizing your app

For this example, I am building a simple app using create-react-app. We'll continue with it, although everything here applies to just about any stack. We need to add a Dockerfile to the repo. The complete version looks like this:

FROM node:8.6.0

# copy only the package manifests first, so the npm install
# layer stays cached until the dependencies actually change
COPY package.json package.json
COPY package-lock.json package-lock.json

RUN npm install

# now copy the rest of the project
COPY . .

# compile sass and run the dev server in parallel
CMD ./node_modules/.bin/npm-run-all -p watch-css start-js

This is all you really need to get a React app running. The first line is the base Docker image, which can be found on Docker Hub. It's a container with Node preinstalled, that's about it! You'll notice I copy package.json over, run npm install, then copy the rest of the project. Always put the things that change the least at the top of your Dockerfile. That way, npm install will only be run during a build if your package.json has changed. After that, I am using npm-run-all because our designer threw that in the project; it lets Sass compile while the dev server runs. With a base create-react-app you would use CMD ./node_modules/.bin/react-scripts start instead.
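One addition worth making (my suggestion, not strictly required): since COPY . . sends the whole build context to Docker, a .dockerignore file in the repo root keeps local artifacts like node_modules out of the image and speeds up builds. A minimal sketch:

```
node_modules
build
.git
```

The entries mirror what you'd put in .gitignore; without this, the node_modules folder from your host gets copied over the one npm install just created inside the image.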

Our First Build!

Now we can build our container and see if it works! Run this command inside of your project directory:

docker build -t test-app .

It should take a minute if you are using my example, because it has to run npm install for the first time. Subsequent builds will be almost instant as long as your package.json hasn't changed, since Docker reuses the cached layers. Now let's start up our app and see if it works:

docker run --rm -it -v $(pwd)/src:/src -p 3000:3000 test-app

Here's the breakdown:

  • --rm: Remove the container automatically when we cancel out of this command
  • -it: Keeps STDIN open (-i) and allocates a pseudo-TTY (-t) so we can see console output and stop the server. -t is a bit confusing but there's a good SO answer on it.
  • -v: Sync the src folder on our local machine into the container! This is huge and only recently has good support on Macs. It's what lets our dev server live update.
  • -p: Bind a container port to our localhost, in LOCAL:CONTAINER format
  • The last argument is the name of the image we built.
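For reference, here's the same command with the long-form flags spelled out. It's equivalent, just easier to read back later in a script or a README:

```
docker run --rm --interactive --tty \
  --volume "$(pwd)/src":/src \
  --publish 3000:3000 \
  test-app
```

Quoting "$(pwd)" also protects you if the project path contains spaces.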

It works!

At this point we have a working Docker image that runs our React app, with nothing required on the host machine beyond Docker itself. Anyone running this image doesn't need to run npm install, worry about anything going wrong, or have the latest version of Node installed. When you get into bigger apps, this is an even bigger deal.

Sharing Images

Now that we have it working, we need to upload our image to a private registry so that other team members can pull down the latest container any time they work on your project. I'm going to use Google Container Registry, since the next post will be on production deployments with Kubernetes.

GCR Setup

Sign up for Google Cloud, install the gcloud CLI, and authenticate. The Container Registry only charges for the storage you use; to give you an idea, this example container is around 200MB and the current storage rate is $0.026 per GB per month. Now that Google Cloud is set up, we can push our app to the private registry:

docker build -t gcr.io/PROJECT_ID/test-app .
gcloud docker -- push gcr.io/PROJECT_ID/test-app

Now our container is on Google Cloud. To pull it down, we can do this:

gcloud docker -- pull gcr.io/PROJECT_ID/test-app

Replace the project id with the one you created in Google Cloud when you signed up.
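One optional refinement (my suggestion, not part of the original setup): these pushes all go to the default :latest tag, so you can also tag each build with something unique, like the git commit hash, to make it clear which version teammates are pulling:

```
docker build -t gcr.io/PROJECT_ID/test-app:$(git rev-parse --short HEAD) .
gcloud docker -- push gcr.io/PROJECT_ID/test-app:$(git rev-parse --short HEAD)
```

For a small team iterating quickly, sticking with :latest as shown above is perfectly fine.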

Development Workflow

Now that we have pushing to Google working, we need a better workflow so that team members are always using the latest container. In the case of a JS / Node based app, we can add those two commands to the scripts in our package.json:

"scripts": {
    "deploy-dev": "docker build -t gcr.io/PROJECT_ID/test-app . && gcloud docker -- push gcr.io/PROJECT_ID/test-app",
    "run-dev": "gcloud docker -- pull gcr.io/PROJECT_ID/test-app && docker run --rm -it -v $(pwd)/src:/src -p 3000:3000 gcr.io/PROJECT_ID/test-app"
}

When you add a new dependency to your app, run npm run deploy-dev to rebuild the container and upload it to Google Cloud. Any time you run your app in development, simply do npm run run-dev to pull any changes to the container and run it! Have all team members install the gcloud CLI and Docker, and all your apps will just work.


Setting up Docker for all of a team's apps will help speed up development time. If you continually work on different projects with many different dependencies, it gets tiresome to set all of those things up when switching between them. If the lead developer takes a little time to build a Dockerfile and a private registry, these problems will no longer exist. Stay tuned for my next post on deploying to production using Google Container Engine, which is built on Kubernetes. I'll also be covering docker-compose; our setup here was just small enough that I didn't think it was needed. As always, tweet me @zachcodes or leave a comment with any feedback.