Using Docker

September 17th, 2015

Docker is a container service that encapsulates microservices and automates the deployment of applications. If you're unfamiliar with containers, think of them as virtual machines (VMs) that only provide operating-system-level virtualization (on top of the kernel); unlike VMs, this does not extend to the hardware level. At first look, it is a lightweight version of a virtual machine. This greatly supports the microservices pattern, which encourages splitting a large application into services that communicate through TCP, APIs or a message queue (MQ). Docker also provides a Makefile-like deployment script (called a Dockerfile), and its main development effort is directed at clustering containers and optimizing the communication between them.

We looked towards Docker when we decided to build our own infrastructure on AWS and move away from Heroku. To dockerize our application, we had to use Docker Compose (formerly Fig), a tool that, as of this writing, is not production-ready.

The ideal setup for Docker is to have a single command to build your entire project, and another to start the server. There should also be a way to run commands inside the container, like seeding the database (which shouldn't be run every time you deploy!). Additionally, the commands should be the same when deploying remotely.

We will set up the obvious infrastructure by putting the database in one container and the Rails application in another. We will link them so that they can communicate with each other (taking advantage of the fact that Postgres can be accessed over a TCP port). We will see that the host will no longer be localhost. The linking is where Docker Compose comes in, which implements an almost recursive build process.

We will be using the basic Ruby base image, which is Debian-based. We also want Docker to cache system updates and gem installations, so we put them all in the build process (not in the run process later).
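A minimal Dockerfile along these lines might look like the following sketch (the ruby:2.2 tag and the apt packages are illustrative, not a definitive setup):

```dockerfile
# Start from the official Ruby base image (the tag is illustrative)
FROM ruby:2.2

# Update and install build dependencies during the build so Docker can cache the result
RUN apt-get update -qq && apt-get install -y build-essential libpq-dev

# Create the application directory and make it the working directory
RUN mkdir /app
WORKDIR /app

# Add the Gemfile first so the bundle install step can be cached
ADD Gemfile /app/Gemfile
RUN bundle install

# Add the rest of the application as late as possible
ADD . /app
```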

This file is quite straightforward: we use the Debian-based Ruby image and update it; we create our application directory at the root and make it our working directory. Making the folder our working directory means that any command we inject into the container will run there. We then add the Gemfile and install all the bundled gems. Docker caches this as a new image layer, and then we add the rest of the application files to the container. Note that Docker invalidates the cache for the ADD step (and every step after it) whenever the added files change, so we try to leave it as late as possible.

A bug I ran into earlier was that my local Gemfile.lock was not synchronized with the most current Gemfile dependencies. As a result, when the stale Gemfile.lock is added to the container, Bundler looks for gem versions that were never installed (the build used only the Gemfile). This causes problems; to fix them, simply tell Docker to ignore the lock file:
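One way to do this, assuming a .dockerignore approach, is a one-line entry:

```
# .dockerignore — keep the local lock file out of the build context (one possible approach)
Gemfile.lock
```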

Next, we want to configure our compose file. Here, I used a common.yml file as a base, which we will reuse later when creating a different compose file for production.
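The two files might look something like the sketch below (service names, ports and the DB_HOST variable are illustrative):

```yaml
# common.yml — shared definition of the web service
web:
  build: .
  command: bundle exec rails server -b 0.0.0.0 -p 3000
  ports:
    - "3000:3000"
```

```yaml
# docker-compose.yml — development setup extending common.yml
db:
  image: postgres
web:
  extends:
    file: common.yml
    service: web
  links:
    - db
  environment:
    - DB_HOST=db
```

A production compose file can later extend the same web service from common.yml with its own settings.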

Already, we can start building the application:
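```
$ docker-compose build
```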

So what exactly does docker-compose build do? Note that we defined a build command in common.yml, which simply points to the current directory. Every other option configures how the container is run. Thus docker-compose build simply translates to docker build ., which runs the instructions in the Dockerfile found in the current directory. You will see that Docker first skips building db because it simply uses an image (its container is created when it is run). Then, Docker follows every instruction in the Dockerfile.

Before running the server, we need to change the database configuration. Postgres's host is no longer localhost but db, so we need to configure it that way. To keep this application runnable with traditional methods, we will read the host from an environment variable (we set its value inside docker-compose.yml).
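In config/database.yml, the host can fall back to localhost when the variable is unset (the DB_HOST name and credentials here are illustrative):

```yaml
# config/database.yml — sketch; DB_HOST is an assumed variable name
default: &default
  adapter: postgresql
  encoding: unicode
  host: <%= ENV.fetch("DB_HOST", "localhost") %>
  username: postgres

development:
  <<: *default
  database: app_development
```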

Now we can run (and stop) the application.
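With Compose, these are single commands:

```
$ docker-compose up
$ docker-compose stop
```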

Rails might throw an exception and die when you attempt this, often because the database has not been created yet. Simply create it by running commands inside the container, along these lines:
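```
# Illustrative; the exact rake tasks depend on your setup
$ docker-compose run web bundle exec rake db:create db:migrate
```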

If the problem is something like “Gem not found”, then remove Gemfile.lock from the local directory. Don’t worry, after rebuilding the container, a new lock will be created.

We might want to connect to the rails console, which is as simple as running a command:
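```
# Illustrative invocation of the Rails console inside the web container
$ docker-compose run web bundle exec rails console
```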

Docker Machine allows us to deploy to a remote server easily. Here I will simply show the commands.
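A typical sequence looks something like this (the machine name, driver and production compose file name are illustrative):

```
# Provision a remote Docker host on AWS (name and driver are illustrative)
$ docker-machine create --driver amazonec2 aws-prod

# Point the local Docker client at the remote machine
$ eval "$(docker-machine env aws-prod)"

# Build and start the containers on the remote host using the production compose file
$ docker-compose -f production.yml build
$ docker-compose -f production.yml up -d
```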
