Deploy Docker Containers from Docker Cloud

I'm new to Docker and am trying to learn more about best practices for deploying Dockerized images. I've built some images on my development host using the Dockerfile and docker-compose.yml below.

After building the images, I ssh'd into my production server, an Amazon Linux t2.micro instance on AWS EC2. There I installed docker and docker-compose and tried to build my images, but the build ran out of RAM. I therefore published the images I had built on my local host to Docker Cloud, and I now wish to deploy those images from Docker Cloud to the AWS instance.

How can I achieve this? I'd be very grateful for any help others can offer!

Dockerfile:

# Specify base image
FROM andreptb/oracle-java:8-alpine

# Specify author / maintainer
LABEL maintainer="Douglas Duhaime <douglas.duhaime@gmail.com>"

# Add source to a directory and use that directory
# NB: /app is a reserved directory in tomcat container
ENV APP_PATH="/lts-app"
RUN mkdir "$APP_PATH"
ADD . "$APP_PATH"
WORKDIR "$APP_PATH"

##
# Build BlackLab
##

RUN apk add --update --no-cache \
  wget \
  tar \
  git

# Store the path to the maven home
ENV MAVEN_HOME="/usr/lib/maven"

# Add maven and java to the path
ENV PATH="$MAVEN_HOME/bin:$JAVA_HOME/bin:$PATH"

# Install Maven
RUN MAVEN_VERSION="3.3.9" && \
  cd "/tmp" && \
  wget "http://archive.apache.org/dist/maven/maven-3/$MAVEN_VERSION/binaries/apache-maven-$MAVEN_VERSION-bin.tar.gz" -O - | tar xzf - && \
  mv "/tmp/apache-maven-$MAVEN_VERSION" "$MAVEN_HOME" && \
  ln -s "$MAVEN_HOME/bin/mvn" "/usr/bin/mvn" && \
  rm -rf /tmp/*

# Get the BlackLab source
RUN git clone "https://github.com/INL/BlackLab.git"

# Build BlackLab with Maven
RUN cd "BlackLab" && \
  mvn clean install

##
# Build Python + Node dependencies
##

# Install system deps with Alpine Linux package manager
RUN apk add --update --no-cache \
  g++ \
  gcc \
  make \
  openssl-dev \
  python3-dev \
  python \
  py-pip \
  nodejs

# Install Python dependencies
RUN pip install -r "requirements.txt" && \
  npm install --no-optional && \
  npm run build

# Store Mongo service name as mongo host
ENV MONGO_HOST=mongo_service
ENV TOMCAT_HOST=tomcat_service
ENV TOMCAT_WEBAPPS=/tomcat_webapps/

# Make ports available
EXPOSE 7082

# Seed the db
CMD npm run seed && \
  gunicorn -b 0.0.0.0:7082 --access-logfile - --reload server.app:app

docker-compose.yml:

version: '2'

services:
  tomcat_service:
    image: 'bitnami/tomcat:latest'
    ports:
      - '8080:8080'
    volumes:
      - docker-data-tomcat:/bitnami/tomcat/data/
      - docker-data-blacklab:/lts-app/lts/

  mongo_service:
    image: 'mongo'
    command: mongod
    ports:
      - '27017:27017'

  web:
    # gain access to linked containers
    links:
      - mongo_service
      - tomcat_service
    # explicitly declare service dependencies
    depends_on:
      - mongo_service
      - tomcat_service
    # set environment variables
    environment:
      PYTHONUNBUFFERED: 'true'
    # use the image from the Dockerfile in the cwd
    build: .
    ports:
      - '7082:7082'
    volumes:
      - docker-data-tomcat:/tomcat_webapps
      - docker-data-blacklab:/lts-app/lts/


volumes:
  docker-data-tomcat:
  docker-data-blacklab:

Sep 3, 2018 in AWS by bug_seeker

To solve this problem, I followed advice from StackOverflow user @MazelTov: I built the containers on my local OSX development machine, published the images to Docker Cloud, then pulled and ran those images on my production server (AWS EC2).

Install Dependencies

I'll outline the steps I followed below in case they help others. Please note these steps require you to have docker and docker-compose installed on both your development and production machines. I used the GUI installer to install Docker for Mac.

Build Images

After writing a Dockerfile and docker-compose.yml file, you can build your images with docker-compose up --build.

Upload Images to Docker Cloud

Once the images are built, you can upload them to Docker Cloud with the following steps. First, create an account on Docker Cloud.

Then store your Docker Cloud username in an environment variable, so your ~/.bash_profile should contain export DOCKER_ID_USER='yaledhlab' (use your own username).
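For example, you can append the export line and load it into the current shell in one go; yaledhlab is the example username used throughout this answer, so substitute your own:

```shell
# Persist the Docker Cloud username for future shells
# ('yaledhlab' is this answer's example username; use your own)
echo "export DOCKER_ID_USER='yaledhlab'" >> ~/.bash_profile

# Also set it in the current shell so the commands below can use it
export DOCKER_ID_USER='yaledhlab'
echo "$DOCKER_ID_USER"
```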

Next, log in to your account from your development machine:

docker login

Once you're logged in, list your running Docker containers (note that docker ps lists containers, not images):

docker ps

This will display something like:

CONTAINER ID        IMAGE                          COMMAND                  CREATED             STATUS              PORTS                      NAMES
89478c386661        yaledhlab/let-them-speak-web   "/bin/sh -c 'npm run…"   About an hour ago   Up About an hour    0.0.0.0:7082->7082/tcp     letthemspeak_web_1
5e9c75d29051        training/webapp:latest         "python app.py"          4 hours ago         Up 4 hours          0.0.0.0:5000->5000/tcp     heuristic_mirzakhani
890f7f1dc777        bitnami/tomcat:latest          "/app-entrypoint.sh …"   4 hours ago         Up About an hour    0.0.0.0:8080->8080/tcp     letthemspeak_tomcat_service_1
09d74e36584d        mongo                          "docker-entrypoint.s…"   4 hours ago         Up About an hour    0.0.0.0:27017->27017/tcp   letthemspeak_mongo_service_1

For each of the images you want to publish to Docker Cloud, run:

docker tag image_name $DOCKER_ID_USER/my-uploaded-image-name
docker push $DOCKER_ID_USER/my-uploaded-image-name

For example, to upload mywebapp_web to your user's account on Docker Cloud, you can run:

docker tag mywebapp_web $DOCKER_ID_USER/web
docker push $DOCKER_ID_USER/web
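If you have several images to publish, the tag-and-push pair can be scripted. The sketch below is a dry run that only prints the commands it would execute (the image names are hypothetical); drop the echo wrappers to run them for real:

```shell
DOCKER_ID_USER='yaledhlab'   # example username from this answer

# Map each local image to a repository under your account and
# print the tag/push commands (dry run)
for IMAGE in mywebapp_web mywebapp_worker; do
  REMOTE="$DOCKER_ID_USER/${IMAGE#mywebapp_}"   # mywebapp_web -> yaledhlab/web
  echo "docker tag $IMAGE $REMOTE"
  echo "docker push $REMOTE"
done
```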

You can then run open https://cloud.docker.com/swarm/$DOCKER_ID_USER/repository/list to see your uploaded images.

Deploy Images

Finally, you can deploy your images on EC2 with the following steps. First, install Docker and Docker Compose on the Amazon Linux EC2 instance:

# install docker
sudo yum install docker -y

# start docker
sudo service docker start

# allow ec2-user to run docker
sudo usermod -a -G docker ec2-user

# get the docker-compose binaries
sudo curl -L https://github.com/docker/compose/releases/download/1.20.1/docker-compose-`uname -s`-`uname -m` -o /usr/local/bin/docker-compose

# make the docker-compose binary executable
sudo chmod +x /usr/local/bin/docker-compose
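The curl command above hard-codes release 1.20.1. If you'd rather pin the version in one place, the same download URL can be assembled from a variable (this follows the URL scheme used in the command above; newer releases exist):

```shell
# Pin the docker-compose release in one variable
# (1.20.1 is the version used above)
COMPOSE_VERSION="1.20.1"
URL="https://github.com/docker/compose/releases/download/${COMPOSE_VERSION}/docker-compose-$(uname -s)-$(uname -m)"
echo "$URL"
# then: sudo curl -L "$URL" -o /usr/local/bin/docker-compose
```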

Log out, then log back in so your new group membership takes effect. Then start a screen session to run the server: screen. Once the screen starts, you can add a new docker-compose config file that points at your published images. For example, I needed to fetch the let-them-speak-web image housed in yaledhlab's Docker Cloud account, so I changed the docker-compose.yml file above to the file below, which I named production.yml:

version: '2'

services:
  tomcat_service:
    image: 'bitnami/tomcat:latest'
    ports:
      - '8080:8080'
    volumes:
      - docker-data-tomcat:/bitnami/tomcat/data/
      - docker-data-blacklab:/lts-app/lts/

  mongo_service:
    image: 'mongo'
    command: mongod
    ports:
      - '27017:27017'

  web:
    image: 'yaledhlab/let-them-speak-web'
    # gain access to linked containers
    links:
      - mongo_service
      - tomcat_service
    # explicitly declare service dependencies
    depends_on:
      - mongo_service
      - tomcat_service
    # set environment variables
    environment:
      PYTHONUNBUFFERED: 'true'
    ports:
      - '7082:7082'
    volumes:
      - docker-data-tomcat:/tomcat_webapps
      - docker-data-blacklab:/lts-app/lts/

volumes:
  docker-data-tomcat:
  docker-data-blacklab:

The production compose file can then be run with: docker-compose -f production.yml up. Finally, ssh in from another terminal and detach the screen with screen -D.
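If you'd rather not keep a screen session around, compose can also run the services detached. A dry-run sketch follows (it only prints the commands; remove the echo wrappers to execute them):

```shell
COMPOSE_FILE="production.yml"   # the file written above

# Fetch the published images, then start everything in the background
echo "docker-compose -f $COMPOSE_FILE pull"
echo "docker-compose -f $COMPOSE_FILE up -d"

# Follow the web service logs while it boots
echo "docker-compose -f $COMPOSE_FILE logs -f web"
```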

answered Sep 3, 2018 by Priyaj
