[Day 1] JS in Pipeline - DevOps for Local Development Environment (1)



The goal of this series is to introduce some best practices in the local development environment and create a CI/CD pipeline for NodeJS applications.

In this series, I will first explain what DevOps is and why it matters.


When you are developing, sometimes (or very often?) you might run into frustrating problems, such as code that works on your colleague's laptop but doesn't work on yours, or vice versa.

You would tell your colleague: "What's your node version? Run node --version please? Mine is 10.12.0."

Your colleague: "Okay, let me check... Okay, it is 10.13.0."

You: "I don't think the difference between 10.12 and 10.13 causes this problem. What about your npm version?"

After you both check, the npm version doesn't seem to be the problem either. Then you rm -rf node_modules and npm install again.

Still not working? Then you would start doubting if it is caused by your macOS version.

It is extremely annoying and sometimes it can waste lots of your time.

How could we get rid of this problem? How to make everyone's local development environment (almost) the same?


DevOps is a set of practices that combines software development (Dev) and information-technology operations (Ops) which aims to shorten the systems development life cycle and provide continuous delivery with high software quality. 
 - Wikipedia https://en.wikipedia.org/wiki/DevOps

So, DevOps can help you shorten your development life cycle and save you lots of time. We will walk through the traditional development process (i.e. BEFORE) and the improved, more "DevOps" development process (i.e. AFTER).

Let's learn by example. Our example NodeJS application for the whole series is an API server backed by a MySQL database.

The application is a "Gift Code" system. A user can redeem a valid gift code for credits on our website.

The table looks like:
  - id (int, primary, auto-increment)
  - code (string, index)
  - status (enum: active, inactive, redeemed)
  - credits (int, e.g. 10 (euros))

We need to develop 2 APIs, one is POST /api/giftcode/{giftcode} - redeem your gift code and the other is POST /api/giftcode/generate - generate new gift codes.

So, the first step is to install MySQL on your machine, right?

BEFORE: download the MySQL binaries/installer and install it on your machine. Create a MySQL user, database, and tables. Start developing.

Problems: your colleague will need to do the same: download and install. One day your versions may diverge, which causes doubts or problems. And as your architecture grows more complicated - you need to install Redis, RabbitMQ,… - setting up all the dependencies smoothly becomes even harder.

So, how can DevOps help us improve?

AFTER: use Docker and Docker-Compose


So what is Docker?

Everyone talks about Docker nowadays. There are tons of articles about what Docker is. Here are some:

https://docs.docker.com/engine/docker-overview/
https://www.redhat.com/en/topics/containers/what-is-docker

Simply speaking, Docker "packs up" your application in an environment you decide.

  1. Create a Docker "image" that contains your application.
  2. Start a Docker "container" based on the image and it runs your application.

Let's learn it by our Dockerfile.

Before diving into the Dockerfile, we also need to create a .dockerignore file. It works just like a .gitignore file: it tells Docker not to copy your local node_modules and debug logs into your image.

node_modules
npm-debug.log

Okay, finally, we can look at our Dockerfile!

FROM node:10.19.0-alpine3.11
# Create app directory & will copy our source code here
WORKDIR /usr/src/app
# First, copy the package-lock.json and package.json,
# then install dependencies "INSIDE" the image
COPY package*.json ./
RUN npm install
# Copy all other source code
COPY . .
# Start our server
CMD [ "node", "index.js" ]

https://github.com/jeanycyang/js-in-pipeline/blob/82f74f7bdf61ac4a049fa1a23997947d186fba00/Dockerfile

FROM image:tag decides what image our own image is based on. Let's choose one of NodeJS's official images: node. The tag 10.19.0-alpine3.11 means this image ships NodeJS 10.19.0 and is built on Alpine (a lightweight Linux distribution that is popular for container images). Our NodeJS version is now pinned to 10.19.0 and will remain unchanged. The "What's your node and npm version?" problem has been solved!
You can find more NodeJS Docker images here: https://hub.docker.com/_/node/
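A quick way to see the effect of the pinned base image: process.version reports the runtime a Node process is actually using. Run inside a container built from this image it always prints v10.19.0; on your laptop it prints whatever Node you happen to have installed locally.

```javascript
// Prints the version of the Node.js runtime executing this process.
// Inside a container based on node:10.19.0-alpine3.11, this is always v10.19.0.
console.log(process.version);
```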

[WORKDIR](https://docs.docker.com/engine/reference/builder/#workdir) will create a directory if it doesn't exist yet. It sets the working directory for any RUN, CMD, ENTRYPOINT, COPY and ADD in the Dockerfile.

After setting our working directory, we copy the package.json and package-lock.json files into our image. Copying them before the rest of the source code also lets Docker cache the npm install layer, so dependencies are reinstalled only when package*.json changes.

This is one of Docker's killer features.

On our own machines, we sometimes run into difficulties installing node_modules. It might be a problem with your OS or OS version, your node version, or something unknown. (You always want to rm -rf your node_modules, reinstall, and hope that fixes everything, right? :))

But now, we copy package.json and package-lock.json in the image. This image is based on Alpine Linux. More precisely, it is Alpine 3.11.

Let's RUN npm install. All node_modules are now installed inside the Alpine 3.11 OS. If you have any other dependencies, you can install them in the image as well, for example RUN apk add ffmpeg. (apk is Alpine's default package manager, like apt for Ubuntu or yum for CentOS.)

That's also why we put node_modules in our .dockerignore file. Our dependencies should be downloaded and installed in a controlled environment. The dependencies should never depend on the developer's local environment (eg. macOS).

Now, not only the node version but also the base OS is pinned and will remain unchanged unless you change your Dockerfile. Another advantage: you can track those changes with a version control tool, e.g. Git.

Every team member should keep an eye on the image. This image now becomes "the source of truth" for every team member. A change should NEVER break the image build.

Last but not least, we copy all the other source code, and CMD specifies that a container created from this image will automatically run node index.js.

In your terminal, run docker build -t giftcodeserver . in the project directory. The -t option tags the resulting image as "giftcodeserver".
The final argument ( . ) is the build context, and docker build looks for a file named Dockerfile at the root of that context. If your Dockerfile lives somewhere else, point to it explicitly with the -f option, e.g. docker build -t giftcodeserver -f ./Dockerfile .

$ docker build -t giftcodeserver .
Sending build context to Docker daemon  153.1kB
Step 1/6 : FROM node:10.19.0-alpine3.11
10.19.0-alpine3.11: Pulling from library/node
c9b1b535fdd9: Pull complete
514d128a791d: Pull complete
ab9dddf2630f: Pull complete
acb767e231ef: Pull complete
Digest: sha256:e8d05985dd93c380a83da00d676b081dad9cce148cb4ecdf26ed684fcff1449c
Status: Downloaded newer image for node:10.19.0-alpine3.11
---> 29fc59abc5de
Step 2/6 : WORKDIR /usr/src/app
---> Running in c542c10760a7
Removing intermediate container c542c10760a7
---> b313f3d1ed3b
Step 3/6 : COPY package*.json ./
---> 792234e2f673
Step 4/6 : RUN npm install
---> Running in be9ae6e320c0
added 96 packages from 143 contributors and audited 167 packages in 2.536s
found 0 vulnerabilities


Removing intermediate container be9ae6e320c0
---> d40d9445896d
Step 5/6 : COPY . .
---> b2fedc170681
Step 6/6 : CMD [ "node", "index.js" ]
---> Running in 6833670c5f76
Removing intermediate container 6833670c5f76
---> 3cc97341f395
Successfully built 3cc97341f395
Successfully tagged giftcodeserver:latest

We have built our image. In the next article, we will dive deeper into Docker for development and into Docker-Compose. Later, we will use Docker-Compose to set up our whole development environment with a single command!

References/Useful links:

https://nodejs.org/de/docs/guides/nodejs-docker-webapp/
https://docs.docker.com/engine/reference/builder/
https://hub.docker.com/_/node/

#devops #docker #docker-compose #nodejs #Node