Continuous Integration & Automation with Jenkins, Docker & GitHub - DevOps Project
Pranav Shekhar
Deloitte US India | MLOps | DevOps - AL | Web Dev - MERN | Flutter & Firebase | Cloud - Hybrid & Multi | GAIT - AIR 50
This article is about continuous integration (CI) and automation, using Jenkins as the integration tool, Docker for deploying lightweight and fast Apache (httpd) containers for our application while keeping hardware and memory resources optimally utilised, and Git & GitHub for automating our workflow commits.
We will deploy our Hero web page to lightweight containers in two different environments, one for the developer and one for production. In a nutshell, the developer pushes to his environment and a quality testing team checks his changes against the already deployed production website; if the changes don't affect the performance of the website, an automatic trigger is built to merge the developer's feature branch on GitHub into the production master branch, and the changes are reflected in production within seconds.
This is our website currently running in the production environment at port 8081. We can also use a tunneling tool such as ngrok, which gives our production environment a public URL so that clients can access it over the internet.
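For example, a quick way to expose the production port through ngrok (a sketch, assuming ngrok is already installed and authenticated on the host) is:-

# Expose local port 8081 over a public ngrok URL
ngrok http 8081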
Git local repository initialization and connecting it to the remote GitHub origin on the web:-
echo "# Create a Local Repo Name" >> README.md git init git add README.md git commit -m "first commit" git remote add orgin https://github.com/PranavShekhar13/Jenkins_Project.git git git push -u origin master
Now we move on to the core concepts involved in this project. The developer is working on a feature branch of the same repo, known as dev1, and wishes to make some changes and merge them into the master branch, but he will have to pass a quality check from the Quality Testing Team before he can merge. To automate this flow, we first make a minor addition: a Git hook that triggers a git push on every commit. This is done with the following steps on Git Bash:-
Save the file as "post-commit" without the .txt extension, or rename it with the following command:-
KIIT@BTECH1828258 MINGW64 ~/Desktop/Jenkins-Project/.git/hooks (GIT_DIR!) $ mv post-commit.txt post-commit
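For reference, a minimal post-commit hook can simply push the feature branch to the remote origin after every commit (a sketch; the branch name dev1 follows the setup described above):-

#!/bin/bash
# post-commit hook: push the developer's feature branch to GitHub after every commit
git push origin dev1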
Now comes launching the containers in the two different environments, production and development. Use the following code snippets on a Linux machine with Docker pre-installed. Here we launch the containers in two isolated environments, mounting two different local volumes, one for the developer and one for production, both tracking the same remote origin on GitHub:-
[root@localhost /]# docker run -dit -p 8081:80 --name master -v /WebProjectMaster:/usr/local/apache2/htdocs/ httpd
Here the path /usr/local/apache2/htdocs/ is the document root of our httpd web server image, and we use the -p option for Port Address Translation (PAT), which ensures a GET request on host port 8081 is translated to port 80 of the container. Now the snippet for our developer environment:-
[root@localhost /]# docker run -dit -p 8082:80 --name dev -v /WebProjectDev:/usr/local/apache2/htdocs/ httpd
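To confirm that both environments are up, we can list the running containers and request each page locally (a quick check, assuming curl is available on the Docker host):-

docker ps                      # both the master and dev containers should be listed
curl http://localhost:8081     # production page
curl http://localhost:8082     # developer page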
Now that our containers are running in two different environments, we will integrate their workflow using Jenkins for continuous integration. The respective jobs that automate the workflow are:-
- Job 1 :- The developer pushes to the developer branch and his changes are deployed to the developer environment running in a separate, isolated container. The configuration for Job 1 on Jenkins can be seen below; we will use Poll SCM to pull from GitHub at regular intervals for automated deployment:-
The build will be triggered automatically every minute, using crontab syntax:-
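In the Poll SCM schedule field, polling every minute looks like this:-

# MINUTE HOUR DOM MONTH DOW
* * * * *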
We will run two shell commands on the job build: one to copy our files from the remote origin repository on GitHub (developer branch) into the local directory, and another to check whether the container is running and launch it if it is not, as sketched below.
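A minimal sketch of the Job 1 execute-shell step, assuming Jenkins checks the dev1 branch out into its workspace and the developer volume is the /WebProjectDev path used above:-

# Copy the checked-out workspace files into the developer volume
sudo cp -rvf * /WebProjectDev/
# Launch the dev container only if it is not already running
if ! sudo docker ps | grep -qw dev
then
    sudo docker run -dit -p 8082:80 --name dev -v /WebProjectDev:/usr/local/apache2/htdocs/ httpd
fi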
- Job 2:- The production environment runs in this job, and it executes only after the Quality Testing Team has checked the developer's code and marked it fit for deployment to production. We use the job chaining concept here, in which the production environment job (Master_Env) is a downstream project of Job 3, the quality testing job, and the production build will merge the developer's code only if the quality testing build is stable. We will again use Poll SCM for our build synchronisation.
We will again run our shell commands on the job build, this time for the production copy and the production container check:-
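The Job 2 execute-shell step mirrors Job 1, only pointing at the production volume and container (again a sketch with the assumed paths from above):-

# Copy the merged master workspace into the production volume
sudo cp -rvf * /WebProjectMaster/
# Launch the production container only if it is not already running
if ! sudo docker ps | grep -qw master
then
    sudo docker run -dit -p 8081:80 --name master -v /WebProjectMaster:/usr/local/apache2/htdocs/ httpd
fi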
- Job 3:- This job will be run manually by the quality testing team. It will be a remotely triggered build, authenticated through a token and the credentials of the testing developer, and the "Merge before build" additional behaviour will be selected for automation and ease for the developer.
We will select a remote trigger, which requires an authentication token given to the testing team by the enterprise:-
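Once the token is configured, the quality testing team can fire the build remotely with a call of the following shape (the Jenkins host, job name QA_Test, user and token values here are placeholders, not the actual project values):-

# Trigger the quality-testing job remotely using the authentication token
curl -u qa_user:API_TOKEN "http://<jenkins-host>:8080/job/QA_Test/build?token=REMOTE_TRIGGER_TOKEN"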
Finally, we will push to the remote origin at GitHub using a post-build action, to automate the workflow once again:-
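If this push is done through a shell step rather than the Git Publisher post-build action, the equivalent command is simply (a sketch, assuming the merged result is on the workspace's current HEAD):-

# Push the merged result back to the master branch on GitHub
git push origin HEAD:master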
Bull's Eye!!! Your completely automated setup is done and ready to be tested. Make changes in the developer's branch and commit; the commit is automatically pushed to the remote origin at GitHub thanks to our Git hook in place. See the change on port 8082, run the quality test through the remote trigger to merge the developer's code into master, and then watch the same changes being reflected in production.
This was from my side.