CI/CD PIPELINE
A CI/CD pipeline is one of the best practices for DevOps teams to implement, as it lets them deliver code changes more frequently and reliably.
What is CI/CD?
Continuous integration is a coding philosophy and set of practices that drive development teams to implement small changes and check in code to version control repositories frequently. Because most modern applications require developing code on different platforms and with different tools, the team needs a mechanism to integrate and validate its changes.
The technical goal of CI is to establish a consistent and automated way to build, package, and test applications. With consistency in the integration process in place, teams are more likely to commit code changes more frequently, which leads to better collaboration and software quality.
Continuous delivery picks up where continuous integration ends. CD automates the delivery of applications to selected infrastructure environments. Most teams work with multiple environments other than production, such as development and testing environments, and CD ensures there is an automated way to push code changes to them.
CI/CD tools help store the environment-specific parameters that must be packaged with each delivery. CI/CD automation then performs any necessary service calls to web servers, databases, and other services that may need to be restarted or follow other procedures when applications are deployed.
If we choose to build a CI/CD pipeline on our own, we face the problem of limited compute. That problem can easily be solved with cloud computing, but here we will discuss another way to solve it: dynamic distributed Jenkins clusters.
Let me walk you through the task we are performing:
1. Create a container image that has Linux and the other basic configuration required to run a Jenkins slave. (For example, here we need kubectl to be configured.)
2. When we launch a job, it should automatically start on a slave chosen by the label provided, giving us a dynamic approach.
3. Create a job chain of Job1 and Job2 using the Build Pipeline plugin in Jenkins.
4. Job1: Pull the GitHub repo automatically whenever a developer pushes to it, and perform the following operations:
- 1. Build a new image for the application dynamically, copying the application code into that Docker image.
- 2. Push that image to Docker Hub (a public repository). (The GitHub repo contains the application code and the Dockerfile used to create the new image.)
5. Job2 (should run on the dynamic Jenkins slave configured with the Kubernetes kubectl command): Launch the application on top of the Kubernetes cluster, performing the following operations:
- 1. If launching for the first time, create a deployment of the pod using the image built in the previous job. If the deployment already exists, do a rollout of the existing pods, giving zero downtime for the user.
- 2. If the application is being created for the first time, expose it. Otherwise, don't expose it again.
So let us start the task now.
Step 1: Creating a container image that has the kubectl command and some other basic configuration done.
I have also uploaded this image to Docker Hub.
We also have to make some changes in the kubectl config file.
We also copy in the YAML file that we will use to create the deployment in Kubernetes.
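As a sketch, a slave image like this can be built from a Dockerfile along the lines below. The base image, kubectl version, and the file names admin.conf and deploy.yml are assumptions; adjust them to your own setup.

```dockerfile
# Sketch of a Jenkins slave image with kubectl (file names are placeholders).
FROM centos:7

# Java, git, and SSH are needed so Jenkins can launch the slave agent
# and clone repositories on it.
RUN yum install -y java-11-openjdk git openssh-server

# Install kubectl (pin a version appropriate for your cluster).
RUN curl -Lo /usr/bin/kubectl \
      https://dl.k8s.io/release/v1.20.0/bin/linux/amd64/kubectl && \
    chmod +x /usr/bin/kubectl

# Copy the kubeconfig and the deployment YAML into the image.
COPY admin.conf /root/.kube/config
COPY deploy.yml /root/deploy.yml
```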
Step 2: We need to configure the Docker service on the host so that it accepts remote clients; Jenkins will then act as the client and use this host to launch the dynamic slaves. I used port 8888. To configure it, edit the file /usr/lib/systemd/system/docker.service as follows:
After making changes to this file we need to reload the daemon and restart Docker. Use the commands below; the export makes your Docker client talk to the remote host:

systemctl daemon-reload
systemctl restart docker
export DOCKER_HOST=<localhost ip_address>:8888
Step 3: Now we have to configure the cloud in Jenkins.
While doing this, we also have to provide the Docker account credentials in the registry authentication section.
With this, our environment is ready; now we have to create the Jenkins jobs.
Step 4: Building and pushing the image to Docker registry
I have put the Dockerfile in my GitHub repo, and we will create a Jenkins job that pulls the repo automatically whenever something changes in it. After pulling the repo, the job builds a new image from the Dockerfile in the repo and pushes it to the Docker registry.
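A minimal sketch of Job1's "Execute shell" build step is below. The image name shyamm/ci-webapp is a placeholder, and the $DOCKER indirection exists only so the script can be dry-run; a real job would call docker directly.

```shell
# Job1 build step (sketch): build the image from the repo's Dockerfile
# and push it to Docker Hub. DOCKER defaults to the real docker binary;
# override it (e.g. DOCKER=echo) for a dry run.
DOCKER="${DOCKER:-docker}"

build_and_push() {
    image="$1"              # e.g. shyamm/ci-webapp (placeholder)
    tag="${2:-latest}"      # Jenkins sets $BUILD_NUMBER for each run
    "$DOCKER" build -t "$image:$tag" . &&
    "$DOCKER" push "$image:$tag"
}

# In the Jenkins job this would be:
# build_and_push shyamm/ci-webapp "$BUILD_NUMBER"
```

Tagging with the Jenkins build number gives every push a unique, traceable tag instead of overwriting latest each time.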
Step 5: Now we just have to make the deployment.
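Job2's logic from the task list (create on the first launch, roll out otherwise, expose only once) can be sketched as the shell step below. The deployment name webapp and the file deploy.yml are placeholders; `kubectl rollout restart` is one way to trigger the zero-downtime update (it works when the deployment pulls a floating tag such as latest), and the $KUBECTL indirection exists only so the logic can be dry-run.

```shell
# Job2 build step (sketch): first-time deploy vs. zero-downtime rollout.
# KUBECTL defaults to the real kubectl; override it for a dry run.
KUBECTL="${KUBECTL:-kubectl}"

deploy_or_rollout() {
    name="$1"   # e.g. webapp (placeholder)
    yaml="$2"   # e.g. /root/deploy.yml (placeholder)
    if "$KUBECTL" get deployment "$name" >/dev/null 2>&1; then
        # Deployment exists: restart the pods for a rolling, zero-downtime update.
        "$KUBECTL" rollout restart deployment "$name"
    else
        # First launch: create the deployment and expose it once.
        "$KUBECTL" apply -f "$yaml"
        "$KUBECTL" expose deployment "$name" --port=80 --type=NodePort
    fi
}

# In the Jenkins job this would be:
# deploy_or_rollout webapp /root/deploy.yml
```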
Voila! The deployment is created.
We can also see the deployment from Windows.
I also created a pipeline view to see the live jobs running .
Let us see our webpage
In the future, if the load on the site increases, we can set up a load balancer, which Kubernetes provides as a built-in service type, and we can add more nodes to distribute the load across them. We can also deploy dynamic websites this way.
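For example, on a supported cluster, switching the service type is enough to get the built-in load balancing; the name and selector below are placeholders:

```yaml
# Hypothetical Service of type LoadBalancer (name/selector are placeholders).
apiVersion: v1
kind: Service
metadata:
  name: webapp-lb
spec:
  type: LoadBalancer   # the cloud provider provisions an external load balancer
  selector:
    app: webapp        # must match the deployment's pod labels
  ports:
    - port: 80
      targetPort: 80
```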
I have done this task with my friend Atharva Patil.
Big thanks to Vimal Daga Sir for this amazing knowledge .
Thank you !!!