CI/CD PIPELINE

A CI/CD pipeline is one of the best practices for DevOps teams to adopt: it lets them deliver code changes more frequently and reliably.

What is CI/CD?

Continuous integration is a coding philosophy and set of practices that drive development teams to implement small changes and check code in to version control repositories frequently. Because most modern applications require developing code on different platforms and with different tools, the team needs a mechanism to integrate and validate its changes.

The technical goal of CI is to establish a consistent and automated way to build, package, and test applications. With consistency in the integration process in place, teams are more likely to commit code changes more frequently, which leads to better collaboration and software quality.

Continuous delivery picks up where continuous integration ends. CD automates the delivery of applications to selected infrastructure environments. Most teams work with multiple environments other than production, such as development and testing environments, and CD ensures there is an automated way to push code changes to them.

CI/CD tools help store the environment-specific parameters that must be packaged with each delivery. CI/CD automation then performs any necessary service calls to web servers, databases, and other services that may need to be restarted or follow other procedures when applications are deployed.

If we choose to build a CI/CD pipeline on our own, we can run into a shortage of compute capacity. This problem can easily be solved with cloud computing, but here we will discuss another way to solve it: a dynamic, distributed Jenkins cluster.

Here is the task we are performing:

1. Create a container image that has Linux and the other basic configuration required to run it as a Jenkins slave (for example, here we need kubectl to be configured in it).

2. When we launch a job, Jenkins should automatically start it on a slave selected by the label provided, giving us a dynamic approach.

3. Create a job chain of Job1 and Job2 using the Build Pipeline plugin in Jenkins.

4. Job1: Pull the GitHub repo automatically when a developer pushes to GitHub, and perform the following operations:

  • Create a new image dynamically for the application and copy the application code into that Docker image.
  • Push that image to Docker Hub (a public repository). (The GitHub repo contains the application code and the Dockerfile used to create the new image.)

5. Job2 (should run on the dynamic Jenkins slave configured with the Kubernetes kubectl command): Launch the application on top of the Kubernetes cluster, performing the following operations:

  • If launching for the first time, create a deployment of the pod using the image created in the previous job. If the deployment already exists, do a rollout of the existing pods so there is zero downtime for the user.
  • If the application is being created for the first time, expose it. Otherwise, don't expose it again.

So let us start the task now.

Step 1: Creating a container image that has the kubectl command and some other basic configuration.

I have also uploaded this image to Docker Hub; see here.
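
For reference, a minimal sketch of the kind of Dockerfile behind such an image is shown below; the base image, package versions, and file names are assumptions rather than my exact setup.

# Jenkins slave image with SSH, Java and kubectl (illustrative sketch)
FROM centos:7

# Java and SSH are needed so Jenkins can launch the slave agent over SSH
RUN yum install -y java-1.8.0-openjdk openssh-server openssh-clients git && ssh-keygen -A

# Install kubectl (version pinned here only as an example)
RUN curl -LO https://dl.k8s.io/release/v1.20.0/bin/linux/amd64/kubectl && \
    chmod +x kubectl && mv kubectl /usr/local/bin/

# Copy the kubeconfig, client certificates and the deployment YAML into the image
COPY config /root/.kube/config
COPY client.crt client.key ca.crt /root/
COPY deployment.yml /root/

# Set a root password so Jenkins can log in over SSH (handle this more securely in practice)
RUN echo "redhat" | passwd --stdin root

EXPOSE 22
CMD ["/usr/sbin/sshd", "-D"]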

We also have to make some changes in the kubectl config file.
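
The config file points kubectl at the cluster's API server and at the client certificates copied into the image. A rough sketch, with placeholder addresses and file names, looks like this:

apiVersion: v1
kind: Config
clusters:
- name: mycluster
  cluster:
    server: https://192.168.99.100:8443      # API server of the cluster (placeholder address)
    certificate-authority: /root/ca.crt
users:
- name: myuser
  user:
    client-certificate: /root/client.crt
    client-key: /root/client.key
contexts:
- name: mycontext
  context:
    cluster: mycluster
    user: myuser
current-context: mycontext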

We also copy the YAML file that we will use to create the deployment in Kubernetes.
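
The actual file is the one in the repo; purely as an illustration, a deployment file for a simple web server could look like the sketch below (names, labels, and the image are placeholders):

apiVersion: apps/v1
kind: Deployment
metadata:
  name: myweb
spec:
  replicas: 1
  selector:
    matchLabels:
      app: myweb
  template:
    metadata:
      labels:
        app: myweb
    spec:
      containers:
      - name: myweb
        image: <dockerhub-user>/webserver:latest   # image built and pushed by Job1
        imagePullPolicy: Always                    # so a rollout re-pulls the newly pushed image
        ports:
        - containerPort: 80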

Step 2: We need to configure the Docker service on the host so that it can be used remotely over TCP (Jenkins will act as the Docker client and this host as the Docker server). I used port 8888. To configure this, edit the file /usr/lib/systemd/system/docker.service as follows:
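
The change amounts to adding a TCP listener to the ExecStart line of that unit file, along these lines (any other flags already on the line stay as they are):

# keep any other options that were already on the ExecStart line
ExecStart=/usr/bin/dockerd -H fd:// -H tcp://0.0.0.0:8888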

After changing this file we need to reload the systemd daemon and restart Docker. Use the commands below:

systemctl daemon-reload

systemctl restart docker

export DOCKER_HOST=<localhost ip_address>:8888  
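
With DOCKER_HOST exported (replace the placeholder with the actual IP of the Docker host), a quick way to confirm that the daemon answers remotely is:

docker info     # should print details of the remote daemon, not a connection error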

Step 3: Now we have to configure the cloud in Jenkins.

While doing this, we also have to provide the Docker Hub account credentials in the registry authentication part.
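
In this article the cloud is configured through the Jenkins web UI (the Cloud section under Manage Jenkins, with the Docker plugin installed). Purely for reference, roughly the same settings expressed in Jenkins Configuration as Code syntax would look something like the sketch below; the names, label, and image are assumptions, and the registry credentials are the ones added in the UI as described above.

jenkins:
  clouds:
    - docker:
        name: "docker-cloud"
        dockerApi:
          dockerHost:
            uri: "tcp://<docker-host-ip>:8888"   # the daemon exposed in Step 2
        templates:
          - labelString: "kubectl-slave"         # label the jobs will request
            dockerTemplateBase:
              image: "<dockerhub-user>/jenkins-slave-kubectl:latest"
            remoteFs: "/root"
            connector:
              attach:                            # or an SSH connector, since the image runs sshd
                user: "root"
            instanceCapStr: "2"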

So here our environment is ready; now we have to create the Jenkins jobs.

Step 4: Building and pushing the image to the Docker registry

I have put the Dockerfile in my GitHub repo, and we will create a Jenkins job that downloads the repo automatically whenever something changes in it. After downloading the code, the job builds a new image using the Dockerfile present in the repo and pushes it to the Docker registry.
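
The build step of Job1 reduces to a couple of shell commands run in the job workspace after the SCM checkout. A sketch, with a placeholder Docker Hub user name and assuming docker login has already been done on the build node:

# build the image from the Dockerfile in the checked-out repo and push it to Docker Hub
# (tagging with ${BUILD_NUMBER} instead of latest also works and makes each rollout explicit)
docker build -t <dockerhub-user>/webserver:latest .
docker push <dockerhub-user>/webserver:latest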

Step 5: Now we just have to create the deployment.
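
Job2 runs on the dynamic slave (which already has kubectl configured) and implements the logic described earlier: create the deployment and expose it the first time, otherwise just roll out the new image. A sketch of its shell build step, assuming the deployment is named myweb and the YAML file copied into the slave image is /root/deployment.yml (both names are placeholders):

if kubectl get deployment myweb > /dev/null 2>&1
then
    # deployment already exists: trigger a rolling update with zero downtime
    # (with a versioned tag, kubectl set image deployment/myweb myweb=<image>:<tag> does the same job)
    kubectl rollout restart deployment/myweb
    kubectl rollout status deployment/myweb
else
    # first launch: create the deployment and expose it
    kubectl apply -f /root/deployment.yml
    kubectl expose deployment myweb --type=NodePort --port=80
fi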

Voila! The deployment is created.

We can also see the deployment from Windows.
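
Assuming kubectl on the Windows machine is configured against the same cluster, the resources can be checked with:

kubectl get deployments
kubectl get pods -o wide
kubectl get svc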

I also created a pipeline view to see the jobs running live.

Let us see our webpage.
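
Because the service is exposed as a NodePort, the page can also be fetched from the command line; the service name and addresses below are placeholders:

kubectl get svc myweb                 # note the assigned NodePort, e.g. 80:3XXXX/TCP
curl http://<node-ip>:<nodeport>      # should return the page served by the pod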

In the future, if the load on the site increases, we can use the load balancing built into the Kubernetes Service, and we can also add more nodes to distribute the load across them. We can also deploy dynamic websites the same way.

I have done this task with my friend Atharva Patil.

Big thanks to Vimal Daga Sir for this amazing knowledge.

Thank you!!!
