JENKINS ON DOCKER

In this article we will see how to build our own Jenkins Docker image so that Jenkins comes up with a single command. We will also see how to automatically launch a container with the interpreter suited to the language of the code.

First create the Dockerfile.

To create the Dockerfile, first make a new directory using the mkdir command and move into it.

Open the editor using the gedit Dockerfile command and type in the Dockerfile contents given below.
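For example (the directory name here is just a placeholder, not from the original article):

 mkdir jenkins-docker     # new build-context directory
 cd jenkins-docker
 gedit Dockerfile         # create and edit the Dockerfile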

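The original screenshot of the Dockerfile is not available, so here is a minimal sketch of what such a Dockerfile might look like, assuming a CentOS base image and the stable Jenkins RPM repository (the base image, packages and repository URL in the original article may differ):

 FROM centos:7
 # Install Java and add the Jenkins package repository
 RUN yum install -y wget java-11-openjdk && \
     wget -O /etc/yum.repos.d/jenkins.repo https://pkg.jenkins.io/redhat-stable/jenkins.repo && \
     rpm --import https://pkg.jenkins.io/redhat-stable/jenkins.io.key && \
     yum install -y jenkins sudo
 # Jenkins listens on 8080 inside the container
 EXPOSE 8080
 # Run the Jenkins WAR shipped with the RPM in the foreground
 CMD ["java", "-jar", "/usr/lib/jenkins/jenkins.war"]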

After this save the file and type the command:

 docker build -t jenkinos:v1 .

This builds the image (note the trailing dot, which tells Docker to use the current directory as the build context). Now launch a container from it:

 docker run -it --name c4 --privileged -p 1235:8080 -v /:/host jenkinos:v1

Here --privileged runs the container with extended privileges, -p 1235:8080 publishes Jenkins's port 8080 on host port 1235, and -v /:/host mounts the host's root filesystem at /host inside the container so that jobs can copy files to the host later. After launching the container successfully, the Jenkins startup logs appear in the terminal.


After this, open the browser and enter the Docker host's IP address with the published port (1235 in the docker run command above).


After this, follow the normal steps to set up Jenkins: unlock it with the initial admin password, install the suggested plugins and create the first admin user.
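If the password scrolls past in the logs, it can also be read from the running container, assuming the default Jenkins home of the RPM install (/var/lib/jenkins):

 docker exec c4 cat /var/lib/jenkins/secrets/initialAdminPassword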

Next, I have used a master-slave setup to run the jobs.

Go to Manage Jenkins → Manage Nodes, click on New Node and do the following configuration.

Host will be our slave system's IP, and remember the label, because all the jobs will be restricted to run on that label.


In this node configuration, the Host IP has to be changed to your slave's IP and the SSH credentials have to be added accordingly.


After this, click on Save. After some time, we can see the node come online.


Once the slave is set up and online, go to the Jenkins home page and create job1.

job1 will pull the code from the Git repository, copy it to the folder which is mounted into the Docker container, check the extension of the files and, according to that, launch the matching Docker image (a sketch of this build step is shown below).


Here, I have used port 9993.
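A minimal sketch of what job1's shell build step could look like, assuming the job's workspace holds the cloned repository, /host/code is the mounted folder, and hypothetical image and container names (none of these details are confirmed by the original article):

 # Copy the cloned repository to the folder mounted from the host
 sudo cp -rvf * /host/code/
 # Decide which interpreter/image to launch from the file extension
 if ls /host/code | grep -q '\.html$'; then
     # Web code: serve it with an httpd container on port 9993
     sudo docker run -dit --name webserver -p 9993:80 \
         -v /host/code:/usr/local/apache2/htdocs httpd
 elif ls /host/code | grep -q '\.py$'; then
     # Python code: launch a container that has a Python interpreter
     sudo docker run -dit --name pyenv -v /host/code:/code python:3
 fi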

job2 will check whether the site is working or not (a sketch is shown below).

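A minimal sketch of job2's build step, assuming the site from job1 is served on port 9993 of the Docker host (the IP below is a placeholder):

 # Exit non-zero (build failure) when the site does not respond
 if curl -fs http://192.168.1.10:9993/ > /dev/null; then
     echo "Site is up"
 else
     echo "Site is down"
     exit 1
 fi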

job2 will trigger job3 irrespective of the build result (in the post-build actions, choose to trigger even if the build fails).

In job3, if the container is running as expected, it will do nothing; but if the container is not running as required, it will send an email to the developer and trigger job1 again to relaunch the container (a sketch is shown below).

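A minimal sketch of job3's build step, assuming the container launched by job1 is named webserver (a hypothetical name) and that mail.py sends the alert email; failing the build here lets a post-build action retrigger job1:

 # Check whether the container from job1 is still running
 if sudo docker ps --format '{{.Names}}' | grep -qx webserver; then
     echo "Container is running, nothing to do"
 else
     # Notify the developer, then fail so job1 is triggered again
     sudo python3 mail.py
     exit 1
 fi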

The mail.py code can be found in my GitHub repository attached below.

Conclusion

We have set up Jenkins with the help of a Docker image and configured a master-slave system to launch the Docker image that matches the language interpreter. If the container fails to start, a mail is sent to the developer and the pipeline triggers job1 to relaunch the container by itself.
