Task 1 - Hybrid Multi-Cloud

This task was given by Mr. Vimal Daga Sir. In this task, I create multiple AWS resources like EC2, EBS, S3, CloudFront, a private key pair, and a security group with the help of Terraform code.

Prerequisites to create and run AWS services with Terraform code:

1. An AWS account.

2. An operating system (OS).

3. The AWS CLI configured in the OS.

4. Terraform downloaded and configured.
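As a quick sanity check (not part of the original task), a short shell sketch can report whether the two command-line tools from the prerequisites are on the PATH before you start:

```shell
# Check whether the AWS CLI and Terraform are installed
# (only reports status, does not install anything)
for tool in aws terraform; do
  if command -v "$tool" > /dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: NOT found - install and configure it first"
  fi
done
```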

Now let's see how to write Terraform code that uses these AWS services.

Step 1: Create a private key pair in a specific AWS account, as shown in the code below.

// Specific AWS account
provider "aws" {
  region  = "ap-south-1"
  profile = "Priyanshu"
}

// Create a private key pair
resource "tls_private_key" "task1_key" {
  algorithm = "RSA"
}

module "key_pair" {
  source     = "terraform-aws-modules/key-pair/aws"
  key_name   = "task1"
  public_key = tls_private_key.task1_key.public_key_openssh
}

Step 2: Create a security group with Terraform code.

// Create a security group allowing HTTP and SSH in, and all traffic out
resource "aws_security_group" "task1_security_group" {
  name        = "task1_work"
  description = "Project webserver"
  vpc_id      = "vpc-48e1fd20"

  ingress {
    description = "HTTP from anywhere"
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  ingress {
    description = "SSH from anywhere"
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }

  tags = {
    Name = "task1_security_group"
  }
}

Step 3: Launch an AWS EC2 instance and install software such as httpd, PHP, and git with Terraform code.

resource "aws_instance" "task1_Priyanshu" {
  ami             = "ami-005956c5f0f757d37"
  instance_type   = "t2.micro"
  key_name        = module.key_pair.this_key_pair_key_name
  security_groups = ["task1_work"]

  // use self here: referencing aws_instance.task1_Priyanshu inside its
  // own resource block would create a dependency cycle
  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.task1_key.private_key_pem
    host        = self.public_ip
  }

  provisioner "remote-exec" {
    inline = [
      "sudo yum install httpd php git -y",
      "sudo service httpd start",
      "sudo chkconfig httpd on",
    ]
  }

  tags = {
    Name = "task1_Os1"
  }
}

Step 4: Create an EBS volume with Terraform code.

resource "aws_ebs_volume" "task1_volume" {
  availability_zone = aws_instance.task1_Priyanshu.availability_zone
  size              = 1

  tags = {
    Name = "task1-ebs_volume"
  }
}

Step 5: Attach the EBS volume to the EC2 instance, format it, and mount it on /var/www/html; then clone the site code from GitHub into /var/www/html, all with Terraform code. Mount the volume before downloading the code: anything placed in /var/www/html before the mount would be hidden once the volume is mounted there.

// Attach the EBS volume to the instance, then format and mount it
resource "aws_volume_attachment" "task1_ebs_att" {
  depends_on = [aws_ebs_volume.task1_volume, aws_instance.task1_Priyanshu]

  device_name  = "/dev/sdh"
  volume_id    = aws_ebs_volume.task1_volume.id
  instance_id  = aws_instance.task1_Priyanshu.id
  force_detach = true

  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.task1_key.private_key_pem
    host        = aws_instance.task1_Priyanshu.public_ip
  }

  provisioner "remote-exec" {
    inline = [
      "sudo mkfs.ext4 /dev/xvdh",
      "sudo mount /dev/xvdh /var/www/html",
      "sudo rm -rf /var/www/html/*",
      "sudo git clone https://github.com/Priyanshu38279/task1_hybdrid_multi_cloud.git /var/www/html/",
    ]
  }
}

Step 6: Create an S3 bucket with the help of Terraform code.

// Create an S3 bucket
resource "aws_s3_bucket" "task1cloudbucket1" {
  bucket = "task1bucket1"
  acl    = "public-read"

  tags = {
    Name = "my-bucket"
  }
}

Step 7: Upload the data to the S3 bucket with Terraform code.

// Upload an object into the S3 bucket
resource "aws_s3_bucket_object" "task1" {
  bucket = aws_s3_bucket.task1cloudbucket1.bucket
  key    = "a.png"
  source = "C://Users//Priyanshu//Desktop/terraform.jpg"
  acl    = "public-read"
}
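Once uploaded with a public-read ACL, the object is reachable at a predictable virtual-hosted S3 URL built from the bucket name and key in the code above and the region from the provider block. A small sketch of how that URL is assembled:

```shell
# Build the public URL of the uploaded S3 object
bucket="task1bucket1"   # from the aws_s3_bucket resource
key="a.png"             # from the aws_s3_bucket_object resource
region="ap-south-1"     # from the provider block
echo "https://${bucket}.s3.${region}.amazonaws.com/${key}"
# prints https://task1bucket1.s3.ap-south-1.amazonaws.com/a.png
```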

Step 8: Create a CloudFront distribution with the S3 bucket as its origin, to serve the content through a Content Delivery Network (CDN), with the help of Terraform code.

// Create the AWS CloudFront distribution (this resource block continues
// into step 9, where its closing brace appears)
resource "aws_cloudfront_distribution" "s3_task_distribution" {
  origin {
    domain_name = aws_s3_bucket.task1cloudbucket1.bucket_regional_domain_name
    origin_id   = aws_s3_bucket.task1cloudbucket1.id
  }

  enabled         = true
  is_ipv6_enabled = true
  comment         = "mytaskcloudfront"

  default_cache_behavior {
    allowed_methods  = ["DELETE", "GET", "HEAD", "OPTIONS", "PATCH", "POST", "PUT"]
    cached_methods   = ["GET", "HEAD"]
    target_origin_id = aws_s3_bucket.task1cloudbucket1.id

    forwarded_values {
      query_string = false

      cookies {
        forward = "none"
      }
    }

    viewer_protocol_policy = "allow-all"
  }

  price_class = "PriceClass_200"

  restrictions {
    geo_restriction {
      restriction_type = "whitelist"
      locations        = ["US", "CA", "IN"]
    }
  }

  viewer_certificate {
    cloudfront_default_certificate = true
  }

Step 9: Automatically place the CloudFront DNS name into the site code. The connection and provisioner below sit inside the aws_cloudfront_distribution resource, so ${self.domain_name} resolves to the distribution's domain name once it is available.

  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.task1_key.private_key_pem
    host        = aws_instance.task1_Priyanshu.public_ip
  }

  provisioner "remote-exec" {
    inline = [
      "sudo su << EOF",
      "echo \"<img src='https://${self.domain_name}/${aws_s3_bucket_object.task1.key}'>\" >> /var/www/html/index.html",
      "EOF",
    ]
  }
}
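The heredoc trick the provisioner runs on the instance can be tried locally. This sketch appends the same img tag to a temporary file instead of /var/www/html/index.html, using a hypothetical CloudFront domain in place of ${self.domain_name}:

```shell
# Append an <img> tag via a heredoc, as the remote-exec provisioner does
domain="d111111abcdef8.cloudfront.net"   # hypothetical CloudFront domain
key="a.png"                              # the S3 object key from step 7
page=$(mktemp)                           # stand-in for /var/www/html/index.html

cat << EOF >> "$page"
<img src='https://${domain}/${key}'>
EOF

cat "$page"
# prints <img src='https://d111111abcdef8.cloudfront.net/a.png'>
```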

Step 10: Now the site, served through the CloudFront URL, can be opened in Google Chrome as soon as the Terraform code finishes.


Finally, run the code. With a single Terraform command you can create or destroy everything. First, Terraform downloads the plugins for whichever provider services you use in the code, such as AWS or OpenStack.

terraform init // downloads the provider plugins

terraform apply -auto-approve // runs the Terraform code; behind the scenes, the provider's services are created

terraform destroy // destroys every service created by the Terraform code in a single go

Thank you.

If you have any queries please contact me.
