AUTOMATING AWS USING TERRAFORM

AWS is a powerful cloud platform, and automating it with Terraform makes it even more powerful.

I have built a complete infrastructure that automatically launches a website on the AWS cloud using Terraform, a tool for building, changing, and versioning infrastructure safely and efficiently.

In order to use AWS services in Terraform, we first need to download the associated provider plugins, which is done with the terraform init command. The configuration file declares the resources as follows.
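As a rough sketch (the region and profile names here are assumptions, not taken from the original code), the provider configuration might look like:

```hcl
# Configure the AWS provider; terraform init downloads this plugin.
# Region and profile are illustrative values.
provider "aws" {
  region  = "ap-south-1"
  profile = "default"
}
```

Running terraform init in this folder downloads the AWS provider plugin into a local .terraform directory.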

Now we can start creating our infrastructure.

1. Create the key - In order to log in to our instance we need to authenticate, so we create a key pair and save the private key on our system in .pem format for future use.
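A minimal sketch of this step (resource and file names are illustrative) generates an RSA key locally, registers the public half with AWS, and writes the private half to a .pem file:

```hcl
# Generate an RSA key pair locally.
resource "tls_private_key" "web_key" {
  algorithm = "RSA"
  rsa_bits  = 4096
}

# Register the public key with AWS as an EC2 key pair.
resource "aws_key_pair" "web_key" {
  key_name   = "web-key"
  public_key = tls_private_key.web_key.public_key_openssh
}

# Save the private key as a .pem file in the working folder.
resource "local_file" "web_key_pem" {
  content         = tls_private_key.web_key.private_key_pem
  filename        = "web-key.pem"
  file_permission = "0400"
}
```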

The key has been successfully created and downloaded into the folder where the code is running.

2. Create a security group that allows port 80 for HTTP and port 22 for SSH.
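This step can be sketched as follows (the group name and open-to-the-world CIDR ranges are assumptions typical of a demo setup):

```hcl
# Security group allowing inbound HTTP (80) and SSH (22) from anywhere.
resource "aws_security_group" "web_sg" {
  name        = "web-sg"
  description = "Allow HTTP and SSH"

  ingress {
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  ingress {
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  # Allow all outbound traffic so the instance can download packages.
  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}
```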

3. Launch an EC2 instance - Now we launch an EC2 instance with the key pair and security group created above.
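A sketch of the instance resource, assuming an Amazon Linux 2 AMI in the default VPC (the AMI ID and instance type are illustrative, not from the original code):

```hcl
# Launch an EC2 instance with the key pair and security group above.
# AMI ID (Amazon Linux 2) and instance type are assumptions.
resource "aws_instance" "web" {
  ami             = "ami-0447a12f28fddb066"
  instance_type   = "t2.micro"
  key_name        = aws_key_pair.web_key.key_name
  security_groups = [aws_security_group.web_sg.name]

  tags = {
    Name = "web-server"
  }
}
```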

The instance is running successfully.

Next we use a null resource to connect to the launched instance, and a remote-exec provisioner to install the software required for hosting our website, namely httpd and git.
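This step can be sketched as below; the SSH user ec2-user is the default for Amazon Linux, and the exact inline commands are assumptions consistent with the description:

```hcl
# Connect over SSH and install httpd and git with a remote-exec provisioner.
resource "null_resource" "install_software" {
  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.web_key.private_key_pem
    host        = aws_instance.web.public_ip
  }

  provisioner "remote-exec" {
    inline = [
      "sudo yum install -y httpd git",
      "sudo systemctl enable --now httpd",
    ]
  }
}
```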

Terraform downloads and installs the software.

Let's check whether the software was successfully installed on the instance, using systemctl status httpd and git --version.

We also save the IP address of the launched instance in a file and print it using an output block.
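A minimal sketch of both pieces (the output and file names are illustrative):

```hcl
# Print the public IP after apply.
output "instance_public_ip" {
  value = aws_instance.web.public_ip
}

# Also record the IP in a local file for later use.
resource "local_file" "ip_record" {
  content  = aws_instance.web.public_ip
  filename = "public-ip.txt"
}
```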

4. Launch an EBS volume and mount it on the /var/www/html folder inside the remote OS. Then clone the code from the GitHub repository into the /var/www/html folder.
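The step above can be sketched as follows. The device name, volume size, and repository URL placeholder are assumptions; note that a volume attached as /dev/sdh typically appears as /dev/xvdh inside Amazon Linux:

```hcl
# Create a 1 GiB EBS volume in the same AZ as the instance.
resource "aws_ebs_volume" "web_data" {
  availability_zone = aws_instance.web.availability_zone
  size              = 1
}

# Attach the volume to the instance.
resource "aws_volume_attachment" "web_data" {
  device_name  = "/dev/sdh"
  volume_id    = aws_ebs_volume.web_data.id
  instance_id  = aws_instance.web.id
  force_detach = true
}

# Format, mount, and clone the website code into /var/www/html.
resource "null_resource" "mount_and_clone" {
  depends_on = [aws_volume_attachment.web_data]

  connection {
    type        = "ssh"
    user        = "ec2-user"
    private_key = tls_private_key.web_key.private_key_pem
    host        = aws_instance.web.public_ip
  }

  provisioner "remote-exec" {
    inline = [
      "sudo mkfs.ext4 /dev/xvdh",
      "sudo mount /dev/xvdh /var/www/html",
      "sudo rm -rf /var/www/html/*",
      # Repository URL is a placeholder for the actual repo.
      "sudo git clone https://github.com/<user>/<repo>.git /var/www/html",
    ]
  }
}
```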

That the EBS volume is successfully attached can be verified with the fdisk -l command on Linux.

The contents of the GitHub repository have also been copied successfully to the /var/www/html folder, as shown by running the ls command inside it.

The folder contains the same files that were uploaded to the GitHub repo.

5. Create an S3 bucket, copy/deploy the images into it, and change their permission to public-readable.
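A sketch of this step, assuming an older AWS provider where ACLs are set directly on the bucket and object (the bucket name and image file name are illustrative):

```hcl
# Publicly readable bucket for website images.
resource "aws_s3_bucket" "image_bucket" {
  bucket = "my-website-image-bucket"
  acl    = "public-read"
}

# Upload an image and make it publicly readable.
resource "aws_s3_bucket_object" "site_image" {
  bucket = aws_s3_bucket.image_bucket.id
  key    = "site-image.png"
  source = "site-image.png"
  acl    = "public-read"
}
```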

The image is uploaded to the S3 bucket and has public access.

6. Create a CloudFront distribution using the S3 bucket (which contains the images) and use the CloudFront URL in the HTML code of the website.
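The distribution can be sketched as below, assuming an older AWS provider that uses the forwarded_values block; the origin ID is illustrative:

```hcl
# CloudFront distribution serving images from the S3 bucket.
resource "aws_cloudfront_distribution" "image_cdn" {
  enabled = true

  origin {
    domain_name = aws_s3_bucket.image_bucket.bucket_regional_domain_name
    origin_id   = "s3-image-origin"
  }

  default_cache_behavior {
    allowed_methods        = ["GET", "HEAD"]
    cached_methods         = ["GET", "HEAD"]
    target_origin_id       = "s3-image-origin"
    viewer_protocol_policy = "allow-all"

    forwarded_values {
      query_string = false
      cookies {
        forward = "none"
      }
    }
  }

  restrictions {
    geo_restriction {
      restriction_type = "none"
    }
  }

  viewer_certificate {
    cloudfront_default_certificate = true
  }
}
```

The distribution's domain_name attribute gives the CloudFront URL to place in the website's HTML.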

Then we use a local-exec provisioner to open our website in Chrome using the IP address of the launched instance.
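A minimal sketch of this step; the command assumes a Windows workstation where "start chrome" opens the browser, and would differ on other systems:

```hcl
# Open the deployed site in Chrome once provisioning is complete.
resource "null_resource" "open_site" {
  depends_on = [null_resource.install_software]

  provisioner "local-exec" {
    # Assumes a Windows host; adjust the command for other systems.
    command = "start chrome http://${aws_instance.web.public_ip}"
  }
}
```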

After Terraform runs all these steps successfully, it displays the IP address.

And launches the website in Chrome.

Get the entire code on GitHub.


