Project #4 - Automating EC2, S3, RDS, and VPC with AWS Lambda

In this project, we will use AWS Lambda to automate various tasks within EC2, S3, RDS, and VPC. Automating these tasks with Lambda improves efficiency and scalability while reducing errors and cutting costs. We will utilize the following AWS services:

  • AWS Lambda
  • Amazon EventBridge
  • Amazon RDS
  • Amazon VPC
  • Amazon EC2
  • Amazon S3
  • AWS CloudShell

Let's dive in!

Part 1 - Automating EC2 Backups with Lambda

In this section, we will create a Lambda function that takes a snapshot of our EC2 instance at regular intervals.

  1. First, we will write out our script for automating the EC2 snapshots from Lambda (a minimal sketch of the idea follows this list). Here is a link to the script I wrote with comments to describe each section of the code: https://github.com/zbliss5000/LambdaAutomation/blob/main/ec2lambdasnapshot.py
  2. Next, we will need to give our Lambda function an IAM role with access to EC2 so it can create snapshots. I created a role with the AmazonEC2FullAccess policy and Lambda as the use case, then attached this role to the function under the “Permissions” tab in Lambda.
  3. Now it is time to test the function using the testing environment in Lambda. The first test failed because the timeout was set to only 3 seconds, and snapshots take slightly longer to create. Once we increased the function's timeout to 20 seconds, the test succeeded.
  4. Finally, we will create our scheduled event in EventBridge. We will create a new schedule, set it as a recurring schedule with a 24-hour rate expression, select the Lambda function we created in step 1, and then create the schedule. After a few minutes, I went back to check the schedule and confirmed it had run successfully.
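
For reference, here is a minimal sketch of what a snapshot function like this can look like. It is not the exact contents of the linked script: the environment variable, volume-selection logic, and description text are illustrative assumptions.

```python
import os
import boto3

# Illustrative assumption: the target instance ID is supplied via an
# environment variable configured on the Lambda function.
INSTANCE_ID = os.environ["INSTANCE_ID"]

ec2 = boto3.client("ec2")

def lambda_handler(event, context):
    # Find all EBS volumes attached to the instance.
    volumes = ec2.describe_volumes(
        Filters=[{"Name": "attachment.instance-id", "Values": [INSTANCE_ID]}]
    )["Volumes"]

    snapshot_ids = []
    for volume in volumes:
        # Create a snapshot of each attached volume.
        snapshot = ec2.create_snapshot(
            VolumeId=volume["VolumeId"],
            Description=f"Automated snapshot of {volume['VolumeId']} "
                        f"from instance {INSTANCE_ID}",
        )
        snapshot_ids.append(snapshot["SnapshotId"])

    return {"snapshots": snapshot_ids}
```

For the schedule in step 4, a 24-hour recurrence corresponds to the rate expression rate(1 day) in EventBridge.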

Part 2 - Automating Data Validation for S3 with Lambda

In this section, we will create a Lambda function that validates files uploaded to an S3 bucket. If any discrepancies are found, the function automatically moves the file to an error bucket.

  1. First, we will write out our script for our Lambda function that will automate the data validation process (a minimal sketch follows this list). Here is a link to the script I wrote with comments to describe each section of the code: https://github.com/zbliss5000/LambdaAutomation/blob/main/s3datavalidationfunction.py
  2. Next, I will create two S3 buckets: one for uploading the initial files into, and another for holding the files removed due to errors.
  3. After the buckets are created, I will go into my Lambda function, set the trigger for the function to our initial S3 bucket, and grant the function permission to be invoked by that bucket.
  4. Then we will make sure our Lambda function has the proper permissions, so I will create a role with full access to S3 and attach it to our Lambda function.
  5. Finally, we will run a test to confirm the function is working properly. We uploaded a test file containing an error and confirmed that it was moved over to the error bucket.
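
To make the flow concrete, here is a minimal sketch of this kind of validation function. The validation rule and the error-bucket name below are placeholder assumptions; the linked script defines the real checks.

```python
import urllib.parse
import boto3

s3 = boto3.client("s3")

# Illustrative name; the error bucket in the actual walkthrough may differ.
ERROR_BUCKET = "my-error-bucket"

def is_valid(body: bytes) -> bool:
    # Placeholder rule: require a non-empty file whose first line looks
    # like a CSV header. The linked script defines the real validation.
    lines = body.decode("utf-8", errors="replace").splitlines()
    return bool(lines) and "," in lines[0]

def lambda_handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # Object keys in S3 event notifications are URL-encoded.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        if not is_valid(body):
            # Copy the bad file to the error bucket, then delete the original.
            s3.copy_object(
                Bucket=ERROR_BUCKET,
                Key=key,
                CopySource={"Bucket": bucket, "Key": key},
            )
            s3.delete_object(Bucket=bucket, Key=key)
```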

Here is a video recording of the walkthrough for Parts 1 and 2: https://www.loom.com/share/d47175f2fe854f08b34bd8f53e293a8f

Part 3 - Automating Data Conversion for RDS with Lambda

In this section, we will create a Lambda function that is triggered whenever a new billing CSV file is uploaded to an S3 bucket. The function will read the file, convert any billing amount not in USD into USD, and then insert the records into an Aurora Serverless v1 database.

  1. First, we will write out our script for our Lambda function that will automate the data conversion process (a minimal sketch follows this list). Here is a link to the script I wrote with comments to describe each section of the code: https://github.com/zbliss5000/LambdaAutomation/blob/main/rdsproject.py
  2. Next, we will need to create an Aurora Serverless database cluster and an S3 bucket to use for this phase.
  3. After these are both created, we will head over to our Lambda function. We need to add permissions for Amazon S3, Amazon RDS, and AWS Secrets Manager so that the function can read the file from the S3 bucket, authenticate to the Aurora database using the secret from Secrets Manager, and then insert the records from the CSV file into our Aurora table.
  4. Then we will go into our Aurora database and create a table named “billing_details” to store the data from our CSV.
  5. To test our function, we will upload a CSV file we created with some fictional billing data and confirm that it has been loaded into our Aurora database table. In the query editor, we will connect to our database and run “SELECT * FROM billing_details;”, which shows that our data has been successfully inserted into the table.
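
Here is a minimal sketch of how a function like this can talk to Aurora Serverless v1 through the RDS Data API, which avoids bundling a database driver in the Lambda package. The environment variables, CSV column names, and hard-coded exchange rates are illustrative assumptions; the linked script contains the actual conversion logic.

```python
import csv
import io
import os
import boto3

s3 = boto3.client("s3")
rds_data = boto3.client("rds-data")

# Illustrative configuration: the real cluster ARN, secret ARN, and
# database name would come from the function's environment.
CLUSTER_ARN = os.environ["CLUSTER_ARN"]
SECRET_ARN = os.environ["SECRET_ARN"]
DATABASE = os.environ.get("DATABASE", "billing")

# Placeholder conversion rates; a production function would look these up.
USD_RATES = {"USD": 1.0, "EUR": 1.08, "GBP": 1.27}

def lambda_handler(event, context):
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    for row in csv.DictReader(io.StringIO(body)):
        # Convert the amount to USD using the placeholder rate table.
        amount_usd = float(row["amount"]) * USD_RATES[row["currency"]]

        # Insert one row via the RDS Data API (no DB driver needed in Lambda).
        rds_data.execute_statement(
            resourceArn=CLUSTER_ARN,
            secretArn=SECRET_ARN,
            database=DATABASE,
            sql="INSERT INTO billing_details (customer, amount_usd) "
                "VALUES (:customer, :amount)",
            parameters=[
                {"name": "customer", "value": {"stringValue": row["customer"]}},
                {"name": "amount", "value": {"doubleValue": amount_usd}},
            ],
        )
```

Note that the Data API must be enabled on the Aurora Serverless cluster for execute_statement calls like this to work.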

Part 4 - Automating VPC Operations with Lambda

In this section, we will create a Lambda function that checks for unassociated Elastic IP addresses in our VPC and releases them to save costs. The function will be triggered by EventBridge.

  1. Our first step will be to launch an EC2 instance (all default settings are fine) and then allocate 3 Elastic IPs from the EC2 dashboard. We will be using these later in this section.
  2. Then we will write our script for our function (a minimal sketch follows this list). Here is a link to the script I wrote with comments to describe each section of the code: https://github.com/zbliss5000/LambdaAutomation/blob/main/manageeips.py
  3. Once we have our Elastic IPs, we will select one at random and associate it with our EC2 instance.
  4. After we associate one of the EIPs with our instance, we will head over to our Lambda function and add permissions for full EC2 access so the function is able to release the unassociated EIPs.
  5. Next, we will open the configuration for our function and add a trigger set to “EventBridge” with a schedule expression of rate(1 day), so that our function runs daily.
  6. Now we will test our function. It will run on the EventBridge schedule, but we can also invoke it directly using the “Test” option in the Lambda console. After running the test, we go over to the EC2 console and see that the unassociated EIPs have in fact been released.
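
Here is a minimal sketch of the cleanup logic, assuming we want to release every Elastic IP in the region that has no association:

```python
import boto3

ec2 = boto3.client("ec2")

def lambda_handler(event, context):
    released = []
    # describe_addresses returns every Elastic IP in the region.
    for address in ec2.describe_addresses()["Addresses"]:
        # An EIP with no AssociationId is not attached to anything,
        # so it is incurring charges for no benefit.
        if "AssociationId" not in address:
            ec2.release_address(AllocationId=address["AllocationId"])
            released.append(address["PublicIp"])

    return {"released": released}
```

This is what makes the cleanup worthwhile: AWS bills for Elastic IPs that are allocated but not associated with a running resource, so releasing them daily keeps that charge from accumulating.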

Here is a video recording of the walkthrough for Parts 3 and 4:

https://www.loom.com/share/63a1defd15e74a8eacd5d9b744e34eb2

And that is it! Our project on automating various AWS services with Lambda is complete. Here is a link to the personal website I created on AWS, where you can see all of the projects I am currently working on:

https://www.zackawslabs.com/
