AWS CI/CD Project
Umer Asghar
Certified Linux Sys Admin | AWS Certified Solutions Architect | HashiCorp Terraform Associate | Azure DevOps | System Engineer, Cloud Apps
In this project, we will integrate various AWS services to create a complete CI/CD pipeline, leveraging AWS's power and flexibility. Uniquely, this project relies solely on AWS managed services, eliminating the need for traditional tools like Git, Jenkins, or EC2 instances. Here's how we achieve this:
We store our source code in AWS CodeCommit, replacing GitHub, and use AWS CodeBuild instead of Jenkins to compile code, run tests, and generate deployable artifacts. These artifacts are securely stored in S3. AWS CodeDeploy then automates the deployment to Elastic Beanstalk, where our Tomcat application runs, connected to an RDS MySQL database. AWS CodePipeline orchestrates the entire process, ensuring that any code change triggers a fully automated build, test, and deployment cycle, keeping the application running smoothly and scaling cleanly.
Setting Up the AWS Environment
AWS Elastic Beanstalk
AWS Elastic Beanstalk is a fully managed service that makes it easy to deploy, manage, and scale applications and services. It supports various programming languages and platforms, such as Java, .NET, PHP, Node.js, Python, Ruby, Go, and Docker. Elastic Beanstalk handles the provisioning of infrastructure, load balancing, scaling, and monitoring, allowing developers to focus on writing code. Simply upload your code, and Elastic Beanstalk automatically manages the deployment, from capacity provisioning to health monitoring.
Let's create one. The settings I selected reflect my requirements; adjust them to suit your own needs.
In this step, we created an Elastic Beanstalk application named "vprofile" and set up a web server environment named "vprofileprod" running on the Tomcat platform. Key configurations included deploying the sample application, setting up an SSH key pair (vpro-bean-key), and choosing the default VPC with public IPs for the EC2 instances. We configured auto scaling with a minimum of two instances and a maximum of eight, opted for the t3.micro instance type for cost efficiency, and chose an application load balancer. We left environment variables unset for now, to be added later if needed. After the setup, we confirmed the environment was created successfully, verifying the healthy state of the EC2 instances, security groups, auto scaling group, and application load balancer.
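For readers who prefer the CLI over the console, the same environment can be sketched with a couple of AWS CLI calls. This is a minimal sketch, not the exact console output: the solution stack name below is an assumption, so list the Tomcat stacks available in your region first.

# Find the exact Tomcat stack name offered in your region
aws elasticbeanstalk list-available-solution-stacks | grep -i tomcat

# Create the application and the web server environment with the settings used above
aws elasticbeanstalk create-application --application-name vprofile
aws elasticbeanstalk create-environment \
  --application-name vprofile \
  --environment-name vprofileprod \
  --solution-stack-name "64bit Amazon Linux 2 v4.2.20 running Tomcat 8.5 Corretto 11" \
  --option-settings \
    Namespace=aws:autoscaling:launchconfiguration,OptionName=InstanceType,Value=t3.micro \
    Namespace=aws:autoscaling:launchconfiguration,OptionName=EC2KeyName,Value=vpro-bean-key \
    Namespace=aws:autoscaling:asg,OptionName=MinSize,Value=2 \
    Namespace=aws:autoscaling:asg,OptionName=MaxSize,Value=8 \
    Namespace=aws:elasticbeanstalk:environment,OptionName=LoadBalancerType,Value=application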
AWS RDS
In this section, I detailed the process of setting up an RDS instance and configuring it for seamless integration with the Elastic Beanstalk application. First, I created an RDS instance using the MySQL engine version 5.7, opting for a free-tier-eligible db.t3.micro instance class, and named it "vprofile-rds-prod." The initial database, named "accounts," was configured on the standard MySQL port 3306. I then established a security group, "vprofile-rds-prod-sg," permitting inbound traffic on port 3306 only from the Beanstalk instances' security group, ensuring proper connectivity.
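A hedged CLI equivalent of those console steps looks roughly like this; the VPC and security group IDs, the engine patch version, and the password are placeholders you would substitute:

# Security group that accepts MySQL traffic only from the Beanstalk instances' group
aws ec2 create-security-group \
  --group-name vprofile-rds-prod-sg \
  --description "MySQL access from Beanstalk instances" \
  --vpc-id <default-vpc-id>
aws ec2 authorize-security-group-ingress \
  --group-id <rds-sg-id> \
  --protocol tcp --port 3306 \
  --source-group <beanstalk-instance-sg-id>

# Free-tier-eligible MySQL 5.7 instance with an initial "accounts" database
aws rds create-db-instance \
  --db-instance-identifier vprofile-rds-prod \
  --engine mysql \
  --engine-version 5.7.44 \
  --db-instance-class db.t3.micro \
  --allocated-storage 20 \
  --db-name accounts \
  --master-username admin \
  --master-user-password '<strong-password>' \
  --vpc-security-group-ids <rds-sg-id>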
To validate this setup, I SSHed into a Beanstalk instance using the key pair "vpro-bean-key," installed the MySQL client, and confirmed connectivity to the RDS instance by deploying the necessary SQL scripts. Additionally, I adjusted the target group's health check path to "/login" to align with the application's requirements and enabled session stickiness, ensuring consistent user sessions across instances. These meticulous steps were crucial to ensuring that the application could reliably connect to the RDS instance and maintain health during deployment phases.
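The validation steps translate to a short shell session; the dump file name, endpoints, and ARNs below are placeholders for illustration:

# Hop onto one of the Beanstalk EC2 instances using the environment's key pair
ssh -i vpro-bean-key.pem ec2-user@<beanstalk-instance-public-ip>

# Install a MySQL client (the package name varies with the Amazon Linux version)
sudo yum install -y mysql

# Confirm connectivity and load the schema (db_backup.sql stands in for the project's SQL script)
mysql -h <rds-endpoint> -u admin -p accounts < db_backup.sql
mysql -h <rds-endpoint> -u admin -p -e "SHOW TABLES IN accounts;"

# Back on the workstation: point the health check at /login and enable stickiness
aws elbv2 modify-target-group --target-group-arn <tg-arn> --health-check-path /login
aws elbv2 modify-target-group-attributes --target-group-arn <tg-arn> \
  --attributes Key=stickiness.enabled,Value=true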
AWS CodeCommit
After setting up the Beanstalk and RDS instances, the next crucial step was to establish a CI/CD pipeline. I utilized AWS CodeCommit as our version control service, an excellent alternative to GitHub that benefits from AWS security and seamless integration with other AWS services. The transition involved creating a CodeCommit repository and migrating the full commit history from GitHub, ensuring a consistent version control environment.
Using the AWS Management Console, I navigated to the CodeCommit service, created a repository named "VProfile," and enabled optional Amazon CodeGuru for enhanced code quality through automated analysis. For secure repository access, I opted for SSH over HTTPS to avoid the risk of password exposure. This setup forms the foundation for a robust CI/CD pipeline, leveraging AWS's integrated services for streamlined application development and deployment.
To integrate AWS CodeCommit and create a user with specific access, follow these steps (a CLI sketch of the same flow appears after the list):
Create IAM User
Create Policy
Configure SSH for AWS CodeCommit
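Here is a CLI sketch of those three steps, assuming an illustrative user name and key path. AWSCodeCommitPowerUser is an AWS managed policy; swap in a custom policy for tighter scoping:

# 1. Create the IAM user that will own repository access
aws iam create-user --user-name vprofile-codecommit-user

# 2. Attach a CodeCommit policy to the user
aws iam attach-user-policy \
  --user-name vprofile-codecommit-user \
  --policy-arn arn:aws:iam::aws:policy/AWSCodeCommitPowerUser

# 3. Generate an SSH key pair and register the public half with IAM;
#    note the SSHPublicKeyId in the response -- it goes into the SSH config below
ssh-keygen -t rsa -b 4096 -f ~/.ssh/codecommit_RSA -N ""
aws iam upload-ssh-public-key \
  --user-name vprofile-codecommit-user \
  --ssh-public-key-body file://~/.ssh/codecommit_RSA.pub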
Create SSH Config File
Host git-codecommit.*.amazonaws.com
  # "User" is the SSH key ID that IAM returns when you upload the public key
  # (it looks like APKA...), not the IAM username
  User <SSH-Key-ID>
  IdentityFile ~/.ssh/codecommit_RSA
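Before cloning anything, it's worth confirming that the SSH handshake authenticates (substitute your region); a success message from CodeCommit means IAM recognizes the key:

ssh git-codecommit.us-east-1.amazonaws.com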
Clone Repository
git clone ssh://git-codecommit.<region>.amazonaws.com/v1/repos/vprofile
Validate user privileges by checking if the clone is successful
Migrate from GitHub to CodeCommit
git remote rm origin
git remote add origin ssh://git-codecommit.<region>.amazonaws.com/v1/repos/vprofile
git push --all origin
git push --tags origin
By following these steps, you can securely transition your version control system to AWS CodeCommit, setting up necessary user permissions and ensuring a smooth integration with your CI/CD pipeline.
AWS CodeBuild
To set up AWS CodeBuild, create a build project, select the source repository from AWS CodeCommit, and choose the appropriate branch.
Configure the build environment by selecting a managed Docker image, such as Ubuntu, which includes necessary runtimes like Java and Maven. The build process is defined in a build specification (buildspec) file written in YAML format. This file outlines the steps CodeBuild will take: installing dependencies, preparing the environment, running the build commands, and handling post-build tasks.
The buildspec file specifies phases like install, pre-build, build, and post-build. For example, in the pre-build phase, you might replace database credentials in configuration files. The build phase typically runs commands like mvn install to compile the code. After the build, artifacts are defined and uploaded to an S3 bucket for storage.
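As a concrete illustration, here is a minimal buildspec sketch for a Maven/Tomcat project like this one. The runtime version, the sed substitution, and the artifact path are assumptions for illustration, not the project's actual file:

cat > buildspec.yml <<'EOF'
version: 0.2
phases:
  install:
    runtime-versions:
      java: corretto11                  # assumption: match the JDK the project targets
  pre_build:
    commands:
      # hypothetical: inject the RDS password from a CodeBuild environment variable
      - sed -i "s|jdbc.password=.*|jdbc.password=${RDS_PASSWORD}|" src/main/resources/application.properties
  build:
    commands:
      - mvn install                     # compiles, runs tests, and packages the WAR
artifacts:
  files:
    - target/*.war                      # the artifact uploaded to S3 and later deployed
EOF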
Set up CloudWatch Logs to capture and monitor the build process, which helps in troubleshooting any issues. If you encounter errors, such as conflicts with existing roles or policies, resolve them by renaming resources or deleting conflicting ones.
After configuring everything, create the build project and run it to verify that the build process works as expected and produces the necessary artifacts. This setup ensures an automated, efficient build process as part of your CI/CD pipeline.
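A quick way to exercise the project from the CLI (the project name is illustrative, and the log group path assumes the default CodeBuild naming):

# Kick off a build and capture its ID
BUILD_ID=$(aws codebuild start-build --project-name vprofile-build \
  --query 'build.id' --output text)

# Inspect the same phase details the console shows
aws codebuild batch-get-builds --ids "$BUILD_ID" \
  --query 'builds[0].phases[].{phase:phaseType,status:phaseStatus}'

# Tail the build's CloudWatch log group (AWS CLI v2)
aws logs tail /aws/codebuild/vprofile-build --follow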
AWS CI/CD & CodeDeploy
We conduct a crucial test on our build job to ensure its functionality before proceeding with integrating all components into a seamless CodePipeline. This test serves as a vital checkpoint to verify that our build job effectively generates the artifact as expected. During this process, we closely monitor the progress of the build job through phase details, allowing us to track each step's execution and identify any potential issues promptly. Additionally, we utilize CloudWatch logs to gain deeper insights into the build process and its outcomes, ensuring thorough monitoring and troubleshooting capabilities.
Once we confirm the successful completion of the build job, we initiate the creation of a CodePipeline, a pivotal component that orchestrates the entire CI/CD workflow. The CodePipeline serves as a comprehensive framework for connecting various stages of our development pipeline, including source code management, build processes, and deployment actions. Within the pipeline configuration, we leverage advanced options such as encryption to bolster security measures and manual approvals to exercise precise control over deployment stages.
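Under the hood, the pipeline the console wizard produces boils down to a three-stage definition. This trimmed sketch shows the shape (the role ARN, artifact bucket, and branch name are placeholders), and the same JSON can be fed back through the CLI:

cat > pipeline.json <<'EOF'
{
  "pipeline": {
    "name": "vprofile-pipeline",
    "roleArn": "arn:aws:iam::<account-id>:role/<codepipeline-service-role>",
    "artifactStore": { "type": "S3", "location": "<artifact-bucket>" },
    "stages": [
      { "name": "Source", "actions": [ {
          "name": "Source",
          "actionTypeId": { "category": "Source", "owner": "AWS", "provider": "CodeCommit", "version": "1" },
          "configuration": { "RepositoryName": "vprofile", "BranchName": "master" },
          "outputArtifacts": [ { "name": "SourceOutput" } ] } ] },
      { "name": "Build", "actions": [ {
          "name": "Build",
          "actionTypeId": { "category": "Build", "owner": "AWS", "provider": "CodeBuild", "version": "1" },
          "configuration": { "ProjectName": "vprofile-build" },
          "inputArtifacts": [ { "name": "SourceOutput" } ],
          "outputArtifacts": [ { "name": "BuildOutput" } ] } ] },
      { "name": "Deploy", "actions": [ {
          "name": "Deploy",
          "actionTypeId": { "category": "Deploy", "owner": "AWS", "provider": "ElasticBeanstalk", "version": "1" },
          "configuration": { "ApplicationName": "vprofile", "EnvironmentName": "vprofileprod" },
          "inputArtifacts": [ { "name": "BuildOutput" } ] } ] }
    ],
    "version": 1
  }
}
EOF
aws codepipeline create-pipeline --cli-input-json file://pipeline.json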
As we configure the CodePipeline, we have the flexibility to incorporate additional actions to enhance our workflow. For instance, we can seamlessly integrate unit testing by adding dedicated testing stages within the pipeline flow. Leveraging the capabilities of AWS CodeBuild, we streamline these testing procedures by executing custom commands directly within the build environment. This approach empowers us to conduct comprehensive testing within the CI/CD pipeline, ensuring the reliability and quality of our software releases.
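A dedicated test stage is simply another CodeBuild project with its own buildspec; a minimal sketch, assuming the project's Maven test suite:

cat > buildspec-test.yml <<'EOF'
version: 0.2
phases:
  build:
    commands:
      - mvn test    # a failing unit test fails this stage and halts the pipeline
EOF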
Because CodeBuild can run arbitrary commands inside the build environment, tasks that would otherwise require separate tooling, such as packaging artifacts or validating configuration, can be handled directly within the pipeline. This keeps complex operations simple to express and makes it easy to bolt additional functionality onto the workflow.
In conclusion, the integration of our build job into a comprehensive CodePipeline marks a significant milestone in our project, streamlining the CI/CD workflow and enhancing the efficiency of our development processes. Through careful configuration and optimization, we establish a robust foundation for continuous integration and delivery, empowering our team to deliver high-quality software releases with confidence and agility.