Azure Repository integration with AWS CodePipeline
Authors: Sangita & Saravanan Mani

This article demonstrates how to integrate an Azure DevOps repository with AWS CodePipeline and deploy the application to an Amazon ECS cluster. It gives a walkthrough of configuring the following:

  1. Triggering AWS CodePipeline executions from the Azure repository using a webhook
  2. Configuring the pipeline with a custom source action
  3. Deploying the application to an ECS cluster

Application Overview:

A sample Nginx container application is hosted in the Azure repository. Our aim is to deploy this container to an Amazon ECS cluster.

AWS Services and other components used in this solution:

  1. Code repository: Custom source repository (Azure Repos)
  2. Lambda: Lambda function associated with the custom source action
  3. CodeBuild: CodeBuild stage for building the application code and Docker image
  4. CodeDeploy: CodeDeploy stage for deploying the Docker image to the ECS cluster
  5. Secrets Manager: Securely stores the SSH key for the respective Azure repository
  6. S3: S3 bucket to store the application code cloned from the Azure repository
  7. ECS: ECS cluster with 3 worker nodes
  8. ECR: Private ECR repository
  9. CloudWatch: Monitors the CodePipeline

Architecture

Understanding the architecture

  1. A developer commits a code change to the Azure repository.
  2. The Azure repository calls the CodePipeline webhook.
  3. AWS CodePipeline executes the first stage (the Source stage, which is a custom source stage).
  4. The custom source action initiates a CloudWatch Events rule.
  5. The CloudWatch Events rule initiates a Lambda function.
  6. The Lambda function connects to the Azure repository using the SSH key from AWS Secrets Manager.
  7. The Lambda function executes a Git pull on the respective branch.
  8. The custom source stage copies the source code to the S3 bucket as a build artifact.
  9. CodePipeline executes the second stage, which gets the input artifact from the S3 bucket, builds the Docker image, and pushes the image to the ECR repository.
  10. CodePipeline executes the final stage, CodeDeploy, which deploys the application to the ECS cluster.
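The webhook call in step 2 works because Azure DevOps posts a JSON event on every push. The following is a minimal sketch of the payload shape, trimmed to the fields the pipeline's webhook filter reads (the values shown are illustrative, not from a real event):

```python
# Trimmed sketch of an Azure DevOps "Code pushed" service-hook payload.
# The CodePipeline webhook filter configured later in this article,
# "$.resource.refUpdates..name", matches the pushed ref against
# refs/heads/<branch>.
payload = {
    "eventType": "git.push",
    "resource": {
        "refUpdates": [
            {"name": "refs/heads/main"}
        ]
    },
}

# What the JsonPath filter effectively extracts from the event:
pushed_refs = [update["name"] for update in payload["resource"]["refUpdates"]]
print(pushed_refs)
```

The pipeline triggers only when one of these refs equals `refs/heads/{Branch}` for the configured branch.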

Pre-requisites:

  1. Create a VPC with two private and two public subnets
  2. Create an Internet Gateway and route table
  3. Create a NAT Gateway
  4. Create a bastion host with the AWS CLI installed and configured
  5. Create one private S3 bucket
  6. Create an ECS cluster with 3 worker nodes, along with an Application Load Balancer and CloudWatch
  7. Build the Docker image and push it to ECR


Let’s get started!

1. Configuring Azure Git Repository

1.1 Create SSH Keys

- Log in to the bastion host or any Linux host. Run the following command and follow the on-screen instructions to complete the procedure:

ssh-keygen -t rsa -b 4096 -C "<your email address>"

- Navigate to the SSH directory and copy the content of <filename.pub> (use the filename provided during key generation):

cd ~/.ssh
ls
authorized_keys config xxxx.pub known_hosts


1.2 Add SSH Key in Azure Repository

- Log in to the Azure DevOps console, navigate to 'User settings', create a new key by providing the SSH public key generated in the previous step, and click 'Add'

2. Configuring AWS CodePipeline with a Third-Party Source Stage

2.1 Publishing the SSH Key to AWS Secrets Manager

- Log in to the bastion host and execute the following command. It will create a secret in Secrets Manager and set the respective ARN in an environment variable.

export SecretsManagerArn=$(aws secretsmanager create-secret --name <name of secret> --secret-string file:///home/ec2-user/.ssh/xxxx --query ARN --output text)

Note: This ARN will be required later in the process.

2.2 Launching the CloudFormation Stack

The CloudFormation stack will be launched with a pre-configured template. This template will create the following:

- CodePipeline custom action type

- Lambda function associated with the above custom action

- CodeBuild project to perform the Git clone and push the code to S3

- Lambda execution role

- CodeBuild execution role

The above resources will be used to connect to the Azure Git repository and copy the code to S3.

1. Clone the Git repository aws-codepipeline-third-party-git-repositories, which contains the AWS CloudFormation templates and AWS Lambda function code, using the below command:

git clone https://github.com/aws-samples/aws-codepipeline-third-party-git-repositories.git .        

2. Upload the Lambda function code to an S3 bucket in the same Region where the stack is being deployed.

3. To create a new S3 bucket, use the following commands (make sure to provide a unique name):

export ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)
export S3_BUCKET_NAME=codepipeline-git-custom-action-${ACCOUNT_ID}
aws s3 mb s3://${S3_BUCKET_NAME} --region ap-south-1        

Note: Save the S3 bucket name. It will be required in the later steps.

4. Execute the following command from the root directory where the repository was cloned to upload the Lambda function to the S3 bucket:

export ZIP_FILE_NAME="codepipeline_git.zip"
zip -jr ${ZIP_FILE_NAME} ./lambda/lambda_function.py && \
aws s3 cp codepipeline_git.zip \
s3://${S3_BUCKET_NAME}/${ZIP_FILE_NAME}        

Note: Save the file name. It will be required in the later steps.

5. To create the Custom Source Stage for the CodePipeline, run the following command with the correct details (such as VPC ID, subnet IDs, bucket name, etc.):

Note: This stack must be created only once. The same Custom Source Stage can be used in multiple pipelines; there is no need to create a separate custom stage for each microservice.

export vpcId="vpc-xxxxxxxxxx"
export subnetId1="subnet-xxxxxxxx"
export subnetId2="subnet-xxxxxxxxx"
export GIT_SOURCE_STACK_NAME="third-party-codepipeline-git-source"
aws cloudformation create-stack \
--stack-name ${GIT_SOURCE_STACK_NAME} \
--template-body file://$(pwd)/cfn/third_party_git_custom_action.yaml \
--parameters ParameterKey=SourceActionVersion,ParameterValue=1 \
ParameterKey=SourceActionProvider,ParameterValue=CustomSourceForGit \
ParameterKey=GitPullLambdaSubnet,ParameterValue=${subnetId1}\\,${subnetId2} \
ParameterKey=GitPullLambdaVpc,ParameterValue=${vpcId} \
ParameterKey=LambdaCodeS3Bucket,ParameterValue=${S3_BUCKET_NAME} \
ParameterKey=LambdaCodeS3Key,ParameterValue=${ZIP_FILE_NAME} \
--capabilities CAPABILITY_IAM        

6. Navigate to the AWS CloudFormation console and wait for the stack to complete provisioning.

7. Navigate to AWS CodePipeline using the management console and verify the availability of the custom stage.

3. Configuring AWS CodePipeline for Deploying the Sample Docker Container

Pre-requisites:

- For the Azure DevOps source IP address range, whitelist the specific IP range to allow invoking AWS CodePipeline via webhook. This establishes connectivity from Azure DevOps Services to on-premises/other cloud services. As per the Azure documentation, the source IP address range used for connectivity in this example is 20.37.158.0/23

- Git repository SSH URL for the sample nginx container repository

- Git branch to trigger the pipeline

- ARN of the Git SSH private key stored in Secrets Manager, created in the previous section

- A task definition must be created using the AWS Management Console. This task definition will be updated with the latest image URI/tag and deployed as part of the CodePipeline

3.1 Updating the CloudFormation stack

This section makes the necessary changes to the CloudFormation template to accommodate the environment. (Note: this varies from case to case.)

3.1.1 Updating the pipeline webhook resource - AWS::CodePipeline::Webhook

  1. Log in to the bastion host and navigate to the working directory where the repository was cloned in the previous step
  2. Open "aws-codepipeline-third-party-git-repositories/cfn/sample_pipeline_custom.yaml" in a text editor
  3. Edit as directed below:

a. Update the JsonPath as given below:

  PipelineWebhook:
    Type: "AWS::CodePipeline::Webhook"
    Properties:
      TargetPipeline: !Ref Pipeline
      TargetPipelineVersion: 1
      TargetAction: Source
      Filters:
        #- JsonPath: '$.ref'
        - JsonPath: "$.resource.refUpdates..name"
          MatchEquals: 'refs/heads/{Branch}'
      Authentication: IP
      AuthenticationConfiguration:
        AllowedIPRange: !Ref GitWebHookIpAddress
      RegisterWithThirdParty: false

3.1.2 Updating the CodeBuild project resource - AWS::CodeBuild::Project

  1. Open "cfn/sample_pipeline_custom.yaml" in a text editor
  2. Update as shown below:

  CodeBuild:
    Type: 'AWS::CodeBuild::Project'
    Properties:
      Artifacts:
        Type: CODEPIPELINE
      Environment:
        ComputeType: BUILD_GENERAL1_SMALL
        #Image: aws/codebuild/java:openjdk-8
        Image: aws/codebuild/amazonlinux2-x86_64-standard:4.0
        PrivilegedMode: true
        Type: LINUX_CONTAINER
      ServiceRole: !Ref CodeBuildRole
      Source:
        Type: CODEPIPELINE
        BuildSpec: |
          version: 0.2
          phases:
            install:
              runtime-versions:
                python: 3.7
              #commands:
              #  - pip3 install boto3
            build:
              commands:
                - ls -al

Note: The buildspec will be provided as part of the source code.

3.1.3 Creating the Buildspec YAML

1. Create a file named "buildspec.yaml" in the source code root directory

2. Make sure the file is updated with the corresponding variables:

a. ECR_URI: <mention your ECR-URI>

b. ECS_CLUSTER: <Your cluster name>

c. SERVICE_NAME: <Your service name>

d. TASK_DEFINITION_NAME: <Your task definition name>

version: 0.2
env:
  variables:
    CODEBUILD_BUILD_NUMBER: $CODEBUILD_BUILD_NUMBER
    NEW_REVISION: $NEW_REVISION
    ECR_URI: <mention your ECR-URI>
    ECS_CLUSTER: <Your cluster name>
    SERVICE_NAME: <Your service name>
    TASK_DEFINITION_NAME: <Your task definition name>
  exported-variables:
    - CODEBUILD_BUILD_NUMBER
    - NEW_REVISION
    - ECR_URI
    - ECS_CLUSTER
    - SERVICE_NAME
    - TASK_DEFINITION_NAME
phases:
  pre_build:
    on-failure: ABORT
    commands:
      - echo Logging in to Amazon ECR...
      - $(aws ecr get-login --no-include-email --region $AWS_DEFAULT_REGION)
      #- $(aws ecr get-login-password --region ap-south-1 | docker login --username AWS --password-stdin xxxxx.dkr.ecr.ap-south-1.amazonaws.com)
  build:
    on-failure: ABORT
    commands:
      - echo Build started on $(date)
      - echo Building the Docker image...
      - echo Build Number $CODEBUILD_BUILD_NUMBER
      - docker build -t nginx-demo:$CODEBUILD_BUILD_NUMBER .
      - docker tag nginx-demo:$CODEBUILD_BUILD_NUMBER $ECR_URI:$CODEBUILD_BUILD_NUMBER
  post_build:
    on-failure: ABORT
    commands:
      - echo Build completed on $(date)
      - echo Pushing the Docker image...
      - docker push $ECR_URI:$CODEBUILD_BUILD_NUMBER
      - echo Update ECS Task definition
      - ECR_IMAGE=$ECR_URI:$CODEBUILD_BUILD_NUMBER
      - TASK_DEFINITION=$(aws ecs describe-task-definition --task-definition "$TASK_DEFINITION_NAME" --region "$AWS_DEFAULT_REGION")
      - NEW_TASK_DEFINITION=$(echo $TASK_DEFINITION | jq --arg IMAGE "$ECR_IMAGE" '.taskDefinition | .containerDefinitions[0].image = $IMAGE | del(.taskDefinitionArn) | del(.revision) | del(.status) | del(.requiresAttributes) | del(.compatibilities) | del(.registeredBy) | del(.registeredAt)')
      - NEW_TASK_INFO=$(aws ecs register-task-definition --region "$AWS_DEFAULT_REGION" --cli-input-json "$NEW_TASK_DEFINITION")
      - NEW_REVISION=$(echo $NEW_TASK_INFO | jq '.taskDefinition.revision')
      - TASK_DEFINITION_ARN=$(echo $NEW_TASK_INFO | jq '.taskDefinition.taskDefinitionArn')
      - echo ${NEW_TASK_DEFINITION} > new_task_definition.json
      - echo Writing image definitions file...
      - printf '[{"name":"nginx-demo","imageUri":"%s"}]' $ECR_URI:$CODEBUILD_BUILD_NUMBER > imagedefinitions.json
artifacts:
  files:
    - new_task_definition.json
    - imagedefinitions.json
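The jq expression in post_build strips the read-only fields that `register-task-definition` rejects and swaps in the new image URI. The same transformation is sketched below in Python for clarity; the sample input is a trimmed, hypothetical `describe-task-definition` response:

```python
import copy

# Read-only fields returned by describe-task-definition that must be
# removed before the JSON can be fed back to register-task-definition.
READ_ONLY_FIELDS = (
    "taskDefinitionArn", "revision", "status", "requiresAttributes",
    "compatibilities", "registeredBy", "registeredAt",
)

def new_task_definition(described, image_uri):
    # Copy the task definition, drop read-only fields, and point the
    # first container at the freshly pushed image.
    td = copy.deepcopy(described["taskDefinition"])
    for field in READ_ONLY_FIELDS:
        td.pop(field, None)
    td["containerDefinitions"][0]["image"] = image_uri
    return td

# Trimmed, hypothetical describe-task-definition output:
sample = {"taskDefinition": {
    "family": "nginx-demo",
    "revision": 7,
    "status": "ACTIVE",
    "containerDefinitions": [{"name": "nginx-demo", "image": "old-image"}],
}}

td = new_task_definition(sample, "<ECR_URI>:42")
print(td["containerDefinitions"][0]["image"])  # <ECR_URI>:42
print("revision" in td)                        # False
```

Only the first container definition is updated, matching `.containerDefinitions[0].image` in the jq expression.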

3.2 Deploying the CodePipeline using the CloudFormation stack for nginx-demo

1. Log in to the bastion host, navigate to the CloudFormation templates directory, and execute the following command with the required inputs to launch the CloudFormation stack for project-API:

export SSH_URL="<Git repository SSH_URL for the project-API repository>"
export SAMPLE_STACK_NAME="third-party-codepipeline-git-source-project-api"
export GIT_BRANCH="<git-branch-name>"
export SecretsManagerArn="<ARN of the Secrets Manager secret created above>"
export PIPELINE_NAME="dev-ecs-project-api"
aws cloudformation create-stack \
--stack-name ${SAMPLE_STACK_NAME} \
--template-body file://$(pwd)/cfn/sample_pipeline_custom.yaml \
--parameters ParameterKey=Branch,ParameterValue=${GIT_BRANCH} \
ParameterKey=GitUrl,ParameterValue=${SSH_URL} \
ParameterKey=SourceActionVersion,ParameterValue=1 \
ParameterKey=SourceActionProvider,ParameterValue=CustomSourceForGit \
ParameterKey=CodePipelineName,ParameterValue=${PIPELINE_NAME} \
ParameterKey=SecretsManagerArnForSSHPrivateKey,ParameterValue=${SecretsManagerArn} \
ParameterKey=GitWebHookIpAddress,ParameterValue=20.37.158.0/23 \
--capabilities CAPABILITY_IAM        

2. Verify the stack in the CloudFormation console.

3. Navigate to CodePipeline and verify that the pipeline is created.

4. Navigate to the Build section > Build projects and verify that the build projects are created.

3.3 Retrieving the Webhook URL

After the stack status changes to CREATE_COMPLETE, execute the following command on the bastion host to get the webhook URL:

aws cloudformation describe-stacks --stack-name ${SAMPLE_STACK_NAME} --output text --query "Stacks[].Outputs[?OutputKey=='CodePipelineWebHookUrl'].OutputValue"        

Note: This URL will be used to configure the Webhook in the Azure DevOps console
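The `--query` expression above is JMESPath. If the webhook URL is instead extracted in a script, equivalent logic in Python looks like this (the sample response below is hypothetical and trimmed):

```python
def stack_output(describe_stacks, key):
    # Return the value of one CloudFormation stack output, mirroring
    # --query "Stacks[].Outputs[?OutputKey=='...'].OutputValue" above.
    for stack in describe_stacks.get("Stacks", []):
        for output in stack.get("Outputs", []):
            if output.get("OutputKey") == key:
                return output["OutputValue"]
    return None

# Hypothetical describe-stacks response, trimmed to the relevant output:
response = {"Stacks": [{"Outputs": [
    {"OutputKey": "CodePipelineWebHookUrl",
     "OutputValue": "https://<webhook-endpoint>/trigger?..."},
]}]}

url = stack_output(response, "CodePipelineWebHookUrl")
print(url)
```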

3.4 Configuring the Webhook in the Azure DevOps Console

  1. Log in to your Azure DevOps console and navigate to your project-api repository
  2. From Settings > Service Hooks > Create subscription, select 'Webhook' and click 'Next'
  3. Provide the following inputs and click 'Next': Type of event: Code pushed; Repository: <<name of your repository>>; Branch: <<your branch name>>
  4. Provide the CodePipeline webhook URL from the CloudFormation output in the Settings > URL section and click 'Finish'

4. Additional configurations for CodePipeline

Additional configurations will be made manually using the AWS console for the nginx-demo CodePipeline.

4.1 IAM permissions for the CodeBuild role

  1. From the Management Console, navigate to CodePipeline and click on the AWS CodeBuild project
  2. Click on 'Build project' details > scroll down to 'Service role' in the 'Environment' section
  3. Click on 'Service role' to update the CodeBuild IAM role with the following managed policies: AmazonEC2ContainerRegistryFullAccess, AmazonECS_FullAccess

Note: For development, these AWS-managed policies are convenient but grant broad privileges.

For production, create custom policies with more restrictive permissions, following least-privilege best practices.

4.2 IAM permissions for CodePipeline Role

  1. Navigate to the CodePipeline console, select the project-api CodePipeline > Settings > click on 'Service Role ARN'
  2. Update the IAM role with the required permissions using the Visual Editor > Review Policy > Save Policy


4.3 Configure SNS for approval notification (Optional)

  1. Navigate to the SNS console > click 'Create topic' > select the 'Standard' option and enter details for the following:

- Name: name of the topic

- Display name: display name of the topic

  2. Click on 'Subscriptions' > enter the following details and click 'Create subscription':

- Topic: select the topic created in the previous step

- Protocol: select Email

- Endpoint: enter the email ID/alias address to receive approval notifications

  3. SNS will send a verification email to the provided email address to confirm the subscription

4.4 Adding manual approval stage (Optional)

  1. Navigate to the CodePipeline console > click on the Project-API pipeline > click 'Edit' in the top-right corner
  2. Click 'Add stage' after the CodeBuild stage
  3. Provide the stage name 'ManualApproval' and click 'Add stage'
  4. Click '+ Add action group' > provide the following details and click 'Done':

- Action name: Approval

- Action provider: Manual approval

- SNS topic: select the SNS topic created in the previous section

  5. Verify that the 'ManualApproval' stage is created after the CodeBuild stage

4.5 Adding Deploy Stage

  1. Click '+ Add stage' after the ManualApproval stage
  2. Provide the stage name 'DeployToECS' and click 'Add stage'
  3. Click 'Add action group' > enter the following details and click 'Done':

- Action name: Deploy

- Action provider: Amazon ECS

- Region: <<select your region>> (here, Asia Pacific (Mumbai))

- Input artifact: <<select your input artifact>> (here, MyAppBuilt from the previous CodeBuild stage)

- Cluster name: <<enter your cluster name>>

- Service name: PROJECT-SERVICE

  4. Verify that the DeployToECS stage is created and click 'Save'
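The Amazon ECS deploy action consumes the imagedefinitions.json file produced by the buildspec's post_build step, mapping each container name to the image it should run. A sketch of how that file is assembled (the image URI is a placeholder):

```python
import json

# The container "name" must exactly match the container name in the
# task definition; the buildspec's printf writes the same structure.
image_definitions = [
    {"name": "nginx-demo", "imageUri": "<ECR_URI>:<build-number>"}
]
print(json.dumps(image_definitions))
```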

5. Testing the CodePipeline

Test the CodePipeline by pushing a code change to the respective branch.

  1. Make a code change, then commit and push it

Note: make sure the correct branch is selected

  2. Verify that the webhook is triggered in the Azure repository
  3. Log in to the AWS Management Console, navigate to CodePipeline, and verify that the pipeline has started and is in the 'In Progress' state
  4. Monitor the pipeline's progress and verify the application status
  5. If the status shows 'Succeeded', the application has been successfully deployed to the ECS cluster

6. Reference Links:

  1. https://github.com/aws-samples/aws-codepipeline-third-party-git-repositories
  2. https://docs.aws.amazon.com/prescriptive-guidance/latest/patterns/use-third-party-git-source-repositories-in-aws-codepipeline.html
  3. https://kbild.ch/blog/2020-11-11-custom_codepipeline_source/
