Azure Repository integration with AWS CodePipeline
This article demonstrates how to integrate an Azure DevOps Git repository with AWS CodePipeline and deploy the application on an Amazon ECS cluster. It walks through configuring the following:
Application Overview:
A sample Nginx container's code is hosted in the Azure DevOps repository. Our aim is to deploy this container on an Amazon ECS cluster.
AWS Services and other components used in this solution:
Architecture
Understanding the architecture
Pre-requisites:
Let’s get started!
1. Configuring Azure Git Repository
1.1 Create SSH Keys
- Log in to the bastion host or any Linux host. Run the following command and follow the on-screen instructions to complete the procedure:
ssh-keygen -t rsa -b 4096 -C "<your email address>"
- Navigate to the .ssh directory and copy the content of <filename>.pub (use the filename provided during key generation):
cd ~/.ssh
ls
authorized_keys  config  xxxx.pub  known_hosts
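To copy the public key, print it to the terminal (a quick check, assuming the key file is named xxxx as in the listing above):
# Display the public key; copy the full output (it starts with ssh-rsa)
cat ~/.ssh/xxxx.pub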
1.2 Add SSH key in Azure Repository
- Log in to the Azure DevOps console, navigate to 'User settings', create a new SSH key by providing the SSH public key generated in the previous step, and click 'Add'.
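Optionally, verify that the key is accepted before moving on (a minimal check from the bastion host, assuming the default Azure DevOps SSH endpoint):
# A successful authentication typically prints a notice that shell access
# is not supported, rather than opening a shell
ssh -T git@ssh.dev.azure.com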
2. Configuring AWS CodePipeline with a third-party Source Stage
2.1 Publishing the SSH key to AWS Secrets Manager
- Log in to the bastion host and execute the following command. It creates a secret in Secrets Manager from the SSH private key file and stores the resulting ARN in an environment variable.
export SecretsManagerArn=$(aws secretsmanager create-secret --name <name of secret> --secret-string file:///home/ec2-user/.ssh/xxxx --query ARN --output text)
Note: This ARN will be required later in the process.
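To confirm that the secret was created and the ARN captured (a quick check using the variable set above):
echo ${SecretsManagerArn}
aws secretsmanager describe-secret --secret-id ${SecretsManagerArn} --query Name --output text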
2.2 Launching the CloudFormation Stack
The CloudFormation stack will be launched with a pre-configured template. This template will create the following:
- CodePipeline custom action type
- Lambda function associated with the above custom action
- CodeBuild project to perform the git clone and push the code to S3
- Lambda execution role
- CodeBuild execution role
These resources are used to connect to the Azure Git repository and copy the code to S3.
1. Clone the aws-codepipeline-third-party-git-repositories repository, which contains the AWS CloudFormation templates and AWS Lambda function code, using the command below:
git clone https://github.com/aws-samples/aws-codepipeline-third-party-git-repositories.git .
2. Upload the Lambda function code to an S3 bucket in the same Region where the stack is being deployed.
3. To create a new S3 bucket, use the following commands (make sure to provide a unique name):
export ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text)
export S3_BUCKET_NAME=codepipeline-git-custom-action-${ACCOUNT_ID}
aws s3 mb s3://${S3_BUCKET_NAME} --region ap-south-1
Note: Save the S3 bucket name. It will be required in the later steps.
4. From the root of the cloned repository, execute the following command to upload the Lambda function to the S3 bucket:
export ZIP_FILE_NAME="codepipeline_git.zip"
zip -jr ${ZIP_FILE_NAME} ./lambda/lambda_function.py && \
aws s3 cp codepipeline_git.zip \
s3://${S3_BUCKET_NAME}/${ZIP_FILE_NAME}
Note: Save the file name. It will be required in the later steps.
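To confirm the upload succeeded (a quick check using the variables set above):
aws s3 ls s3://${S3_BUCKET_NAME}/${ZIP_FILE_NAME}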
5. To create the custom source stage for CodePipeline, run the following command with the correct details (VPC ID, subnet IDs, bucket name, etc.):
Note: This stack must be created only once. The same custom source stage can be used in multiple pipelines; there is no need to create separate custom stages for each microservice.
export vpcId="vpc-xxxxxxxxxx"
export subnetId1="subnet-xxxxxxxx"
export subnetId2="subnet-xxxxxxxxx"
export GIT_SOURCE_STACK_NAME="third-party-codepipeline-git-source"
aws cloudformation create-stack \
--stack-name ${GIT_SOURCE_STACK_NAME} \
--template-body file://$(pwd)/cfn/third_party_git_custom_action.yaml \
--parameters ParameterKey=SourceActionVersion,ParameterValue=1 \
ParameterKey=SourceActionProvider,ParameterValue=CustomSourceForGit \
ParameterKey=GitPullLambdaSubnet,ParameterValue=${subnetId1}\\,${subnetId2} \
ParameterKey=GitPullLambdaVpc,ParameterValue=${vpcId} \
ParameterKey=LambdaCodeS3Bucket,ParameterValue=${S3_BUCKET_NAME} \
ParameterKey=LambdaCodeS3Key,ParameterValue=${ZIP_FILE_NAME} \
--capabilities CAPABILITY_IAM
6. Navigate to the AWS CloudFormation console and wait for the stack to finish provisioning.
7. Navigate to AWS CodePipeline in the management console and verify that the custom source action is available.
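The same checks can be done from the bastion host (a minimal sketch using the stack name exported above):
# Wait until the stack reaches CREATE_COMPLETE
aws cloudformation wait stack-create-complete --stack-name ${GIT_SOURCE_STACK_NAME}
# List custom action types; CustomSourceForGit should appear in the output
aws codepipeline list-action-types --action-owner-filter Custom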
3. Configuring AWS CodePipeline for deploying the sample Docker container
Pre-Requisites:
- Azure DevOps source IP address range: the specific IP range must be whitelisted so that Azure DevOps Services can invoke the AWS CodePipeline webhook. This establishes connectivity from Azure DevOps Services to on-premises/other cloud services. As per the Azure documentation, the source IP address range used for connectivity in this example is 20.37.158.0/23.
- Git repository SSH URL for the sample nginx container repository
- Git branch to trigger the pipeline
- ARN of the Git SSH private key stored in Secrets Manager, created in the previous section
- A task definition must be created using the AWS Management Console (a minimal CLI sketch follows this list). This task definition will be updated with the latest image URI/tag and deployed as part of the AWS CodePipeline.
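If you prefer the CLI, here is a minimal sketch of registering an initial task definition. The family name, Fargate settings, and execution role are placeholder assumptions for illustration; adjust them to your environment:
aws ecs register-task-definition \
  --family nginx-demo \
  --requires-compatibilities FARGATE \
  --network-mode awsvpc \
  --cpu 256 --memory 512 \
  --execution-role-arn <your task execution role ARN> \
  --container-definitions '[{"name":"nginx-demo","image":"<ECR-URI>:latest","portMappings":[{"containerPort":80}],"essential":true}]'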
3.1 Updating the CloudFormation stack
This section makes the changes to the CloudFormation template that are needed for this environment. (Note: the exact changes vary from case to case.)
3.1.1 Updating the pipeline webhook resource - AWS::CodePipeline::Webhook
a. Update the JsonPath as given below so that the filter matches the branch name in the Azure DevOps push event payload:
  PipelineWebhook:
    Type: "AWS::CodePipeline::Webhook"
    Properties:
      TargetPipeline: !Ref Pipeline
      TargetPipelineVersion: 1
      TargetAction: Source
      Filters:
        #- JsonPath: '$.ref'
        - JsonPath: "$.resource.refUpdates..name"
          MatchEquals: 'refs/heads/{Branch}'
      Authentication: IP
      AuthenticationConfiguration:
        AllowedIPRange: !Ref GitWebHookIpAddress
      RegisterWithThirdParty: false
3.1.2 Updating the CodeBuild project resource - AWS::CodeBuild::Project
  CodeBuild:
    Type: 'AWS::CodeBuild::Project'
    Properties:
      Artifacts:
        Type: CODEPIPELINE
      Environment:
        ComputeType: BUILD_GENERAL1_SMALL
        #Image: aws/codebuild/java:openjdk-8
        Image: aws/codebuild/amazonlinux2-x86_64-standard:4.0
        PrivilegedMode: true
        Type: LINUX_CONTAINER
      ServiceRole: !Ref CodeBuildRole
      Source:
        Type: CODEPIPELINE
        BuildSpec: |
          version: 0.2
          phases:
            install:
              runtime-versions:
                python: 3.7
              # commands:
              # - pip3 install boto3
            build:
              commands:
                - ls -al
Note: the buildspec will be provided as part of the source code.
3.1.3 Creating the buildspec YAML
1. Create a file named "buildspec.yaml" in the source code root directory.
2. Make sure the file is updated with the corresponding variables:
a. ECR_URI: <mention your ECR-URI>
b. ECS_CLUSTER: <Your cluster name>
c. SERVICE_NAME: <Your service name>
d. TASK_DEFINITION_NAME: <Your task definition name>
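If you do not have the ECR repository URI handy, you can look it up with the CLI (a quick check, assuming a repository named nginx-demo already exists):
aws ecr describe-repositories --repository-names nginx-demo --query 'repositories[0].repositoryUri' --output text
The complete buildspec is shown below: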
version: 0.2
env:
  variables:
    CODEBUILD_BUILD_NUMBER: $CODEBUILD_BUILD_NUMBER
    NEW_REVISION: $NEW_REVISION
    ECR_URI: <mention your ECR-URI>
    ECS_CLUSTER: <Your cluster name>
    SERVICE_NAME: <Your service name>
    TASK_DEFINITION_NAME: <Your task definition name>
  exported-variables:
    - CODEBUILD_BUILD_NUMBER
    - NEW_REVISION
    - ECR_URI
    - ECS_CLUSTER
    - SERVICE_NAME
    - TASK_DEFINITION_NAME
phases:
  pre_build:
    on-failure: ABORT
    commands:
      - echo Logging in to Amazon ECR...
      - $(aws ecr get-login --no-include-email --region $AWS_DEFAULT_REGION)
      #- $(aws ecr get-login-password --region ap-south-1 | docker login --username AWS --password-stdin xxxxx.dkr.ecr.ap-south-1.amazonaws.com)
  build:
    on-failure: ABORT
    commands:
      - echo Build started on `date`
      - echo Building the Docker image...
      - echo Build Number $CODEBUILD_BUILD_NUMBER
      - docker build -t nginx-demo:$CODEBUILD_BUILD_NUMBER .
      - docker tag nginx-demo:$CODEBUILD_BUILD_NUMBER $ECR_URI:$CODEBUILD_BUILD_NUMBER
  post_build:
    on-failure: ABORT
    commands:
      - echo Build completed on `date`
      - echo Pushing the Docker image...
      - docker push $ECR_URI:$CODEBUILD_BUILD_NUMBER
      - echo Update ECS Task definition
      - ECR_IMAGE=$ECR_URI:$CODEBUILD_BUILD_NUMBER
      - TASK_DEFINITION=$(aws ecs describe-task-definition --task-definition "$TASK_DEFINITION_NAME" --region "$AWS_DEFAULT_REGION")
      - NEW_TASK_DEFINITION=$(echo $TASK_DEFINITION | jq --arg IMAGE "$ECR_IMAGE" '.taskDefinition | .containerDefinitions[0].image = $IMAGE | del(.taskDefinitionArn) | del(.revision) | del(.status) | del(.requiresAttributes) | del(.compatibilities) | del(.registeredBy) | del(.registeredAt)')
      - NEW_TASK_INFO=$(aws ecs register-task-definition --region "$AWS_DEFAULT_REGION" --cli-input-json "$NEW_TASK_DEFINITION")
      - NEW_REVISION=$(echo $NEW_TASK_INFO | jq '.taskDefinition.revision')
      - TASK_DEFINITION_ARN=$(echo $NEW_TASK_INFO | jq '.taskDefinition.taskDefinitionArn')
      - echo ${NEW_TASK_DEFINITION} > new_task_definition.json
      - echo Writing image definitions file...
      - printf '[{"name":"nginx-demo","imageUri":"%s"}]' $ECR_URI:$CODEBUILD_BUILD_NUMBER > imagedefinitions.json
artifacts:
  files:
    - new_task_definition.json
    - imagedefinitions.json
3.2 Deploying the CodePipeline using the CloudFormation stack for nginx-demo
1. Log in to the bastion host, navigate to the CloudFormation (cfn) directory, and execute the following command with the required inputs to launch the CloudFormation stack for project-API:
export SSH_URL="<Git repository SSH_URL for the project-API repository>"
export SAMPLE_STACK_NAME="third-party-codepipeline-git-source-project-api"
export GIT_BRANCH="<git-branch-name>"
export SecretsManagerArn="<Enter the ARN of the Secrets Manager secret created above>"
export PIPELINE_NAME="dev-ecs-project-api"
aws cloudformation create-stack \
--stack-name ${SAMPLE_STACK_NAME} \
--template-body file://$(pwd)/cfn/sample_pipeline_custom.yaml \
--parameters ParameterKey=Branch,ParameterValue=${GIT_BRANCH} \
ParameterKey=GitUrl,ParameterValue=${SSH_URL} \
ParameterKey=SourceActionVersion,ParameterValue=1 \
ParameterKey=SourceActionProvider,ParameterValue=CustomSourceForGit \
ParameterKey=CodePipelineName,ParameterValue=${PIPELINE_NAME} \
ParameterKey=SecretsManagerArnForSSHPrivateKey,ParameterValue=${SecretsManagerArn} \
ParameterKey=GitWebHookIpAddress,ParameterValue=20.37.158.0/23 \
--capabilities CAPABILITY_IAM
2. Verify the stack in the CloudFormation console.
3. Navigate to CodePipeline and verify that the pipeline is created.
4. Navigate to Build > Build projects and verify that the build projects are created.
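The same can be verified from the CLI (a quick check using the variables exported above):
aws codepipeline get-pipeline --name ${PIPELINE_NAME} --query pipeline.name --output text
aws codebuild list-projects --query projects --output table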
3.3 Retrieving the Webhook URL
After the stack status changes to CREATE_COMPLETE, execute the following command on the bastion host to get the webhook URL:
aws cloudformation describe-stacks --stack-name ${SAMPLE_STACK_NAME} --output text --query "Stacks[].Outputs[?OutputKey=='CodePipelineWebHookUrl'].OutputValue"
Note: This URL will be used to configure the Webhook in the Azure DevOps console
3.4 Configuring the Webhook in the Azure DevOps Console
4. Additional configurations for CodePipeline
Additional configurations are done manually using the AWS console for the nginx-demo CodePipeline.
4.1 IAM permissions for the CodeBuild role
Note: For development, AWS-managed policies are convenient but grant broader privileges than necessary.
For production, create custom policies with more restrictive permissions, following least-privilege best practices.
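For example, in a development environment an ECR managed policy can be attached to the CodeBuild role so it can push images. The role name below is a placeholder; use the CodeBuild service role created by the stack:
aws iam attach-role-policy \
  --role-name <your CodeBuild service role name> \
  --policy-arn arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryPowerUser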
4.2 IAM permissions for CodePipeline Role
4.3 Configure SNS for approval notification (Optional)
- Name: Name of the topic
- Display Name: Display name of the topic
- Topic: Select the topic created in the previous section
- Protocol: Select Email
- Endpoint: Enter the email ID/alias address to receive approval notifications
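If you prefer the CLI, a minimal sketch of the same setup (the topic name and email address are placeholders):
# Create the SNS topic and capture its ARN
export APPROVAL_TOPIC_ARN=$(aws sns create-topic --name <name of the topic> --query TopicArn --output text)
# Subscribe an email endpoint; the recipient must confirm the subscription
aws sns subscribe --topic-arn ${APPROVAL_TOPIC_ARN} --protocol email --notification-endpoint <your email address>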
4.4 Adding manual approval stage (Optional)
- Action Name: Approval
- Action Provider: Manual Approval
- SNS Topic: Select the SNS topic created in the previous section
5. Verify that the 'ManualApproval' stage is created after the CodeBuild stage.
4.5 Adding Deploy Stage
- Action Name: Deploy
- Action Provider: Amazon ECS
- Region: <<select your region>> (here it is Asia Pacific (Mumbai))
- Input Artifact: <<select your input artifact>> (here it is MyAppBuilt, from the previous CodeBuild stage)
- Cluster Name: <<Enter your cluster name>>
- Service Name: PROJECT-SERVICE
4. Verify that the 'DeployToECS' stage is created after the 'ManualApproval' stage, and click 'Save'.
5. Testing the CodePipeline
Test the CodePipeline by pushing a code change to the respective branch (see the sketch below).
Note: Make sure the correct branch is selected.
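A minimal sketch of such a test push, followed by checking the pipeline state from the CLI; the file name, branch name, and pipeline name (dev-ecs-project-api from section 3.2) are assumptions:
# From a local clone of the Azure DevOps repository
git checkout <git-branch-name>
echo "trigger build $(date)" >> trigger.txt
git add trigger.txt
git commit -m "Trigger CodePipeline via webhook"
git push origin <git-branch-name>
# Watch the pipeline execution from the bastion host
aws codepipeline get-pipeline-state --name dev-ecs-project-api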