Integrating Testing, Security Tasks, and Other Tools

Now that we've learned the fundamentals of build and release pipelines, it's time to explore how Azure Pipelines can be enhanced with additional tools to perform more advanced tasks and incorporate capabilities beyond its default features. By the end of this post, you'll be equipped to extend pipelines by integrating tasks that enhance code quality, detect vulnerabilities before deployment, and incorporate source code and artifacts from external repositories.

Throughout this post, we will cover the following topics:

- Understanding the Azure DevOps extensibility model

- Implementing automated tests within your builds

- Enhancing code quality

- Integrating with Jenkins for managing artifacts and release pipelines

Understanding the Azure DevOps extensibility model

Azure DevOps and its sub-services come with a variety of built-in features, but you can enhance and customize your experience through extensions developed using standard technologies like HTML, JavaScript, and CSS.

Each sub-service within Azure DevOps operates on a highly flexible model that can be enriched using extensions created by individuals or reputable third-party organizations available in the marketplace. Should you not find a suitable extension, you also have the option to develop and publish your own.

Extensions serve the purpose of encapsulating reusable tasks, incorporating external tools, and improving the user interface of Azure DevOps itself. Specifically for Azure Pipelines, extensions can:

- Simplify complex and repetitive tasks

- Facilitate the use of common Infrastructure as Code (IaC) tools such as Terraform or Ansible

- Integrate with Software as a Service (SaaS) products to enhance code quality and security

- Streamline deployment processes to cloud platforms like Azure and Amazon Web Services

To explore available extensions, visit the Visual Studio Marketplace for Azure DevOps at https://marketplace.visualstudio.com/azuredevops.

Below is a screenshot demonstrating how these extensions appear in action:

Visual Studio Marketplace for Azure DevOps
Installing a code quality assessment tool, SonarQube

Search the marketplace for SonarQube and click on the listing to see its details. Alternatively, go directly to the SonarQube listing on the Visual Studio Marketplace.

The following screenshot shows the SonarQube extension's listing page:

Visual Studio Marketplace listing for the SonarQube extension

"After selecting the desired extension, click on either the 'Get it free' or 'Get' button. You'll then be prompted to choose the Azure DevOps organization where you wish to install it. If you manage multiple organizations, you'll see the options displayed as depicted in the following figure."

Installing the SonarQube extension from the marketplace

After reviewing the permissions and terms of service, and selecting the organization for installation, simply click the 'Install' button. The extension usually installs within a few seconds. Once installed, you can choose to proceed to your Azure DevOps organization or return to the marketplace to explore additional extensions.

Including automated tests for your build

Validation is essential for ensuring the functionality of modern applications, regardless of the number of developers involved in coding concurrently. Automated tests play a crucial role by validating the application immediately after it's built, ensuring no degradation in quality or introduction of bugs with recent changes.

Various types of tests, including unit tests, integration tests, and load tests, can be executed against an application. The choice of automated testing frameworks depends on the programming language used and the preferences of the development teams.

The significance of automated testing lies in its ability to minimize the risk of releasing bugs into applications. By detecting issues early in the development cycle, automated tests reduce the time required by testing teams for verification. This approach also saves human effort in repetitive tasks, allowing teams to focus more on developing additional features and enhancing application capabilities.

In the upcoming section, you'll discover how to integrate unit tests into your build pipeline using the NUnit test framework with a sample C# application built on .NET 6.0. Assume you're working within a Visual Studio solution that includes a 'CalculusService' Class Library project, alongside its associated test projects. Below is an example code snippet from the Class Library project:

namespace CalculusService
{
    public class Additions
    {
        public int Add(int number1, int number2)
        {
            return number1 + number2;
        }
    }
}

The following corresponding unit test is defined in a separate test project. Make sure you have a reference to the NUnit, NUnit3TestAdapter, and NUnit.Analyzers NuGet packages:

using NUnit.Framework;

namespace CalculusService.Tests
{
    [TestFixture]
    public class AdditionsTests
    {
        private Additions additions;

        [SetUp]
        public void Setup()
        {
            additions = new Additions();
        }

        [TestCase(1, 2, 3)]
        [TestCase(2, 4, 6)]
        [TestCase(5, -10, -5)]
        public void TestAdd(int number1, int number2, int result)
        {
            Assert.That(additions.Add(number1, number2), Is.EqualTo(result));
        }
    }
}

You will need to use a YAML pipeline, as shown in the following code snippet, to build and execute the automated tests:
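A minimal sketch of such a pipeline follows. NuGetCommand@2, VSBuild@1, and VSTest@2 are built-in Azure Pipelines tasks; the trigger branch, pool image, and solution path pattern are assumptions for this sample:

trigger:
- main

pool:
  vmImage: 'windows-latest'   # assumption: a Windows image, as VSBuild and VSTest require Windows

steps:
- task: NuGetCommand@2
  inputs:
    restoreSolution: '**/*.sln'   # assumption: a single solution in the repository

- task: VSBuild@1
  inputs:
    solution: '**/*.sln'
    configuration: 'Release'

- task: VSTest@2
  inputs:
    testSelector: 'testAssemblies'
    testAssemblyVer2: |
      **\*Tests*.dll
      !**\obj\**
    platform: 'Any CPU'
    configuration: 'Release'
    codeCoverageEnabled: true   # collects code coverage data during the run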

The most important component of this pipeline is its final step, which uses the VSTest@2 task. This task is a versatile, built-in feature of Azure Pipelines designed for executing unit and functional tests, and it supports various test frameworks that leverage Visual Studio's Test Explorer. Setting the codeCoverageEnabled property to true is crucial, as it enables the collection of code coverage data for the application.

Pro Tip: When configuring the VSTest task, particularly when using the testAssemblyVer2 attribute, it's essential to specify a list of patterns that accurately locate the test assemblies for test execution. Failing to do so may result in obscure error messages.

Upon executing unit tests with code coverage enabled, the results will be visible in the Summary view of the pipeline run.

The benefit of using this task is that it automatically publishes test results to the reporting UI built into Azure Pipelines, as shown in the following screenshot:

The test results included in the Azure Pipelines run

"The VSTest@2 task also supports more advanced functionalities, such as executing tests in parallel across multiple agents, which is beneficial for large test suites, and running UI tests that require specific agent configurations.

For applications developed in other programming languages, use the corresponding test runner and publish the test results in a format compatible with the PublishTestResults@2 task, so that the results can be imported and displayed in the same user interface.
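As a hedged illustration for a Node.js project, the snippet below publishes JUnit-formatted results; the npm script and the output path are assumptions that depend on how your test runner is configured:

steps:
- script: npm test   # assumption: the test runner is configured to emit JUnit XML

- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'   # the task also accepts NUnit, VSTest, XUnit, and CTest
    testResultsFiles: '**/test-results.xml'   # assumption: the file your runner writes
    failTaskOnFailedTests: true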

Now that we've covered automated test execution, let's delve into enhancing code quality.

Enhancing code quality

Developers, frequently occupied with meeting deadlines and adding features to their applications, often rely on automated tools to maintain high code quality and security.

There are two critical areas to consider in this context:

1. Static Application Security Testing (SAST): Detects vulnerabilities within the code itself.

2. Software Composition Analysis (SCA): Identifies vulnerabilities in external packages and libraries used by the application.

Benefits of using code quality tools

Integrating these tools early in the development process helps identify bugs and vulnerabilities before they impact end-users, potentially saving time and resources.

There are numerous third-party tools available for scanning and evaluating code quality. In this post, we focus on SonarQube due to its popularity and user-friendly interface. SonarQube aids developers in producing clean code by pinpointing bugs, security issues, maintainability problems, and duplicate code, among other aspects.

SonarQube offers various pricing tiers, starting with the free Community Edition, which we'll use for the examples in this post. Advanced features and broader language support are available in the paid tiers.

Other notable tools include Checkmarx, Veracode, OWASP Dependency-Check, WhiteSource, and HP Fortify. While this post doesn't compare these tools, you can find plenty of comparisons online.

Setting up SonarQube analysis

To integrate SonarQube analysis into your pipeline, follow these steps:

1. Configure a SonarQube project.

2. Create a service connection to SonarQube in Azure DevOps.

3. Set up an Azure pipeline to analyze your code.

We'll walk through these steps in the upcoming sections.

Configuring a SonarQube project

In your SonarQube instance, create a new project using the wizard. Choose the 'From Azure DevOps' option to facilitate integration, as illustrated in the following figure:

Creating a project in SonarQube using the From Azure DevOps option

If this is your first time setting up a connection to Azure DevOps, SonarQube will request the following details: Configuration name, Azure DevOps URL, and Personal Access Token. These details are necessary for configuring the project.


Create a configuration

Then, choose the Azure DevOps project to configure on the SonarQube side by selecting it from the available list and clicking on the Set up selected repository button:

Selecting an Azure DevOps project in SonarQube
Creating a service connection to SonarQube in Azure DevOps

The next step involves establishing a service connection to the SonarQube instance. This enables Azure Pipelines to utilize the SonarQube extension and interact with the SonarQube instance.

To create and manage service connections, navigate to the Azure DevOps Project Settings, and click on the "New service connection" button. This will display a list where you can search for the desired option by typing a few characters in the search field. Typing "Sonar" should bring up the SonarQube option.

The SonarQube service connection option

The next step is to provide details in the Server URL, Token, Service connection name, and Description (optional) boxes. Don’t forget to check the Grant access permission to all pipelines option, unless you want to manage access to the service connection separately for each pipeline:

Service connection details for SonarQube
Creating an Azure pipeline to analyze your code

The next step is to include two tasks that are available in the SonarQube extension in your build pipeline. Let’s look at the following pipeline:
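A minimal sketch for the .NET sample used earlier is shown below. The SonarQubePrepare and SonarQubeAnalyze tasks come from the extension; the service connection name, project key, and task major version (@5 here, which depends on the extension version you installed) are assumptions:

steps:
- task: SonarQubePrepare@5
  inputs:
    SonarQube: 'SonarQubeConnection'   # assumption: the service connection created earlier
    scannerMode: 'MSBuild'             # analyzes .NET projects as part of the build
    projectKey: 'CalculusService'      # assumption: the project key defined in SonarQube
    projectName: 'CalculusService'

- task: DotNetCoreCLI@2
  inputs:
    command: 'build'
    projects: '**/*.csproj'

- task: SonarQubeAnalyze@5   # finishes the analysis and sends the results to the SonarQube server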

Reviewing SonarQube analysis results

The results of the security scan will be available in the SonarQube portal. Depending on the nature of your project, you will get different quality indicators and recommendations to improve your code, as shown in the following figure:

SonarQube analysis results

Integrating with Jenkins for artifacts and release pipelines

In this section, we will walk through a straightforward setup that demonstrates how to connect Azure Pipelines and Jenkins. This connection will allow you to download an artifact generated in Jenkins and deploy it using release pipelines.

A Jenkins job is similar to an Azure Pipeline; it is an automated sequence of steps that perform actions and can produce artifacts or carry out deployments. Let’s learn how to create a simple Jenkins job.

Creating a Jenkins job that produces an artifact

This scenario assumes that we have a project called PacktFamily in a Jenkins server, as shown in the following figure:

A Jenkins instance with a PacktFamily project

The configuration for the Jenkins job is very simple in this scenario, solely to demonstrate the ability to download an artifact on the Azure Pipelines side. The following figure shows the build steps for producing artifact.txt:

The build step in a Jenkins job for creating the artifact
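The build step itself can be a single Execute shell command along these lines (a sketch; the exact text written to the file is an assumption):

# Jenkins "Execute shell" build step: write a build-stamped file into the workspace
echo "Artifact produced by build $BUILD_NUMBER" > artifact.txt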

The following figure shows the post-build actions for artifact.txt:

Post-build action publishing the Jenkins artifact

The execution of the Jenkins job will yield a single artifact that can be downloaded by Azure Pipelines.

Jenkins job results and artifacts
Creating a service connection to Jenkins in Azure DevOps

This process is similar to the one discussed in the "Creating a Service Connection to SonarQube in Azure DevOps" section.

In the Azure DevOps project settings, navigate to the Service Connections section and click the "New service connection" button. Use the search box to type "Jenkins," then click "Next."

The New service connection dialog

Provide the Server URL, Username, Password, and Service connection name details. Don’t forget to check the Grant access permission to all pipelines box if needed. Finally, click the Verify and Save button to proceed:

The New Jenkins service connection dialog
Creating a release pipeline to use Jenkins artifacts

Now, it is time to configure a release pipeline. You can follow these steps:

1. Navigate to Project > Pipelines > Releases and click on New Release Pipeline. You will be presented with the option to select a template, as shown in the following screenshot. We will begin with an empty job:

Selecting a template

2. Click on the "Add an artifact" widget and select the Jenkins option. This allows you to use the previously created service connection to choose the project in Jenkins from which you will use artifacts.

Simply select the service connection that matches the name you created in the previous step, and then choose the corresponding Source (job) option.

Adding a Jenkins artifact

3. You have the option to change the Source alias detail, which determines the directory where artifacts will be downloaded once the pipeline executes. This is important when dealing with multiple artifacts from different sources, as it prevents files from being overwritten. In this case, the default value will suffice.

4. Next, we can add steps to the Deploy stage to verify and display the contents of the artifact. Click on the "1 job, 0 task" option in the Deploy stage to customize the pipeline. For this scenario, we will use a Linux agent. By clicking on the "Agent job" option, as shown in the screenshot below, you will access the Agent selection section. Here, select "Azure Pipelines" from the Agent pool dropdown and "ubuntu-latest" from the Agent Specification dropdown within the Agent selection section.

Selecting an agent in the Deploy stage

5. Once you’ve done this, click the "+" button on the right-hand side of the Agent Job section to search for and add the Command Line task. This task can execute a custom script on the agent and will switch to the appropriate underlying process based on the operating system.

The command-line task to list the contents of the artifact

This script will do the following (a sketch of the script itself follows the list):

  • List the contents of the current directory, which should be where the agent is running the current pipeline
  • Move into the directory where the Jenkins artifacts were downloaded
  • List the contents of the current directory, which should be where the Jenkins artifacts were downloaded
  • Print a label indicating the contents of the file that will be displayed
  • Print the contents of the artifact.txt file
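A minimal sketch of that script, assuming the Source alias was left at a value matching the Jenkins job name (the directory name below is an assumption; check the release logs for the actual download path):

# List the contents of the agent's working directory for this run
ls
# Move into the directory where the Jenkins artifacts were downloaded
cd PacktFamily
# List the downloaded artifacts
ls
# Print a label, then the contents of the artifact
echo "Contents of artifact.txt:"
cat artifact.txt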


6. After saving the pipeline and creating a release to execute it, you should be able to see that it successfully downloads the artifacts from Jenkins and lists the contents of the file, as shown in the figure below:

The logs of the pipeline downloading Jenkins artifacts

In this post, we explored the extensibility model of Azure DevOps and how the marketplace of extensions makes it easy to add features to your build and release pipelines. This enhances your ability to create and integrate pipelines with other tools efficiently.

We also learned how to improve application quality by integrating automated tests and security scans, which alert developers to any issues or vulnerabilities. This reduces the time needed to find and fix bugs and mitigates security risks before deploying applications to production.

Additionally, we covered how to integrate Azure Pipelines to download artifacts from another CI/CD tool for deployment, which is useful in hybrid setups where different teams use different CI/CD tools. Finally, we discussed the flexibility of Microsoft-hosted agents in Azure Pipelines, allowing you to implement CI/CD needs without managing the infrastructure.

In the next post, we will learn about monitoring Azure DevOps Pipelines, which is an important task that ensures everything is working correctly and that, if things go wrong, we get the visibility needed to fix them promptly.
