Azure DevOps Best Practices: Enhancing Collaboration and Security

Azure DevOps - Identity & Governance

Introduction

As your organization grows and you add multiple teams, you end up with many developers working together. One of the trickiest things about the cloud in general is how to give them the flexibility to iterate and deploy fast while at the same time applying security boundaries, whether to meet compliance regulations or simply to have a check in place, so you know one team is not going to break another team's work.

One way to illustrate this is with fruits and vegetables in a supermarket:

[Image: Azure DevOps Services - Governance Demo]

Default Isolation Model

The organization in this example is called “governance-demo” and it contains several projects. The project called “projects-fruits” is isolated to just the "fruits" team, and the project called “projects-veggies” is isolated to just the "veggies" team.

[Image: Default Isolation Model]

The idea is that with the default project isolation, you give each team control over its own project, which contains its repos and pipelines, and the team has the flexibility to do whatever it wants. But you obviously lose some collaboration, because by default those two teams ("fruits" and "veggies") cannot communicate with each other.

One of the biggest requests is: how do we collaborate on a shared backlog and build features together? If you keep that isolated model for code and deployments, you keep the project separation (“projects-fruits” and “projects-veggies”), and you add a separate project that only has Azure Boards enabled (in this example it’s called “shared collaboration”).

[Image: Default Collaboration Model]

You could do it that way and save yourself the security overhead of managing permissions when it comes to code and pipelines. The caveat here is that you can't link code commits to work items in a different project.

What many organizations then try to do is a "supermarket" model:

[Image: "Supermarket" Model]

With the "supermarket" model (collaboration by default), you must apply permissions to lock everything down for the individual teams. That is a bit trickier, because some of the decisions you make up front are hard to reverse.

A must-have tip: Azure Active Directory integration and Azure Active Directory groups

When you have an organization, don't add users under the “Organization Settings” -> “General” -> “Users” tab.

[Image: "Users" tab in Azure DevOps]

The proper way to add users is to integrate Azure Active Directory (AAD). You then use AAD identities both for authentication (who you are) and for authorization (what you are allowed to do). Everything maps back to AAD users and groups.

[Image: Azure Active Directory Integration in Azure DevOps]

The biggest advantage is that your Azure DevOps (ADO) teams, to which you apply ADO permissions, come from AAD (e.g., you assign the “Project Administrators” ADO role to an AAD group to give its members admin rights). So if somebody changes roles or leaves the company, you only have to update their role or group memberships in one place: AAD. You do it once, and on both the Azure Resource Manager (ARM) side and the ADO side, that person no longer has administrator or contributor rights.

Azure DevOps - Azure Pipelines

Use YAML, not Classic UI

The next tip is to use “Pipelines as Code”, i.e., YAML pipelines. Classic pipelines are legacy from Team Foundation Server (TFS). If you're using them right now, they are not going away, but try to migrate to YAML when you can, because the industry is moving toward pipelines as code.

When you're building pipelines, there are differences between classic and YAML pipelines, because YAML can't do everything a classic pipeline can in exactly the same way. But if you compare them feature by feature, especially when it comes to security, there's a way to do everything you probably want with YAML as well; it's just a little different from the Classic UI.

Variables in YAML

Many engineers want to put everything in a variable from the very beginning. But when you don't yet know what the pipeline is supposed to look like, or how you want to separate your resources, consider keeping variables as close as possible to the code. One pro tip is to put the “pipeline conditions” of your YAML pipeline in a variable at the very top:

[Image: Variables in YAML]
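A minimal sketch of this tip follows; the variable names and the branch condition are illustrative, not taken from the article:

```yaml
trigger:
  branches:
    include:
      - main

variables:
  # Pro tip: keep the pipeline condition in a variable at the very top,
  # so anyone reading the file sees the gating logic first.
  isMainBranch: ${{ eq(variables['Build.SourceBranch'], 'refs/heads/main') }}
  buildConfiguration: 'Release'

steps:
  - script: echo "Building $(buildConfiguration)"
    displayName: Build

  - script: echo "Deploying"
    displayName: Deploy
    # Reuse the condition variable instead of repeating the expression.
    condition: and(succeeded(), eq(variables.isMainBranch, 'True'))
```

Because the condition lives in one named variable, changing the deployment rule (e.g., to a release branch) means editing a single line at the top of the file.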

Variables in UI

If you look at a pipeline in the ADO UI, you'll also see a “Variables” tab, where you can define variables to be used by your CI / CD / IaC YAML pipelines. That feature exists mainly so that you can mark something as a secret in ADO instead of putting it inside the pipeline YAML code itself.

Using Library Groups & Prefixes

The preferred approach is: first, don't use secrets in your pipelines at all, because you should be using service connections where you can. Then use “Libraries”, which load a set of variables into the pipeline. You can also apply a “lib-*” prefix, so that when you are debugging your YAML pipeline you know where a variable is coming from: if it has the “lib-*” prefix, it comes from a library; if it doesn't, it's probably defined directly in the YAML.
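A sketch of the prefix convention, assuming a variable group named "lib-shared-settings" already exists in the project's Library (the group and variable names are examples):

```yaml
variables:
  - group: lib-shared-settings   # loads library variables, e.g. lib-registry-name
  - name: localBuildMode         # defined directly in this YAML file
    value: 'Release'

steps:
  - script: |
      echo "From the library group: $(lib-registry-name)"
      echo "Defined in this YAML:   $(localBuildMode)"
    displayName: Show where variables come from
```

When a run fails, the prefix immediately tells you whether to look in the Library UI or in the repository's YAML to find the variable's definition.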

Key Vault Integration

For handling secrets in your YAML pipelines, you can use Key Vault integration, and all of these variables can start with a “kv-*” prefix, so you know they live in Key Vault.
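A sketch of fetching a secret at runtime with the AzureKeyVault task; the service connection name ("sc-arm-prod"), vault name, and secret name are placeholders:

```yaml
steps:
  - task: AzureKeyVault@2
    inputs:
      azureSubscription: 'sc-arm-prod'   # ARM service connection with access to the vault
      KeyVaultName: 'my-keyvault'
      SecretsFilter: 'kv-db-password'    # fetch only the secrets you need
      RunAsPreJob: false

  # The secret is now available as a masked pipeline variable.
  - script: echo "Connecting with the secret in $(kv-db-password)"
    displayName: Use the Key Vault secret
```

The secret value never appears in the YAML or the logs (ADO masks it), and the "kv-*" prefix signals its origin while debugging.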

Use “Service Connections” (aka credentials) for modern apps

To protect your Azure resources, you have to protect the credentials. In ADO you'll see something called “Service Connections”. When you're thinking of environments, just think of service connections.

You can have service connections, for example, to GitHub, to Docker Hub, to a private container registry, and of course to ARM. In terms of understanding service connections, you can think of them as just credentials.

Some best practices around service connections include:

  • Number of connections: Create separate service connections for different environments (e.g., development, staging, production) to ensure proper access control and separation of concerns.
  • Scope: Choose the appropriate scope for each service connection based on the principle of least privilege. Limit the scope to the specific resources or resource groups that the connection needs to access.
  • Security: Secure your service connections by using AAD service principals and role-based access control (RBAC). Assign the minimum required permissions to the service principal to reduce the risk of unauthorized access.
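The per-environment practice above can be sketched as one service connection per stage; the connection names ("sc-arm-dev", "sc-arm-prod") are illustrative:

```yaml
stages:
  - stage: DeployDev
    jobs:
      - job: Deploy
        steps:
          - task: AzureCLI@2
            inputs:
              azureSubscription: 'sc-arm-dev'   # credential scoped to the dev resource group
              scriptType: bash
              scriptLocation: inlineScript
              inlineScript: az deployment group list --resource-group rg-dev --output table

  - stage: DeployProd
    dependsOn: DeployDev
    jobs:
      - job: Deploy
        steps:
          - task: AzureCLI@2
            inputs:
              azureSubscription: 'sc-arm-prod'  # separate credential scoped to production
              scriptType: bash
              scriptLocation: inlineScript
              inlineScript: az deployment group list --resource-group rg-prod --output table
```

Because each stage uses its own connection, the dev credential can never touch production resources, and checks can be applied to the production connection only.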

Approvals and Checks

Classic pipelines have a nice UI where you can say things like: before deploying to production, go through a few approvals and checks. In YAML pipelines you can do this as well.

[Image: Approvals and Checks]

If you click on “Approvals and Checks” on any of your service connections in ADO, you can see the different options you have for checks. Business hours checks are common, as you don't want to deploy to production in the middle of the night when nobody is around to respond to issues. You can also require approval from a particular person. Essentially, ADO will not release access to those credentials for a deployment unless the designated person approves it (or the run falls inside the configured business hours, etc.). This is how you place a check on a deployment: by protecting the service connection.
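The checks themselves are configured in the ADO UI (on a service connection or on an environment); the YAML only has to target the protected resource. A sketch, with the environment name as a placeholder:

```yaml
jobs:
  - deployment: DeployWeb
    # Any approvals and checks configured on the "production" environment
    # (or on the service connections its steps use) gate this job.
    environment: production
    strategy:
      runOnce:
        deploy:
          steps:
            - script: echo "Deploying to production"
```

The job pauses at runtime until every configured check passes, so the gating policy lives in one governed place rather than being copied into each pipeline.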

Why not use “Automatic” Service Connections

The reason to consider not adding service connections “automatically” when prompted (even if the UI says “Recommended”; it says that because it's easy, generating one for you without you having to think about it) is that it doesn't scale well. You end up with a bunch of app registrations (service principals) in your AAD that might share the same name and tell you nothing about what they are, what they are scoped to, or what they are doing.

What you want is to create service connections (which, remember, are just credentials) whose purpose you can tell just by looking at their name.

[Image: Why not use “Automatic” Service Connections]

How to share service connections across projects

Another thing you can do with service connections is that you can share them across projects. For example, if you have a central team that is maybe providing a central resource for auditing reasons, you can go to the service connection in ADO, click on “Security” and then you can give it access to another project by clicking on “Add Project” under “Project Permissions”.

[Image: Sharing Service Connections across Projects]

Allow list for Pipelines

A service connection cannot be used until there are pipelines permitted to use it. You can find this under the “Pipeline permissions” section of the service connection page in ADO.

[Image: Pipeline permissions]

So, creating a service connection doesn't mean that the credential is automatically available to every code project or pipeline in your ADO project. A specific pipeline gets access only once you grant it, and you grant it once; it's not going to ask you every time.

Why you can’t share service connection across organizations

An organization in ADO is a logical boundary and a governance boundary, and service connections are scoped to that organization. It's also important to understand that when you create service connections, those credentials live in ADO, so if you need the same service connection (or the same underlying credentials) in multiple ADO organizations, you face the challenge of managing multiple copies.

One way around that is to use Azure Key Vault: create a library that is integrated with Azure Key Vault and put the credentials you want to share there, rather than creating them as service connections.

Use Multiple Pipelines

You can always have multiple pipelines per repository. A single file can do everything, but as your workflows become more complex, it might be easier to split your pipelines into CI / CD / IaC / drift detection, etc.
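One common way to split pipelines in a single repository is with path filters, so each pipeline runs only when its part of the code changes. A sketch of a CI-only pipeline (e.g. a file like azure-pipelines-ci.yml); the paths are illustrative:

```yaml
trigger:
  branches:
    include:
      - main
  paths:
    include:
      - src/        # application code triggers this CI pipeline
    exclude:
      - infra/      # infrastructure changes are handled by a separate IaC pipeline

steps:
  - script: echo "Running CI for application code"
    displayName: CI
```

A sibling IaC pipeline in the same repository would mirror this with a path filter on "infra/", keeping each workflow small and independently triggered.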

Agent Pool Selection

When using Azure DevOps, you have the option to choose between Microsoft-hosted agents and self-hosted agents:

Microsoft-hosted agents are managed by Microsoft and are automatically updated with the latest tools and patches. They are suitable for most scenarios, as they require minimal setup and maintenance. However, they may have limitations in terms of available resources and customization.

Self-hosted agents, on the other hand, are managed by you and can be customized to meet your specific requirements. They provide more control over the environment and resources but require more setup and maintenance. Self-hosted agents are recommended when you have specific requirements that cannot be met by Microsoft-hosted agents, such as custom software or network configurations.

In general, consider starting with Microsoft-hosted agents for simplicity and ease of use. If you encounter limitations or have specific requirements, consider using self-hosted agents.
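The choice is a one-line difference in the pipeline's pool definition. A sketch, where "ubuntu-latest" is a Microsoft-hosted image and the self-hosted pool name is a placeholder:

```yaml
# Microsoft-hosted agent: minimal setup, Microsoft patches the image.
pool:
  vmImage: 'ubuntu-latest'

# Self-hosted alternative (use instead of the block above), for custom
# software or network requirements; "my-selfhosted-pool" is hypothetical:
# pool:
#   name: 'my-selfhosted-pool'
#   demands:
#     - docker

steps:
  - script: echo "Running on agent $(Agent.Name)"
    displayName: Show agent
```

Starting with the hosted image and switching the pool block later keeps the rest of the pipeline unchanged if requirements grow.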

Conclusion

In conclusion, ADO provides robust features for identity and governance, allowing organizations to effectively manage their teams, projects, and resources. By leveraging default isolation models, teams can maintain control over their projects while still ensuring collaboration and shared backlogs through alternative approaches like the "supermarket" model. Integrating AAD for user management provides a centralized and efficient way to handle authentication and authorization, reducing administrative overhead.

Using YAML pipelines as code instead of the Classic UI enables greater flexibility and scalability in managing pipelines, and variables can be strategically placed either in YAML or within libraries for efficient resource management.

Service connections play a crucial role in securing credentials and enabling seamless integration with various platforms, while approvals and checks add an extra layer of control and governance to the deployment process. It is recommended to carefully manage service connections and pipeline permissions to ensure proper access control and avoid cluttering Azure Active Directory with unnecessary app registrations. Organizations can share service connections across projects and leverage Azure Key Vault integration for sharing credentials across multiple Azure DevOps organizations.

Finally, utilizing multiple pipelines, choosing the appropriate agent pool (Microsoft-hosted or self-hosted), and regularly updating pipeline configurations based on evolving workflows and requirements contribute to an efficient and reliable development and deployment process within Azure DevOps.
