Creating Pipelines for Dataverse for Teams without Managed Environments

Dataverse for Teams is a great tool for creating apps, flows, and other components. It allows you to store everything in one place and collaborate with your team. However, when it comes to deploying your apps to other environments (another team), it can be challenging. With the recent release of Pipelines for Managed Environments, you can automate the deployment process. But what if you don't have access to this premium feature? In this article, we will create pipelines that automate the deployment of Dataverse for Teams components without any premium features or additional licenses.


Prerequisites

  • Microsoft 365 or Office 365 license with Power Apps Seeded license.

According to the February License Guide, it could be one of the following licenses: Office 365 E1, Office 365 E3, Office 365 E5, Office 365 F3, Microsoft 365 Business Basic, Microsoft 365 Business Standard, Microsoft 365 Business Premium, Microsoft 365 F3, Microsoft 365 E3, Microsoft 365 E5, Office 365 A1, Office 365 A3, Office 365 A5, Microsoft 365 A1, Microsoft 365 A3, Microsoft 365 A5.

  • At least two teams in Microsoft Teams: one for development and one for your end users (like a production environment).

In this article, I'll use the DEV team, where I've created a canvas app, and the PROD team, where I'd like to deploy the app and where users will work with it.

  • An app (or another component) created in the DEV team (in the Dataverse for Teams).

In this article, I'll use a small welcome app with the name "Pipelines for E3 demo".

  • Project in Azure DevOps.

You can use Azure DevOps for free to create pipelines if your DevOps organization has up to five members.

If you don't have an Azure DevOps instance, go to https://dev.azure.com/ and start for free. I'll write a separate article with step-by-step instructions on how to create a DevOps organization and a project. Coming soon!


Well, let's start our magic!


Step 1. Prepare the solution for the deployment process

As you know, in the Power Platform ecosystem, solutions are the mechanism for implementing ALM. You can easily create a solution in Dataverse using the maker portal. But if you open https://make.powerapps.com and try to find your Dataverse for Teams environment in the list of environments, you won't find it, because only Dataverse environments created in the Power Platform Admin Center are listed there. In other words, the Power Apps Maker Portal lists only environments of type Trial, Developer, Sandbox, or Production. At least for now.

How do you open your Dataverse for Teams environment in the Maker Portal? It's easy! There are several options, and the easiest one is to open your Dataverse for Teams environment directly from Microsoft Teams.

Go to the Power Apps app in Microsoft Teams. Click Build in the top menu. Select the dedicated team and click the three dots next to the team name. Then click the Open in Power Apps link.

[Image: Open Dataverse for Teams environment in the Maker Portal]

The Dataverse for Teams environment will open in a new tab. Your Dataverse for Teams environment includes three system solutions. All components that you create in Teams (apps, flows, etc.) are stored in the Common Data Services Default Solution.

However, you can't use the Common Data Services Default Solution for CI/CD pipelines. You need to create a new solution and add all the necessary components to it.

To create a new solution, click the New solution button, fill in the Display name for your new solution, select a Publisher, and click the Create button.

[Image: Create a new solution in the Dataverse for Teams environment]

Once the solution has been created, open it, click the Add existing button, and select the type of component you want to add.

[Image: Add existing components into your custom solution (in the Dataverse for Teams environment)]

On the new screen, select all components you want to add and click the Add button.

The solution in the Dataverse for Teams environment now includes the necessary components and is ready to be deployed to our target environment (the target team in our case).


Step 2. Configure DevOps project

To automate the deployment process, we need to store the solution's code in a repository. In this article, I'm going to use Azure DevOps for this purpose.

By the way, Azure DevOps is a great product not only for storing code and creating pipelines. Azure DevOps can support your ALM at all stages. Try it out for task tracking, version control, and other capabilities as well.


We need to configure the Azure DevOps project to make sure everything is ready to create and run pipelines successfully.

1. Manage permissions

Go to the project settings by clicking the Settings button. Click Repositories in the left-hand menu and open the Security tab. Select the team under the Users section and allow the following options for this team:

  • Contribute
  • Manage notes

[Image: Configure permissions for the Azure DevOps project]

These permissions are needed to store code in the repository and to commit changes.

2. Ensure that a parallel job is allocated to the DevOps organization

Click Parallel jobs in the left-hand menu. Check that in the Microsoft-hosted section you see the Free tier and that 1 parallel job is available.

[Image: Check the availability of a parallel job for the project]

Otherwise, you need to request the free tier for your DevOps organization by submitting this form: https://aka.ms/azpipelines-parallelism-request.

3. Initialize a branch in the repository

Go to the repository by clicking Repos in the left-hand menu. Click the Initialize button.

[Image: Initialize the branch in the Azure DevOps repository]

The main branch will be created and you'll see its README file. Later, if needed, you can customize this README.

4. Install Power Platform Build Tools

The last preparation step for our DevOps project is installing the Power Platform Build Tools. This extension provides the tasks we will need to configure our pipelines.

To install the Power Platform Build Tools in your Azure DevOps organization, click the Marketplace button and choose the Browse marketplace option.

[Image: Open Marketplace]

On the Marketplace page, type "Power Platform" in the search field and press Enter. Choose the Power Platform Build Tools extension.

On the next screen click Get it free, and then click the Install button.

After the installation completes, go back to your Azure DevOps project.

Keep in mind that the Power Platform Build Tools extension is installed for the entire Azure DevOps organization. Any projects you create later in this organization will automatically have access to the Power Platform Build Tools without installing the extension again.


Step 3. Create the first pipeline in Azure DevOps to store the code in the Repo

Now that we have our code repository and our solution prepared, we can create our first pipeline in Azure DevOps.

Go to Pipelines by clicking this button in the left-hand menu and click the Create Pipeline button.

[Image: Create a pipeline in Azure DevOps]

As true citizen developers, we are going to use the drag-and-drop interface instead of code (even if it is simple YAML). So click Use the classic editor on this screen to start creating the pipeline from scratch without code.

[Image: Use the classic editor for creating a pipeline]

On the next screen, check that everything is selected correctly. Usually the preselected options are correct, especially for a new project where you have just one team, one branch, etc. Click the Continue button to proceed.

On the next screen choose the Empty job as we are going to start from scratch.

[Image: Choose Empty job for your pipeline]

The new pipeline is ready to be configured. Before we proceed with adding steps, let's make a few configurations.

Go to Get sources, scroll down, and uncheck the Shallow fetch option.

[Image: Uncheck the Shallow fetch option]

Then go to Agent job 1 and check the Allow scripts to access the OAuth token option so that your pipeline can commit changes.

[Image: Check the Allow scripts to access the OAuth token option]

And now we are ready to add the steps to our first pipeline. We are going to use Power Platform Build Tools for most of them.

Click the plus icon, type power platform in the search field, and add the following steps in this order:

- Power Platform Tool Installer - this step installs the Power Platform tooling on the build agent so that the extension's tasks can run

- Power Platform Export Solution - this step exports the specified solution from your environment

- Power Platform Unpack Solution - this step unpacks the exported solution zip file into source files that can be stored in the Azure DevOps repository

[Image: Add Power Platform Build Tools steps]

Then type command line in the search field and add the Command Line step. This step will commit the changes to your Azure DevOps repository.


All steps have been added. Let's configure them.

1. Configure step Power Platform Export Solution

In this step, we should provide information about the environment where our solution is stored, authentication information, and the solution name.

Authentication

You can use one of the two available authentication options; select the appropriate one. In this article, I'm going to use Username/password. To configure authentication, create a service connection: click Manage and then click Create service connection in the new window.

[Image: Create a service connection]

Choose Generic and click Next.

Provide details for the new service connection:

Copy the Server URL value from Developer Resources in the Power Apps Maker portal.

Provide your User name and Password, as well as Service connection name. Once you are ready, click Save.

[Image: Provide information for the new service connection]

Go back to your pipeline, refresh the Service connection field, and select the connection you've created.

Solution Name

This field stores the solution name. Following best practices, we will keep it in a variable and reuse the variable throughout the pipeline. So, provide the variable reference in this field:

$(PowerPlatform.SolutionName)        

Solution Output File

This field defines the path and file name of the zip file that will be generated when the solution is exported. We will store this information in a variable as well. Type the following path in this field:

$(Build.ArtifactStagingDirectory)\$(PowerPlatform.SolutionName).zip        

The configured step should look as follows:

[Image: Configured step Power Platform Export Solution]

2. Configure step Power Platform Unpack Solution

In this step we should provide variables for two fields:

- Solution Input File (the path where we stored the exported solution):

$(Build.ArtifactStagingDirectory)\$(PowerPlatform.SolutionName).zip        

- Target Folder to Unpack Solution (the path to our repository):

$(Build.SourcesDirectory)\$(PowerPlatform.SolutionName)        

The configured step should look as follows:

[Image: Configured step Power Platform Unpack Solution]

3. Configure step Command Line Script

This step will commit our changes to the repository branch. Provide the following code in the Script field:

echo Commit Power Platform Solution
git config user.email "[replace with your email]"
git config user.name "[replace with your name]"
git checkout main
git pull origin
git add --all
git commit -m "Automatic solution commit"
git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin main

The configured step should look as follows:

[Image: Configured step Command Line Script]


All steps are configured. The last thing we need to do before running the pipeline is to specify the value of the variable that stores the solution name.

Go to Variables and click the Add button. Provide the variable name (PowerPlatform.SolutionName) and its value. The value should be the Name of your solution (not the Display Name).

[Image: Configure the variable]


Now everything is ready in the first pipeline and we can run it to store the code in the repository.

Click the Save & queue button and then the option with the same name in the drop-down menu. Add a comment and click Save and run.

[Image: Save and queue]


The pipeline has been started. You can click Agent Job 1 to open the details.


Once the pipeline finishes its job, you'll receive an email notification with the status, all steps will be marked according to their status, and the code will appear in the Azure DevOps repo.

[Image: The code has been stored in the repository]
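If you ever prefer to define this first pipeline as YAML instead of the classic editor, it would look roughly like the sketch below. The task names come from the Power Platform Build Tools extension; the exact input names, the service connection name, and the solution name are assumptions based on the configuration above, so verify them against the task versions installed in your organization.

```yaml
# Rough YAML equivalent of the export pipeline (assumed task/input names).
trigger: none

pool:
  vmImage: 'windows-latest'

variables:
  PowerPlatform.SolutionName: 'PipelinesForE3Demo'   # hypothetical solution Name

steps:
- checkout: self
  fetchDepth: 0              # equivalent of unchecking Shallow fetch
  persistCredentials: true   # lets the script step push with the OAuth token

- task: PowerPlatformToolInstaller@2

- task: PowerPlatformExportSolution@2
  inputs:
    authenticationType: 'PowerPlatformEnvironment'    # username/password connection
    PowerPlatformEnvironment: 'DEV service connection'   # assumed connection name
    SolutionName: '$(PowerPlatform.SolutionName)'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\$(PowerPlatform.SolutionName).zip'

- task: PowerPlatformUnpackSolution@2
  inputs:
    SolutionInputFile: '$(Build.ArtifactStagingDirectory)\$(PowerPlatform.SolutionName).zip'
    SolutionTargetFolder: '$(Build.SourcesDirectory)\$(PowerPlatform.SolutionName)'

- script: |
    git config user.email "[replace with your email]"
    git config user.name "[replace with your name]"
    git checkout main
    git pull origin
    git add --all
    git commit -m "Automatic solution commit"
    git -c http.extraheader="AUTHORIZATION: bearer $(System.AccessToken)" push origin main
```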


Step 4. Deploy the solution to the target Dataverse for Teams (production)

For deployment purposes, let's create a second pipeline. The process is the same; the steps for the second pipeline are as follows:

- Power Platform Tool Installer

- Power Platform Pack Solution

- Power Platform Import Solution

1. Configure step Power Platform Pack Solution

In this step we should provide variables for two fields:

- Source Folder of Solution to Pack:

$(PowerPlatform.SolutionName)        

- Solution Output File:

$(Build.ArtifactStagingDirectory)\$(PowerPlatform.SolutionName).zip        

The configured step should look as follows:

[Image: Configured step Power Platform Pack Solution]

2. Configure step Power Platform Import Solution

Service connection

To create a service connection for this step, you'll need the target Dataverse for Teams environment's details. If you don't have a Dataverse for Teams instance in your target team yet, you can provision it by starting to create a component (e.g. a canvas app) in Teams for that team and cancelling the process once the environment is ready. In other words, don't create any components manually in the target environment; just provision the Dataverse for Teams environment.

Once the environment in the target team is ready, you can create a service connection for it as we did in the previous step.

Solution Input File

In this field provide the path:

$(Build.ArtifactStagingDirectory)\$(PowerPlatform.SolutionName).zip        

The complete step should look like the following:

[Image: Configured step Power Platform Import Solution]


Don't forget to add a variable with the solution name.
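As with the first pipeline, here is a rough YAML equivalent of the deployment pipeline. The task names come from the Power Platform Build Tools extension; the input names, service connection name, and solution name are assumptions, so check them against your installed task versions.

```yaml
# Rough YAML equivalent of the deployment pipeline (assumed task/input names).
pool:
  vmImage: 'windows-latest'

variables:
  PowerPlatform.SolutionName: 'PipelinesForE3Demo'   # hypothetical solution Name

steps:
- task: PowerPlatformToolInstaller@2

- task: PowerPlatformPackSolution@2
  inputs:
    SolutionSourceFolder: '$(PowerPlatform.SolutionName)'
    SolutionOutputFile: '$(Build.ArtifactStagingDirectory)\$(PowerPlatform.SolutionName).zip'

- task: PowerPlatformImportSolution@2
  inputs:
    authenticationType: 'PowerPlatformEnvironment'     # username/password connection
    PowerPlatformEnvironment: 'PROD service connection'   # assumed connection name
    SolutionInputFile: '$(Build.ArtifactStagingDirectory)\$(PowerPlatform.SolutionName).zip'
```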


Once everything is ready, you can Save and queue this pipeline.


When the second pipeline finishes, you'll see your app (or whatever component you created in the development team) in the target (production) team.

[Image: The application has been successfully deployed to the target team]


Additional tip

If you want to trigger the second pipeline after the first pipeline completes successfully, make the following configuration in the second pipeline:

Open the second pipeline, go to Triggers and check the Enable continuous integration option.

[Image: Configure integration between the two pipelines]
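In a YAML pipeline, the same chaining can be expressed with a pipeline resource trigger. The snippet below is a sketch; the name of the first pipeline is an assumption, so replace it with the actual name from your project.

```yaml
# Run this (second) pipeline automatically when the first one completes.
resources:
  pipelines:
  - pipeline: exportPipeline            # local alias for the resource
    source: 'Export solution to repo'   # assumed name of the first pipeline
    trigger: true                       # fire on successful completion
```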



So, in less than an hour, you can automate your CI/CD process without any premium features or additional licenses!

Thanks for reading!


#dataverse #teams #azuredevops #powerplatform #cicd


More articles by Katerina Chernevskaya
