Project Grasshopper: DIY Pipelines with PowerShell and PAC CLI

Last week, I shared my post, 10 Lessons Learned about DevOps in Power Platform. The real lessons were in the comments, where Agnius Bartninkas (whose Power Automate Desktop framework is the real deal) made the gentle suggestion that my environment strategy was not what he would recommend in most cases, if ever. But if I insisted on continuing with my setup, he added, there were two ways forward: Azure DevOps Pipelines (instead of Power Platform Pipelines); or PAC CLI “with some PowerShell magic.”

Magic, you say?

Let’s do it.

To recap: A simple and effective three-environment strategy consists of a single solution moving from Dev to Test and Prod using Power Platform Pipelines. First, you make changes in Dev. Then, get someone to try those changes in Test. Finally, deploy to Prod. Me, I want at least three development sandbox environments, with the option to spin up more, all linked to the same base layer and apps.

The idea is to have a set of base layer database tables powering different apps and different websites for different clients. For example, with my QR Cards program for destination marketing organizations, the base layer handling content management for destinations and activities can support one app for QR Card generation and a separate app for generative itinerary suggestions. These two apps don’t need to live together in the same environment, nor should they have to share the same website.

And someone might come along and say, “Hey, we love the product, but can we build a different website experience?” Or “Can we change the business logic for creating QR Cards?”

I want to say yes to those requests: “Yes, you can build a different website experience, and you can change the business logic, because you’ll have your own low-code development environment.” Do it yourself.

What’s the point of a low-code environment unless everybody gets to play?

Here’s a sketch of my approach:

  • Start with 3x Dev sandboxes: 1 for Base Layer, 1 for Apps, 1 for Sites.
  • 1 Test environment
  • 1 Prod environment
  • Power Platform Pipelines from Dev to Test and Prod.
  • Automated export/import to move unmanaged solutions to managed solutions in downstream Dev environments.
  • Branch either with new Apps + Sites environment, or only new Sites environment. Each new branch gets new Test and Prod environments.

[Diagram: DIY pipelines]

The hard part is keeping the versions synchronized between Dev environments.

In a simple application lifecycle management (ALM) strategy, a change in Dev needs only two deployments: one to Test and one to Prod.

With my approach, any change also requires deployments to every downstream development environment. When you make a change to an App, you need to update one or more Sites environments. And if you make a change to the Base Layer, you need to update everything to the right – all the Apps, all the Sites, all at once.

pac 4ever, man!

I fired up VS Code with GitHub Copilot and started experimenting with the Microsoft Power Platform CLI (formerly the PowerApps CLI), invoked with “pac” at the command line.

It really is magic. Instead of going to Power Apps, finding your solution, going to the menu and selecting Export, and then waiting for a link to download a file through your browser, you can type a one-line command that puts your <solution>.zip file wherever you like.
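For example, an export can be as simple as the two lines below. The solution name, environment URL, and file path are placeholders of my own invention; check pac solution export --help for the full set of flags, and note that older pac versions use --url instead of --environment for authentication.

```powershell
# Authenticate against a Dev environment (placeholder URL), then export in one line.
pac auth create --environment "https://myorg-dev.crm.dynamics.com"
pac solution export --name "QRCards" --path ".\exports\QRCards.zip"
```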

So how about a simple export/import script that says: “first, download solutions a, b, c from environments x, y, z; and then upload them where specified”?

About 1,300 lines of (AI-generated, human-reviewed) code later, that’s exactly what I wrote.

It ain’t perfect, but it gets the job done.

NOTES FROM PROJECT GRASSHOPPER

Day One (Wednesday): do not attempt

  • The obvious approach is to write a script with a list of pac auth create and pac solution export commands, followed by another script with pac solution import. But wait! To authenticate, I need usernames and passwords. And I don’t want to store those in a script. That would be bad.
  • How about Azure Key Vault? Super! I’ll put the pac auth create username and password in the vault. Log into Azure once, retrieve the secrets, and automate from there.
  • Did someone say automation? How about some Bicep scripts? Bicep is the way to go for creating Azure resources. One Bicep script to create a resource group, and another Bicep script to create a service principal and a key vault accessible by that service principal. Oh, and I’ll need a storage account in the resource group, so let’s add that with permissions set for the service principal. And in the script, I’ll generate matching unique suffixes so that we know that they’re paired up.
  • Next, a nifty store-credentials.ps1 script that prompts the user for credentials and then stores them as an Azure Key Vault secret. (Note: GitGuardian does not approve.)
  • Wait a second – the service principal needs a password, which means that I’m back to the bad idea of storing secret strings locally. Sure, I could have a Key Vault with a password to the other Key Vault, but that’s turtles all the way down.
  • I need to get with the times: managed identities!
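That store-credentials idea can be sketched roughly like this, assuming the Az PowerShell module is installed and the vault already exists; the vault and secret names are made up for illustration:

```powershell
# Prompt for credentials interactively so nothing secret lands in the script file.
$cred = Get-Credential -Message "Enter the Power Platform account credentials"

# Store both halves in Key Vault. Get-Credential hands us the password as a
# SecureString already; the username needs converting first.
Set-AzKeyVaultSecret -VaultName "grasshopper-kv" `
    -Name "dev-username" `
    -SecretValue (ConvertTo-SecureString $cred.UserName -AsPlainText -Force)
Set-AzKeyVaultSecret -VaultName "grasshopper-kv" `
    -Name "dev-password" `
    -SecretValue $cred.Password
```

Of course, as noted above, this only pushes the problem around: something still has to authenticate to the vault itself.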


[Image: Bicep does this.]

Day Two (Thursday): one step forward, two steps back

  • And the Bicep scripts were rewritten, and lo and behold, I can programmatically create a resource group containing a managed identity with controlled access to any number of paired key vaults and storage containers.
  • Now let’s try sending my credentials straight into an Azure Key Vault from the VS Code command line. What? “Failed to connect to MSI. Please make sure MSI is configured correctly and check the network connection.” Okay, it turns out you can’t get there from here. I suppose that’s what managed identity means – if it’s just me sitting outside the wall, sending messages from an IDE terminal, that doesn’t meet anyone’s definition of secure.
  • What’s inside the wall? How do I execute PowerShell scripts inside of Azure in a way that lets me use managed identities?
  • Here’s one possible answer: Azure Cloud Shell. You can set it up with a persistent storage account to hold all your scripts. I worked with it for a while but then realized even if I got it to work, I would have to learn vi or emacs again and life is too short.
  • Another answer is Azure Automation. This lets you set up process automation for runbooks written in PowerShell or Python, track jobs for a runbook, assign workers to jobs, and keep track of the status of each job. And you can run it in Azure or local environments, and much more. But I haven’t heard any of the Power Platform experts talking about this one, so I’ll mark that as a “maybe.”
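For what it’s worth, once code is running inside Azure with a managed identity attached (an Automation runbook, say), the sign-in becomes a one-liner; the vault and secret names below are placeholders:

```powershell
# Works only from inside an Azure-hosted context that has a managed identity.
# This is exactly the call that fails with the MSI error from a local terminal.
Connect-AzAccount -Identity

# Read a secret without ever handling a stored password ourselves.
$password = Get-AzKeyVaultSecret -VaultName "grasshopper-kv" `
    -Name "dev-password" -AsPlainText
```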


[Image: shell is for children]

Day Three (Friday): hello it’s me me me

  • Time to wave my hands and assume that I’ll figure out authentication somehow, so that I can turn my attention to the export script. And what else to call it but simple-export; later expanded to simple-export-import.
  • Configuration. Created a JSON-format configuration file with a list of environments including unmanaged solutions to be exported and managed solutions to be imported from elsewhere.
  • Export unmanaged solutions. For each environment, loop through solutions and download anything newer than what I already have.
  • The resulting exports folder has subfolders for each environment, solution, and version (major, minor, build, and revision).
  • Import managed solutions. Loop through environments. For each managed solution to be imported, look in exports under the parent environment folder for the latest version. Use pac solution import with the --skip-lower-version flag, letting pac handle the version logic for the return trip.
  • The main script loads the configuration file, checks for the PAC CLI, and then runs export, import, or both.
  • Test. It’s alive!
  • The main problem is that, working from the VS Code terminal out here in the wild, I need to authenticate manually – once for each environment switch. So, six times. For now.
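The version-selection step for the import can be sketched like this. The folder layout and file names are my own conventions, not anything pac prescribes; the nice part is that PowerShell’s [version] accelerator does the four-part major.minor.build.revision comparison for free:

```powershell
# Find the newest exported version folder for a solution, assuming folders
# are named like "1.2.3.4" under exports\<environment>\<solution>\.
$latest = Get-ChildItem ".\exports\BaseDev\BaseLayer" -Directory |
    Sort-Object { [version]$_.Name } -Descending |
    Select-Object -First 1

# Import the managed zip from that folder; --skip-lower-version makes pac
# refuse anything older than what's already installed in the target.
pac solution import --path (Join-Path $latest.FullName "BaseLayer_managed.zip") `
    --skip-lower-version
```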

Day Four (Monday): may I see your badge again please

  • Try to figure out what I did last week.
  • Read up on authentication.
  • Study for SC-900 certification exam on Security, Compliance, and Identity Fundamentals, which I'll be taking next week.

Day Five (Tuesday): what have I done

  • Watch Power Platform ALM videos by Luise Freese and Benedikt Bergmann talking about Azure DevOps.
  • Write a script to count the lines of code in this project.
  • Deploy my science experiment (“Project Grasshopper”) to GitHub.
  • Write this blog post.

Closing thoughts

If I had to do it again, I would start with Azure DevOps.

But I’m glad I took the time to work on this project, for a few reasons:

DevOps appreciation. There’s nothing like a DIY project to motivate your interest in using professional tooling.

Learned a new mode of interactive programming. In the past, we had REPL, the Read-Evaluate-Print Loop. Now, we have CAT PEE:

  • Copilot: Ask AI to write code for you.
  • Accept: Whatever it writes, that looks good.
  • Test: Let’s see if it works.
  • Paste Error: Ask AI to figure out the cryptic errors and just fix whatever’s not working.
  • Explain: Oh, ok, fine, I’ll figure it out myself – just tell me what you wrote earlier.

Ideal science project. It’s not like I have any paying clients for my app yet. But when I do, now they can build whatever they want on top of my solutions!

Skills showcase. Speaking of paying clients, I’ve spent the last year learning Power Platform and so far all I have to show for it are a couple of sweaty #PPCC24 Power Platform t-shirts that I use at the gym. (Oh, and a trophy!)

Beyond building out and commercializing my QR Cards project, I’m looking to work on a professional team in some capacity, to learn from the pros while getting hands-on with business challenges. I speak MBA, I’ve worked with top consulting companies, I know Power Platform and a growing number of Azure capabilities, and look! Now I’m this close to being an ALM pro. I'm particularly interested in working with Dynamics 365, because that’s where it all comes together.

If you have any ideas about a potential fit, let’s talk.



Agnius Bartninkas

Operational Excellence and Automation Consultant | Power Platform Solution Architect | Microsoft Biz Apps MVP | Speaker | Author of PADFramework

1 month ago

Great job. I honestly did not expect you would actually go all the way and try this. Especially when you could have gone with Azure DevOps. But this must have been a great learning experience. Also loving the CAT PEE approach! You should absolutely coin that!
