Converting a Direct Lake model to Import

One of my favorite features of Tabular Editor (both 2 and 3) is its C# scripting capability, which allows you to run code, see the impact immediately, and even undo the changes made by the script using Ctrl+Z.

I have always been a huge fan of productivity tools that let users write scripts, record macros, or set up automation within the tool. Naturally, I wanted Tabular Editor to have such a feature as well. Over the years, it has become clear why this feature is so valuable.

Just yesterday, a user asked if Tabular Editor could help convert a Direct Lake model to an Import-mode model. This isn't an out-of-the-box feature, as this requirement is quite rare. However, when the need arises, performing this conversion manually would be incredibly cumbersome and tedious—especially if the model is large and contains many tables.

Enter C# scripting! Without further ado, here's the script (click the link below to see the code):

https://gist.github.com/otykier/ac53aac2a2f22838c5b2019fd45aaa98

Copy and paste this script into TE2/TE3's C# script editor and run it on a Direct Lake-mode model. It will convert all Direct Lake partitions to their equivalent Import partitions and reset the model collation to its default value (since Direct Lake models are case-sensitive regarding data).
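The gist above is the authoritative version, but to give you a feel for what such a conversion script looks like, here is a rough sketch of the core idea. Note that this is my illustration, not a copy of the gist: the property and method names (`SourceType`, `AddMPartition`), the `DatabaseQuery` shared expression, and the exact M query shape are assumptions about how a Direct Lake model is typically structured, so treat it as pseudocode and use the linked gist for the real thing.

```csharp
// Sketch only — runs inside Tabular Editor's C# script editor, where the
// "Model" object is supplied by the host. Not the actual gist code.
foreach(var table in Model.Tables)
{
    // ToList() so we can modify the partition collection while iterating.
    foreach(var partition in table.Partitions.ToList())
    {
        // Assumption: Direct Lake partitions appear as entity partitions
        // referencing a Lakehouse table through a shared "DatabaseQuery" expression.
        if(partition.SourceType != PartitionSourceType.Entity) continue;

        // Hypothetical M query that re-reads the same Lakehouse table in Import mode:
        var m = string.Format(
            "let\n" +
            "    Source = DatabaseQuery,\n" +
            "    Data = Source{{[Schema=\"dbo\",Item=\"{0}\"]}}[Data]\n" +
            "in\n" +
            "    Data",
            table.Name);

        // Replace the Direct Lake partition with an equivalent Import (M) partition.
        table.AddMPartition(partition.Name + " (Import)", m);
        partition.Delete();
    }
}

// Direct Lake models use a case-sensitive collation; clearing it lets the
// converted Import model fall back to the engine default.
Model.Collation = null;
```

As always with Tabular Editor scripts, you can inspect the result immediately in the TOM Explorer and press Ctrl+Z if anything looks off.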

You can run the script while Tabular Editor is connected to a semantic model through the XMLA endpoint. However, you likely won't be able to save the changes back to the Power BI / Fabric workspace, as in-place conversion of Direct Lake models to Import models is not supported. Instead, you can use the Model > Deploy... option to deploy the newly converted model as a new model in a workspace.

Note: After deploying the newly converted Import-mode model, you will need to specify the credentials for accessing the Lakehouse before you can refresh data into the model. In the Power BI / Fabric workspace, click the ellipsis button next to the model and choose "Settings".

Then, expand the "Data source credentials" section and click "Edit credentials" to ensure the connection from the model to the Lakehouse is using the correct credentials.

In my case, all that was needed was to set the Authentication method to "OAuth2", choose "Public" as the Privacy level for this data source, and hit Sign in.

And lo and behold, our Import-mode model can now be refreshed.


Erik de Jong

Scalable BI Solutions | Freelance BI Consultant

6 months ago

Incredible! Using an ever-expanding collection of C# scripts in Tabular Editor when working on a semantic model never stops feeling magical.

Reply
Thomas LeBlanc

Microsoft Fabric Architect | Business Intelligence Architect | Microsoft Data Platform MVP | Power BI Super User | Speaker | Mentor | Technical Business Strategist | Author

7 months ago

incredible!!!

Reply
Chris Woolderink

Serving "a la carte" DAX dishes or data buffets from a kitchen with Power BI and Azure Data Platform.

7 个月

That's another awesome use case for Tabular Editor's C# scripting functionality. Daniel, thank you for sharing!
