How to access D365FO Azure Storage when EnableSharingOfValidStorageConnectionString flight is disabled

Microsoft Dynamics 365 Finance and Operations (D365FO) uses an internal Azure Storage account as the default storage for various standard functionalities like:

• Attachments

• Document templates

• File download

• Printing via the Document Routing Agent (DRA)

• etc.

It can also be used from custom code whenever file storage (permanent or temporary) is needed, as you can rely on it existing in all environments.

Microsoft is deprecating the old way of accessing the D365FO internal Azure Storage account and introducing a new one. All developers who use the internal Azure Storage account in their custom code will need to learn the new approach and its limitations and adjust their custom code accordingly. Unfortunately, Microsoft has not yet published documentation for the new API, nor guidelines for the transition. That was the motivation for me to do some research and publish my findings in this article.

NOTE: The article presents the author's understanding based on public sources, discussions with MS engineers, and his own research and reverse engineering. Some details might not be completely correct (and can also change over time), but most of the presented facts are correct and form a valid big picture.

Old way (use connection string)

The old way of working with the internal storage account in custom code is to get its connection string and use the Azure Storage SDK libraries (Azure.Storage.* or the older Microsoft.Azure.Storage.*) to access the storage account.

using Microsoft.Dynamics.Clx.ServicesWrapper;
using Microsoft.DynamicsOnline.Infrastructure.Components;
str connectionString = CloudInfrastructure::GetCsuStorageConnectionString(); // or
str connectionString2 = SharedServiceUnitStorage::GetDefaultStorageContext().StorageConnectionString;

This connection string contains a valid account key which serves for authentication. Anyone with such a connection string can get full access to the data in this storage account (data plane). The article Connect to D365FO Azure Storage from MS Azure Storage Explorer explains how to do that.

Sample Azure Storage connection string:

DefaultEndpointsProtocol=https;AccountName=prtnrsndbx1lmlvjz3t;AccountKey=zUCfhp/hFjIkjMj7C94ABCDwyWH0D2liqhvi9uRxrE4L/qOUSbh81MN5chE3QjlIOzAEy0WMxE6A+AStbEkYFA==;EndpointSuffix=core.windows.net

(some parts are invalidated for security reasons)
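For illustration, here is a minimal sketch of the old pattern, assuming the Microsoft.Azure.Storage client library is referenced (the container and file names are hypothetical):

using Microsoft.Azure.Storage;
using Microsoft.Azure.Storage.Blob;

// Anyone holding this connection string gets full data-plane access.
str connectionString = CloudInfrastructure::GetCsuStorageConnectionString();
CloudStorageAccount account = CloudStorageAccount::Parse(connectionString);
CloudBlobClient blobClient = account.CreateCloudBlobClient();
CloudBlobContainer container = blobClient.GetContainerReference('test');    // hypothetical container
CloudBlockBlob blob = container.GetBlockBlobReference('MyFile.txt');        // hypothetical file
// X++ requires all CLR parameters, so the optional ones are passed as null.
str content = blob.DownloadText(null, null, null, null);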

Connection string access replaced by managed identity access

The Azure team discourages using the account key to access a storage account, because it gives full permissions to anyone who knows it. A storage account has only two account keys, which is useful for key rotation, but you cannot have a different key for each user.

To overcome this limitation, role-based access control (RBAC) is a more secure alternative that allows fine-grained permissions and named access. It means that access to specific objects (storage accounts, containers, etc.) is granted through roles assigned on these objects (or their parent objects) to Entra ID users and applications.

Because storage account key access is discouraged, you can disable it on your storage accounts. As I understand it, Microsoft will soon disable it on the D365FO internal Azure storage account.

NOTE: Changing the settings of the internal Azure storage account requires access to its management plane. For the D365FO internal Azure storage account this has always been reserved for Microsoft (except on CHE and local devbox environments).

(Screenshot: Storage account settings)

Managed identity access will be used instead of account key access. As I understand it, it will be the managed identity of an Azure container or virtual machine that runs an additional service taking care of authentication. Only Microsoft internal code should be able to access this service; Microsoft treats these details as internal, does not reveal them, and can change them without notice.

All new code required for this transition is already in 10.0.42 (some parts already in 10.0.41), but it is flight controlled. As we cannot control flights on Tier 2+ environments, this means that even though your connection string based code works now, it will simply stop working some day in the near future on some environments, without further notice.

The timeline for this transition has not been clearly announced, but as I understand it, most tenants will be transitioned in the first half of 2025. It should first apply to sandbox environments and shortly afterwards to production. There will be a transition period in which the customer can report a problem and MS support can temporarily revert the environment to legacy (connection string based) access. Currently there is no way to simulate the change to test that your code will keep working when managed identity access is enforced.

Sharing links

Sharing links can be signed with an account key or a user delegation key. The latter is more secure, as the system validates the permissions of the key creator at every access and allows easier revocation.

Account SAS URL (no hard limit on expiration):

https://srpeap63.blob.core.windows.net/test?sp=r&st=2024-12-09T22:57:46Z&se=2024-12-10T06:57:46Z&spr=https&sv=2022-11-02&sr=c&sig=BrSbUjBuokivOICyR9mTqjA1eEK9pURGzAIYSn8SKgU%3D

User delegation SAS URL (for security reasons the max expiration is 7 days):

https://srpeap63.blob.core.windows.net/test?sp=r&st=2024-12-09T22:57:46Z&se=2024-12-10T06:57:46Z&skoid=313328fd-f8ac-4dcb-926f-dcff6db26bd8&sktid=c61cd524-b8a8-49b4-8460-1f2084055f1d&skt=2024-12-09T22:57:46Z&ske=2024-12-10T06:57:46Z&sks=b&skv=2022-11-02&spr=https&sv=2022-11-02&sr=c&sig=pOax3XTW1DrwR%2BiLOZicD0WIJXz%2FNaab2mZYuSuyzng%3D

With the transition to managed identity access the API will return user delegation SAS sharing links, and old account SAS links will not work anymore. Some months after this transition MS also plans to block public access to the internal Azure storage account on the network level. It will then only be available from the AOS. An additional proxy service will be provided to expose files/containers publicly. As I understand it, the API will then return these new sharing links (different hostname and path, but the same parameters).

New way (SharedServiceUnitStorage API)

The following API handles managed identity access (by inserting the authentication header) when called from D365FO. It is not documented, but you can find examples in standard code. The API is not new, but now you will be forced to use it, which was not the case before.

Some of these methods return a connection string or take one as an input parameter. After the transition to managed identity access this connection string will contain an invalid account key. Such a connection string cannot be used for access outside D365FO, but it can still be used with this D365FO API, as the platform takes care of authentication.

SharedServiceUnitStorage API (Microsoft.DynamicsOnline.Infrastructure.Components):

• Category based CRUD operations for files – tracked and deleted after expiration (UploadData(), GetData(), DownloadData(), DeleteData(), GetCategorySasKey(), RevokeCategoryPolicy(), GetDataInCategory())

• Container based operations – (GetContainerLink(), GetFileLink())

• CloudStorageAccount / BlobServiceClient / TableServiceClient – raw access through Azure SDK clients, but with D365FO authentication (GetSharedServiceUnitStorageAccount(), GetSharedServiceBlobServiceClient(), GetSharedServiceTableServiceClient())

A SharedServiceUnitStorage object is constructed as shown below. This will keep working in the future; just note that storageContext.StorageConnectionString might contain an invalid account key.

StorageContext storageContext = SharedServiceUnitStorage::GetDefaultStorageContext();
var blobStorageService = new SharedServiceUnitStorage(storageContext);        

Category based CRUD operations for files

A category is a logical unit for storing files (BLOBs or data) on the storage (BLOB storage in the cloud, the filesystem in on-premises environments). In the cloud, files are usually stored in a container with the same name as the category, but for each file there is also a record in the ShareStorageDetails table in the storage account. It contains data about retention (permanent, temporary), expiration, and accessibility (private, public). D365FO automatically deletes expired temporary files.

Code sample (file upload, download, delete, sharing link, list files):

StorageContext storageContext = SharedServiceUnitStorage::GetDefaultStorageContext();
SharedServiceUnitStorage blobStorageService = new SharedServiceUnitStorage(storageContext);            
                
var blobInfo = new SharedServiceUnitStorageData();
blobInfo.Id = guid2Str(newGuid());
blobInfo.Category = 'test';
blobInfo.Name = blobInfo.Id + '/MyFile.txt';
blobInfo.Accessibility = Accessibility::Private;
blobInfo.Retention = Retention::Temporary;
blobInfo.ExpirationDuration = System.TimeSpan::FromDays(7);

var encoding = System.Text.Encoding::UTF8;
str inputStr = "Sample file content.";
MemoryStream inputStream = new MemoryStream(encoding.GetBytes(inputStr));

blobStorageService.UploadData(blobInfo, inputStream);
info('UploadData succeeded.');

SharedServiceUnitStorageData blobInfo2 = blobStorageService.GetData(blobInfo.Id, blobInfo.Category, BlobUrlPermission::Read, System.TimeSpan::FromMinutes(10));
info('GetData.BlobLink = ' + blobInfo2.BlobLink);
Browser br = new Browser();
br.navigate(blobInfo2.BlobLink);

// container based method GetFileLink(str containerName, str fileName, ...) returns the same result
str fileLink  = blobStorageService.GetFileLink(blobInfo.Category, blobInfo.Name, BlobUrlPermission::Read, System.TimeSpan::FromMinutes(10), false, false);
str fileLink2 = blobStorageService.GetFileLink(blobInfo.Category, blobInfo.Name, BlobUrlPermission::Read, System.TimeSpan::FromMinutes(10), false, true); // invokedExternally = true
info('GetFileLink = ' + fileLink);
info('GetFileLink external = ' + fileLink2);
DocGlobalHelper::assert(blobInfo2.BlobLink == fileLink);
DocGlobalHelper::assert(fileLink == fileLink2);

MemoryStream outputStream = new MemoryStream();
blobStorageService.DownloadData(blobInfo.Id, blobInfo.Category, outputStream);
str outputStr = encoding.GetString(outputStream.ToArray());
info('DownloadData outputStr = ' + outputStr);
DocGlobalHelper::assert(inputStr == outputStr);

// list all files in category
var list = blobStorageService.GetDataInCategory(blobInfo.Category, BlobUrlPermission::Read, System.TimeSpan::FromMinutes(10), false);
info('GetDataInCategory succeeded.');
System.Collections.IEnumerator e = list.GetEnumerator();
while (e.MoveNext())
{
    SharedServiceUnitStorageData item = e.Current;
    info(item.BlobLink);    // BlobLink with default expiration of 40 min
}

blobStorageService.DeleteData(blobInfo.Id, blobInfo.Category);

Another upload example, this one from a method that receives the file ID, name, and stream as parameters:
var blobInfo = new SharedServiceUnitStorageData();
blobInfo.Id = fileId;
blobInfo.Category = "projectclient-files";
blobInfo.Name = fileName;
blobInfo.Accessibility = Accessibility::Private;
blobInfo.Retention = _isTemporary ? Retention::Temporary : Retention::Permanent;
blobInfo.ExpirationDuration = System.TimeSpan::FromDays(7);

var blobStorageService = new SharedServiceUnitStorage(SharedServiceUnitStorage::GetDefaultStorageContext());
blobStorageService.UploadData(blobInfo, _stream);        

Get file download link:

SharedServiceUnitNotFoundException sharedServiceUnitNotFoundException;
var blobStorageService = new SharedServiceUnitStorage(SharedServiceUnitStorage::GetDefaultStorageContext());
var strategy = new FileUploadTemporaryStorageStrategy();
SharedServiceUnitStorageData uploadedBlobInfo;
try
{
    uploadedBlobInfo = blobStorageService.GetData(
        _uploadFileID,
        FileUploadTemporaryStorageStrategy::AzureStorageCategory,
        BlobUrlPermission::Read,
        System.TimeSpan::FromMinutes(strategy.getBlobLinkExpirationTimeSpanInMinutes()),
        false);
    uploadFileURL = uploadedBlobInfo.BlobLink;
}
catch (sharedServiceUnitNotFoundException)
{ … }        

If you use the same file name, the file will be overwritten.

(Screenshots: ShareStorageDetails table; BLOB (file) in a container)

Files stored in a category have additional metadata in the ShareStorageDetails table. If a file with the same name is uploaded multiple times, only the last version is stored, but each upload creates a separate details record.
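A minimal sketch illustrating this behavior (the category and file names are hypothetical):

StorageContext storageContext = SharedServiceUnitStorage::GetDefaultStorageContext();
SharedServiceUnitStorage blobStorageService = new SharedServiceUnitStorage(storageContext);
var encoding = System.Text.Encoding::UTF8;

// First upload: new Id, Name 'MyFile.txt' in category 'test'.
SharedServiceUnitStorageData v1 = new SharedServiceUnitStorageData();
v1.Id = guid2Str(newGuid());
v1.Category = 'test';
v1.Name = 'MyFile.txt';
v1.Accessibility = Accessibility::Private;
v1.Retention = Retention::Temporary;
v1.ExpirationDuration = System.TimeSpan::FromDays(1);
blobStorageService.UploadData(v1, new MemoryStream(encoding.GetBytes('version 1')));

// Second upload: different Id, same Name - the BLOB is overwritten.
SharedServiceUnitStorageData v2 = new SharedServiceUnitStorageData();
v2.Id = guid2Str(newGuid());
v2.Category = 'test';
v2.Name = 'MyFile.txt';
v2.Accessibility = Accessibility::Private;
v2.Retention = Retention::Temporary;
v2.ExpirationDuration = System.TimeSpan::FromDays(1);
blobStorageService.UploadData(v2, new MemoryStream(encoding.GetBytes('version 2')));

// The BLOB now contains 'version 2'; ShareStorageDetails holds a record per upload.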

Container based operations

SharedServiceUnitStorage contains two methods to get a sharing link for a container or for a file in a container, listed below with a usage sketch after the list. Note that after the transition to managed identity access these will be user delegation SAS links, whereas before the transition they are account SAS links:

• string GetContainerLink(string containerName, bool isContainerPublic) – returns a sharing link with max validity (7 days)

• string GetFileLink(string containerName, string fileName, BlobUrlPermission permission /*Read, Write, NoPermission*/, TimeSpan expirationDuration, bool useContentDisposition, bool invokedExternally)
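A minimal usage sketch (assuming a container named 'test' with a file 'MyFile.txt' already exists; both names are hypothetical):

StorageContext storageContext = SharedServiceUnitStorage::GetDefaultStorageContext();
SharedServiceUnitStorage blobStorageService = new SharedServiceUnitStorage(storageContext);

// Sharing link for the whole container, with isContainerPublic = false.
str containerLink = blobStorageService.GetContainerLink('test', false);
info('GetContainerLink = ' + containerLink);

// Sharing link for one file; the same call as in the category sample above.
str fileLink = blobStorageService.GetFileLink('test', 'MyFile.txt',
    BlobUrlPermission::Read, System.TimeSpan::FromMinutes(10), false, false);
info('GetFileLink = ' + fileLink);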

CloudStorageAccount / BlobServiceClient / TableServiceClient

If you need more advanced low-level methods, you can still use the client classes from the Azure SDK, but you need to instantiate them through the SharedServiceUnitStorage methods below in order to let the platform handle the authentication.

• public CloudStorageAccount GetSharedServiceUnitStorageAccount()

• public CloudStorageAccount FetchSharedServiceUnitStorageAccount()

• public BlobServiceClient GetSharedServiceBlobServiceClient()

• public TableServiceClient GetSharedServiceTableServiceClient()

Code sample:

StorageContext storageContext = SharedServiceUnitStorage::GetDefaultStorageContext();
SharedServiceUnitStorage blobStorageService = new SharedServiceUnitStorage(storageContext);            
                
BlobServiceClient blobServiceClient = blobStorageService.GetSharedServiceBlobServiceClient();
BlobContainerClient blobContainer = blobServiceClient.GetBlobContainerClient('test'); // container names must be lowercase
BlobClient blob = blobContainer.GetBlobClient('MyFile.txt');
// It is hard to work with BlobServiceClient in X++ as most methods have parameter types with generics. Where possible, call it from C#.
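For completeness, a hedged sketch continuing the sample above (the blob name is hypothetical; Upload(Stream) is one of the simpler overloads without generics, so it can be called from X++):

// Upload a small text blob through the authenticated client.
MemoryStream inputStream = new MemoryStream(System.Text.Encoding::UTF8.GetBytes('Sample file content.'));
blob.Upload(inputStream); // throws if the blob already exists; delete it first or use an overwrite overload from C#
info('Upload through the shared service BlobClient succeeded.');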

Summary

As we have seen, low-level access to the D365FO internal Azure Storage account will change dramatically. If you have been using Azure Storage Explorer, long-lasting sharing links, or file shares, these will not be possible anymore. Your custom code needs to be validated against the new rules and API. But here comes the catch: you cannot force the transition (or revert it) on your test environment for testing purposes. MS will do the transition at their own pace, tenant by tenant, so it can happen that a customer gets it before its partner. The best you can do is to use the presented standard API and hope for the best. MS will first do the transition to managed identity access, then disable account key access, and later move the storage account to a private network. In case of problems, ask MS support to temporarily revert to legacy access.

Even though MS engineers have done a good job redesigning and securing the storage access, and even though they have put certain steps under flight control, the whole transition is still not testable and the API documentation is missing. This leaves everyone else in uncertainty, and I hope it will not become a pattern for the future.

