In Dynamics 365 for Finance and Operations, Azure storage is used to store files for Attachments based on Document types that have Azure storage selected in the Location field, to handle temporary files such as the EMF files sent to Document Routing Agents when printing reports to Network printers, and more. Additionally, Docentric offers the option to store your Docentric designs (aka templates) on Azure Blob storage, and also to print reports to Azure storage by specifying a particular container and blob path on the Print destination settings form.
We introduced saving reports to Azure storage as a replacement for printing reports to the File system, but also as a convenient way to integrate with external systems.
A question that often pops up is how a user can access the reports printed to Azure storage from D365FO: browse them, download them, delete them, and so on. Luckily, there is a free tool, Microsoft Azure Storage Explorer, a standalone app that resembles Windows Explorer and makes working with Azure storage data pretty easy.
MS Azure Storage Explorer
You can download Microsoft Azure Storage Explorer from here and check Microsoft's Get started with Storage Explorer tutorial.
Connect to D365FO Azure storage account from MS Azure Storage Explorer
This tutorial will show you two ways of connecting from MS Azure Storage Explorer to the Azure storage account used by D365FO:
- Option 1: Connect by using your business Microsoft account,
- Option 2: Connect by using D365FO connection string.
Option 1: Connect by using your business Microsoft account
One way to connect to the D365FO Azure storage from Microsoft Azure Storage Explorer is by using your business Microsoft account, provided that you have been granted this access.
When you download and install Microsoft Azure Storage Explorer and run it for the first time, you have to choose how to connect to Azure storage. The image below shows the default settings, which offer the option to connect to an Azure storage account or service. We can leave it like this and click Next.
We now have to sign in to our Microsoft Azure account, as shown in the images below.
After a successful login, you select the resources from your subscriptions that you want to show in the explorer. In the example below, only Microsoft Partner Network is selected.
And that's it! You should be able to see your Azure storage resources and browse them. The usage of Microsoft Azure Storage Explorer is beyond the scope of this tutorial, but you'll find quite a few tutorials on the Internet.
Option 2: Connect by using D365FO connection string
If you can't use Option 1 described above, there is another way – using the same connection string that D365FO uses to connect to Azure storage. Just follow the instructions below. Please note that you should first consult your system administrator to check whether this breaks any company security policies that might apply.
Obtaining D365FO connection string
If you are using Docentric AX 3.3.5 or above, you can get the connection string by entering this URL in your web browser:
<YourDomain>/?cmp=usmf&mi=SysClassRunner&cls=DocAzureBlobHelper
For example, if you are using a OneBox development environment, your URL will look like this:
https://usnconeboxax1aos.cloud.onebox.dynamics.com/?cmp=usmf&mi=SysClassRunner&cls=DocAzureBlobHelper
If you use an older version of Docentric or you are not using Docentric at all :), you can create a runnable class with the following code snippet:
using Microsoft.Dynamics.Clx.ServicesWrapper;

// Runnable class (the class name is arbitrary) that prints the storage connection string and certificate thumbprint.
internal final class GetCsuStorageConnectionString
{
    public static void main(Args _args)
    {
        info('CsuStorageConnectionString = ' + CloudInfrastructure::GetCsuStorageConnectionString());
        info('CsuClientCertificateThumbprint = ' + CloudInfrastructure::GetCsuClientCertificateThumbprint());
    }
}
In the Infolog you will see the connection string in the CsuStorageConnectionString line; it is everything after the first '=' character. Copy it to connect to your Azure storage from MS Azure Storage Explorer.
Connecting to Azure Storage with the connection string
With the connection string copied, we can start Microsoft Azure Storage Explorer again, select that we want to connect using a connection string, and click Next. In the next window we enter a value in the Display name field and paste our connection string into the Connection string field.
In the next window we see the Connection Summary information. If this is what we want, we click the Connect button.
Now we can use MS Azure Storage Explorer to browse and download our Azure storage resources, e.g. the reports printed from D365FO.
This doesn't work on the cloud version and the connection string comes back empty.
Hi Baseer,
This should work on the cloud version. Please contact us at support@docentric.com and we will check it out together.
Would this work if you establish a connection from Azure Storage Explorer to a Tier-2 or even a Production environment?
Yes, it works. However, you have to take care not to corrupt the data on Azure Blob storage.
If I download a file from the documents container via Storage Explorer, then delete it in cloud storage and upload it back (via Storage Explorer), D365FO doesn't show it in the attached documents (the filemanagement link with the D365-generated display?access_token). With an Azure SAS token I can see the file as before. Do you know what the reason is?
I have managed to delete the file and upload it back in Storage Explorer and see it in D365FO Attachments. But the name (GUID) has to be identical and so does the metadata (especially FullFileName, which contains the Base64-encoded file name).
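To illustrate that last point, here is a minimal X++ sketch (the class name and file name are made up) of how a FullFileName metadata value can be produced, assuming it is simply the Base64 encoding of the UTF-8 file name:

internal final class DocFullFileNameMetadataSketch
{
    public static void main(Args _args)
    {
        // Base64-encode a file name the way the FullFileName blob metadata is assumed to store it.
        str fileName = 'Invoice_000123.pdf';   // made-up example file name
        System.Byte[] utf8Bytes = System.Text.Encoding::get_UTF8().GetBytes(fileName);
        str base64Name = System.Convert::ToBase64String(utf8Bytes);
        info(strFmt('FullFileName metadata value: %1', base64Name));
    }
}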
Is there a way to connect your Dev (CHE) or Test (sandbox) environment to the Production Azure storage location so that you can pull down production files from your dev or test box?
Currently it just errors out, because dev and test boxes don't have access to the production Azure storage locations, and a copy of the database does not bring the Azure storage content over to your dev/test box.
So it shows there is an attachment, but if you try to open it, it fails. Any way to point the dev or test box to the production storage?
The safe approach is to copy the documents container from the Prod to the Dev/Test storage account.
Use:
– the DocAzureBlobHelper class to get the connection string of each environment,
– Azure Storage Explorer > Get Shared Access Signature (SAS) for the documents container on each storage account,
– the AzCopy tool to copy the container content (see the example command below).
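With AzCopy v10, the copy command can look roughly like this (the storage account names and SAS tokens are placeholders):

azcopy copy "https://<prod-account>.blob.core.windows.net/documents?<prod-sas>" "https://<devtest-account>.blob.core.windows.net/documents?<devtest-sas>" --recursive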
You can also set AzureStorage.StorageConnectionString in the web.config of your Dev environment, but connecting to the Prod storage is risky.
The following line will give you AzureStorage.StorageConnectionString on environments where you cannot access web.config. In the next Docentric version we will add it to the DocAzureBlobHelper printout.
System.Configuration.ConfigurationManager::AppSettings.Get('AzureStorage.StorageConnectionString')
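For convenience, this call can be wrapped in a runnable class, just like the earlier snippet (the class name below is arbitrary):

internal final class DocPrintStorageConnectionString
{
    public static void main(Args _args)
    {
        // Reads the decoded Azure storage connection string from the AOS application settings.
        info(System.Configuration.ConfigurationManager::AppSettings.Get('AzureStorage.StorageConnectionString'));
    }
}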
An update to my last answer:
If you copy the encoded AzureStorage.StorageConnectionString from one web.config to the other, it will not work. It seems it is encrypted with an environment-specific certificate. But you can put the decoded AzureStorage.StorageConnectionString from the other environment into your web.config and it will work.
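For reference, a decoded (plain-text) Azure storage connection string follows the standard format below (the account name and key are placeholders), whereas the encrypted value in web.config is an opaque string that only the owning environment can decrypt:

DefaultEndpointsProtocol=https;AccountName=<storage-account-name>;AccountKey=<account-key>;EndpointSuffix=core.windows.net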
Regarding copying with AzCopy: the files are copied correctly and everything looks the same, but D365FO does not recognize them. It is not clear what causes the problem. So I have written a D365FO job that imports files from an external Azure storage container, and it works fine. It can be used to copy attachments from Prod to a Dev/Test storage account. I can share it with you if you want.
Hi Miha, I’m looking for a way to bring attachments from PRD to tier 2 environments. Please share if you got a solution for that.
Attachments stored on the Prod Azure storage can be imported into the target environment with a script. It assumes that the target environment has a copy of the Prod database, so the attachment records are in the database but the files are missing.
Sample code here. Source container name is always “documents”.
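For readers who cannot open the linked sample, the core copy step could look roughly like the sketch below. This is not the linked script: the class name, variable names and placeholder values are mine, it copies a single blob only, and it assumes the Microsoft.WindowsAzure.Storage assembly referenced by the standard application. Note that the blob metadata (e.g. FullFileName) would also have to be carried over, as discussed above.

using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

internal final class DocCopyAttachmentBlobSketch
{
    public static void main(Args _args)
    {
        // Placeholders - obtain the connection strings as described earlier in this post.
        str sourceConnStr = '<Prod storage connection string>';
        str targetConnStr = '<Dev/Test storage connection string>';
        str blobName      = '<attachment GUID>';

        CloudBlobContainer sourceContainer = CloudStorageAccount::Parse(sourceConnStr)
            .CreateCloudBlobClient().GetContainerReference('documents');
        CloudBlobContainer targetContainer = CloudStorageAccount::Parse(targetConnStr)
            .CreateCloudBlobClient().GetContainerReference('documents');

        CloudBlockBlob sourceBlob = sourceContainer.GetBlockBlobReference(blobName);
        CloudBlockBlob targetBlob = targetContainer.GetBlockBlobReference(blobName);

        if (sourceBlob.Exists(null, null) && !targetBlob.Exists(null, null))
        {
            // Stream the blob content from the source container to the target container,
            // keeping the same blob name (GUID). Blob metadata is not copied here.
            System.IO.MemoryStream stream = new System.IO.MemoryStream();
            sourceBlob.DownloadToStream(stream, null, null, null);
            stream.Seek(0, System.IO.SeekOrigin::Begin);
            targetBlob.UploadFromStream(stream, null, null, null);

            info(strFmt('Copied blob %1 to the target documents container.', blobName));
        }
    }
}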
Thank you Miha, I tested it in UAT and it works perfectly. Thank you.