When I was writing this post, I felt the environment I had set up did not enhance my ability to write code. The reason is that I use jsdom, which is a mock of the browser's HTML DOM. Because it is not a real HTML environment, my focus shifted to the test code instead of the implementation code.
I asked my friend about how hard it is to implement unit testing in PCF development (I need to install various NPM packages just for testing, mostly because the standard HTML APIs are not there!). He replied that Node.js is not the same as JavaScript in the browser: PCF development runs in Node.js, but at runtime we rely on browser APIs rather than Node.js functions. So my focus is set once again on making it better. We…
Make sure the account (service principal) used has the Storage Blob Data Contributor or Storage Blob Data Owner role assigned, which is required for uploading the files.
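As a sketch, the role assignment can be done with the Azure CLI; the app ID, subscription, resource group, and storage account names below are placeholders:

```shell
# Assign the Storage Blob Data Contributor role to the service principal
# at the storage-account scope. All bracketed values are placeholders.
az role assignment create \
  --assignee "<app-id>" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
```

The same assignment can also be done from the Access control (IAM) blade of the storage account in the Azure portal.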
Run the command.
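The upload command looks roughly like the following; the local folder path, storage account, and container name are placeholders:

```shell
# Upload the local Customers folder, including its subfolders and files,
# to the target container. All bracketed values are placeholders.
azcopy copy "C:\Customers" "https://<storage-account>.blob.core.windows.net/<container>" --recursive=true
```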
We can see the folder and files transferred successfully.
Check other posts –
Transfer files using the Azure Blob Upload Task and the Premium File Transfer Task in an SSIS package
We got the below error while trying to transfer files to Azure Blob Storage using AzCopy
INFO: Authentication failed, it is either not correct, or expired, or does not have the correct permission -> github.com/Azure/azure-storage-blob-go/azblob.newStorageError, /home/vsts/go/pkg/mod/github.com/!azure/azure-storage-blob-go@v0.10.1-0.20201022074806-8d8fc11be726/azblob/zc_storage_error.go:42
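If the role assignment is in place, one way to refresh an expired or incorrect login, assuming Azure AD authentication is being used, is to sign in again; the tenant ID below is a placeholder:

```shell
# Clear any cached credentials, then re-authenticate AzCopy with Azure AD.
# The tenant ID is a placeholder.
azcopy logout
azcopy login --tenant-id "<tenant-id>"
```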
Here we will pick the folder Customers along with its subfolders and the files inside it, and transfer its content to the Azure Blob container.
Create a new SSIS Package and drag the Premium File Transfer Task to the control flow designer.
Double-click the task and specify the following Source properties:
Action – Send Files. The other options are Delete Files, Create Directory, and Remove Directory.
Check the option – Include Subdirectories
Connection Manager – Local File
Directory Path – specify the folder
Similarly, for Destination, we can specify the Azure Blob Storage connection manager and the directory path as shown below
Note – Make sure we have already added the connection for it to be available inside the connection manager option of the Premium File Transfer Task. The other connection types supported are FTPS, SFTP, Amazon S3, Azure Data Lake Storage, Box, Dropbox, Hadoop, OneDrive, and SharePoint.
Let us run the package.
We can see the content transferred successfully to Azure Blob Storage.
Also, check out –
Using Azure Blob Storage component with Dynamics 365