Creating Better PCF Component – Part 2


Temmy Wahyu Raharjo

When I was writing this post, I felt that the environment I had set up did not enhance my ability to write code. The reason is that I use jsdom, which is a mock of the HTML DOM. Because it is not the real HTML object model, my focus shifted to the test code instead of writing the implementation code.

I asked a friend why it is so hard to implement unit testing in PCF development (I need to install various NPM packages just for testing, mostly because the standard HTML APIs are not there!). He replied that Node.js is not the same as JavaScript in the browser: PCF runs on Node.js during development, but at runtime we rely on browser APIs instead of Node.js functions. So my focus is set once again on making it better. We…


How to – Use AzCopy to transfer files from local drive to Azure Blob Storage


AzCopy is a command-line utility that can be used to copy data to and from Azure Storage accounts.

Download the appropriate version of the tool –

https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10#download-azcopy
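Once downloaded and extracted, it is worth confirming the tool is reachable from the command prompt before proceeding (a quick sanity check, assuming azcopy.exe has been added to the PATH):

azcopy --version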

We will upload the files to the following container

Below are the source files

Let us log in first (here we are using Azure Active Directory to authorize AzCopy; the other option is to use a SAS token)

https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10?WT.mc_id=itopstalk-blog-thmaure#authorize-azcopy

It will ask us to open the URL in the browser and enter the code followed by credentials.

After successful sign-in we can close the browser window.
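The login step itself is a single command; the command below is a minimal sketch, where the tenant ID parameter is optional and only needed if the account belongs to a non-default Azure AD tenant:

azcopy login

azcopy login --tenant-id <tenant-id>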

Now let us transfer the directory along with the files inside it using the below syntax

azcopy copy '<local-directory-path>' 'https://<storage-account-name>.<blob or dfs>.core.windows.net/<container-name>' --recursive

in our case,

azcopy copy 'C:\Customers' 'https://storageaccountrg9b58.blob.core.windows.net/mycontainer1' --recursive

Get the URL from the Properties of the container
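Alternatively (a hedged sketch, assuming the Azure CLI is installed and we are signed in with az login), the blob endpoint of the storage account used in this example can be read from the command line instead of the portal:

az storage account show --name storageaccountrg9b58 --query primaryEndpoints.blob --output tsv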

Make sure the account (service principal) being used has the Storage Blob Data Contributor or Storage Blob Data Owner role assigned, which is required for uploading the files.
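If the role is missing, it can be assigned from the portal or, as a rough sketch assuming the Azure CLI and sufficient privileges (the scope below is illustrative), from the command line:

az role assignment create --assignee <object-id-or-app-id> --role "Storage Blob Data Contributor" --scope /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/storageaccountrg9b58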

Run the command.

We can see the folder and files successfully transferred.
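To double-check from the command line (optional, and assuming the signed-in account has read access on the container), AzCopy can list what landed in the container:

azcopy list 'https://storageaccountrg9b58.blob.core.windows.net/mycontainer1'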

Check other posts – 

Transfer files using – Azure Blob Upload task and Premium File transfer task using SSIS Package

https://nishantrana.me/2020/11/24/transfer-files-from-local-drive-to-azure-blob-using-azure-blog-upload-task-ssis/

https://nishantrana.me/2020/11/20/transfer-files-from-local-drive-to-azure-blob-using-premium-file-transfer-task-ssis/

Hope it helps..


Transfer files from local drive to Azure Blob using Azure Blob Upload Task – SSIS


Similar to the Premium File Transfer Task, the Azure Blob Upload Task component can be used to easily transfer files from a local drive to Azure Blob Storage.

https://docs.microsoft.com/en-us/sql/integration-services/control-flow/azure-blob-upload-task

Let us take a simple example to see it in action.

Here we will pick the folder Customers and its subfolders, along with the files inside them, and transfer the content to the Azure Blob container.

Create a new SSIS Package and drag the Azure Blob Upload Task to the control flow designer.

Double click the task and specify the following values as shown below

AzureStorageConnection – specify the SSIS Connection Manager for Azure Storage.

Blob Container – the name of the existing blob container

Local Directory – the local directory containing the files to be uploaded.

Search Recursively – specify whether to search for files within Sub-directories.

File Name – specify the pattern for the files to be selected.

Time Range from/to – to pick files modified within that range.

Let us execute the package.
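(Outside of the SSIS designer, a saved package can also be run from the command line with dtexec, which ships with SQL Server Integration Services; the package path below is only a hypothetical example:

dtexec /File "C:\SSISPackages\AzureBlobUpload.dtsx")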

We can see the content transferred successfully to Azure Blob Storage

 

Also, check out –

Using Azure Blob Storage component with Dynamics 365

https://nishantrana.me/2020/10/20/using-kingswaysoft-azure-blob-storage-component-with-dynamics-365/

Hope it helps..

For the Azure Storage Connection Manager using SharedAccessSignature, specify the following details –

Account Name – the storage account name, along with the name of the blob in the Blob Path.

Select the appropriate permissions and generate a SAS token for the blob, then copy and paste the Blob SAS Token into the Token field in the connection editor above.

Test Connection –
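As an alternative to generating the SAS token in the portal (a sketch only, assuming the Azure CLI and that the logged-in account can retrieve the storage account key used to sign the token), a blob-level SAS can also be generated from the command line:

az storage blob generate-sas --account-name storageaccountrg9b58 --container-name mycontainer1 --name <blob-name> --permissions r --expiry 2021-12-31T23:59Z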

Fixed – authorizationpermissionmismatch Azure Blob Storage


We got the below error while trying to transfer files to Azure Blob Storage using AzCopy

INFO: Authentication failed, it is either not correct, or expired, or does not have the correct permission -> github.com/Azure/azure-storage-blob-go/azblob.newStorageError, /home/vsts/go/pkg/mod/github.com/!azure/azure-storage-blob-go@v0.10.1-0.20201022074806-8d8fc11be726/azblob/zc_storage_error.go:42

===== RESPONSE ERROR (ServiceCode=AuthorizationPermissionMismatch) =====

Description=This request is not authorized to perform this operation using this permission.

RequestId:43ee21af-501e-0055-30ef-c07ec3000000

Time:2020-11-22T16:51:42.0459952Z, Details:

   Code: AuthorizationPermissionMismatch

   PUT https://storageaccountrg9b58.blob.core.windows.net/mycontainer1/Customers/CIF1/Sample1.txt?timeout=901

Here we were using Azure Active Directory to provide authorization credentials to AzCopy

https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10#option-1-use-azure-active-directory

The account (service principal) we were using had the Owner role.

To fix this issue, we assigned the Storage Blob Data Contributor role to the account.

Retrying after some time (role assignments can take a few minutes to propagate) fixed the issue.

Points to keep in mind –

https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-authorize-azure-active-directory#verify-role-assignments
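To verify the assignment (a quick check, assuming the Azure CLI; the same information is visible under Access control (IAM) in the portal):

az role assignment list --assignee <object-id-or-app-id> --scope /subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/storageaccountrg9b58 --output table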

Hope it helps..


Transfer files from local drive to Azure Blob using Premium File Transfer Task – SSIS


The Premium File Transfer Task component of KingswaySoft can be used to easily transfer files from local drive to Azure Blob storage.

https://www.kingswaysoft.com/products/ssis-productivity-pack/help-manual/premium-file-pack/premium-file-transfer-task

Let us take a simple example to see it in action.

Here we will pick the folder Customers and its subfolders, along with the files inside them, and will transfer their content to the Azure Blob container.

Create a new SSIS Package and drag the Premium File Transfer Task to the control flow designer.

Double click the task and specify the following Source Properties

  • Action – Send Files – the other options are – Delete files, Create Directory, Remove Directory.
  • Check the option – Include Subdirectories
  • Connection Manager – Local File
  • Directory Path – specify the folder

Similarly, for Destination, we can specify the Azure Blob Storage Connection Manager and the directory path as shown below

Note – Make sure we have already added the connection for it to be available inside the connection manager option of the Premium File Transfer Task. The other connection types supported are FTPS, SFTP, Amazon S3, Azure Data Lake Storage, Box, Dropbox, Hadoop, OneDrive, and SharePoint.

Let us run the package.

We can see the content transferred successfully to Azure Blob Storage

 

 

Also, check out –

Using Azure Blob Storage component with Dynamics 365

https://nishantrana.me/2020/10/20/using-kingswaysoft-azure-blob-storage-component-with-dynamics-365/

https://nishantrana.me/2020/10/16/ssis-kingswaysoft-and-dynamics-365/

Hope it helps..

Fixed – Missing app.config (Application Configuration File) option in Visual Studio


While writing a console application, we realized that the option to add an app.config (Application Configuration File) was missing in Visual Studio 2019.

We tried a couple of things suggested over the internet, but to no avail.

Eventually, installing/modifying the .NET desktop development workload in the Visual Studio Installer fixed it for us.

The Application Configuration File option is now available –

Hope it helps..

