Read Secret from Azure Key Vault using SecretClient class – Console App C#


Azure Key Vault can save 3 different types of information.

  • Keys – encryption keys (asymmetric – public/private), can be created in Key Vault or imported, stored in software or HSM (hardware security module).
  • Secrets – unstructured text, can be created or imported, stored in software.
  • Certificates – can be created or imported, consist of 3 parts – certificate metadata, a key, and a secret.

Key Vault provides data protection – at rest, in transit, and in use.

Key Vault provides application security i.e. instead of hardcoding secrets in the application or its configuration files, the secrets can be stored in Key Vault.

Log in to the Azure Portal

https://portal.azure.com/

Here we have generated a Secret named MyCRMKey inside MyDynamics365KeyVault

We have also granted the Get (secret) permission to the MyApp application registered in Azure AD.

Let us create a console app to read the secret.

Add the following NuGet packages to the project.
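For the SecretClient approach used below, the two packages required are Azure.Security.KeyVault.Secrets and Azure.Identity; assuming the .NET CLI is being used, they can be added as

    dotnet add package Azure.Security.KeyVault.Secrets
    dotnet add package Azure.Identity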

Get the Vault URI and Directory ID (tenant id)

And the Client Id of the App registered

Sample source code:

We are using SecretClient class here.
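Below is a minimal sketch of what the console app can look like – it assumes a client secret has been created for the MyApp app registration; the vault URI and the other placeholder values are illustrative and should be replaced with the values noted above.

    using System;
    using Azure.Identity;
    using Azure.Security.KeyVault.Secrets;

    namespace ReadSecretFromKeyVault
    {
        class Program
        {
            static void Main(string[] args)
            {
                // values copied from the portal / app registration (placeholders)
                var keyVaultUri = new Uri("https://<my-key-vault-name>.vault.azure.net/");
                var tenantId = "<directory-id>";
                var clientId = "<application-client-id>";
                var clientSecret = "<client-secret>";

                // authenticate as the registered application
                var credential = new ClientSecretCredential(tenantId, clientId, clientSecret);

                // SecretClient talks to the Key Vault secrets endpoint
                var client = new SecretClient(keyVaultUri, credential);

                // read the secret named MyCRMKey
                KeyVaultSecret secret = client.GetSecret("MyCRMKey");

                Console.WriteLine($"MyCRMKey = {secret.Value}");
                Console.ReadLine();
            }
        }
    }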

Get all the details here

https://azuresdkdocs.blob.core.windows.net/$web/dotnet/Azure.Identity/1.4.0-beta.1/api/index.html

Hope it helps..

Use AzCopy to transfer files from local drive to Azure Blob Storage


AzCopy is a command-line utility that can be used for copying data to and from storage accounts.

Download the appropriate version of the tool –

https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10#download-azcopy

We will upload the files to the following container

Below are the source files

Let us log in first (here we are using Azure Active Directory to authorize AzCopy; the other option is using a SAS token)

https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10?WT.mc_id=itopstalk-blog-thmaure#authorize-azcopy
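For reference, the sign-in command looks like the below (the <tenant-id> placeholder is the Directory ID noted earlier; --tenant-id can be left out if the account belongs to only one tenant)

    azcopy login --tenant-id=<tenant-id>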

It will ask us to open the URL in the browser and enter the code followed by credentials.

After successful sign-in we can close the browser window.

Now let us transfer the directory along with the files inside it using the below syntax

azcopy copy '<local-directory-path>' 'https://<storage-account-name>.<blob or dfs>.core.windows.net/<container-name>' --recursive

in our case,

azcopy copy 'C:\Customers' 'https://storageaccountrg9b58.blob.core.windows.net/mycontainer1' --recursive

Get the URL from the Properties of the container

Make sure the account (service principal) being used has the Storage Blob Data Contributor or Storage Blob Data Owner role assigned, which is required for uploading the files.

Run the command.

We can see the folder and files successfully transferred.
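The container contents can also be verified from the command line itself (same container URL as above)

    azcopy list 'https://storageaccountrg9b58.blob.core.windows.net/mycontainer1'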

Check other posts – 

Transfer files using – Azure Blob Upload task and Premium File transfer task using SSIS Package

https://nishantrana.me/2020/11/24/transfer-files-from-local-drive-to-azure-blob-using-azure-blog-upload-task-ssis/

https://nishantrana.me/2020/11/20/transfer-files-from-local-drive-to-azure-blob-using-premium-file-transfer-task-ssis/

Hope it helps..

Transfer files from local drive to Azure Blob using Azure Blob Upload Task – SSIS


Similar to the Premium File Transfer Task, the Azure Blob Upload Task component can be used to easily transfer files from a local drive to Azure Blob Storage.

https://docs.microsoft.com/en-us/sql/integration-services/control-flow/azure-blob-upload-task

Let us take a simple example to see it in action.

Here we will pick the folder Customer and its subfolders along with the files inside them, and transfer the content to the Azure Blob container.

Create a new SSIS Package and drag the Azure Blob Upload Task to the control flow designer.

Double-click the task and specify the following values as shown below

  • AzureStorageConnection – specify the SSIS Connection Manager for Azure Storage.
  • Blob Container – the name of the existing blob container.
  • Local Directory – the local directory containing the files to be uploaded.
  • Search Recursively – specify whether to search for files within sub-directories.
  • File Name – specify the pattern for the files to be selected.
  • Time Range from/to – to pick files modified within that range.

Let us execute the package.

We can see the content transferred successfully to Azure Blob Storage

 

Also, check out –

Using Azure Blob Storage component with Dynamics 365

https://nishantrana.me/2020/10/20/using-kingswaysoft-azure-blob-storage-component-with-dynamics-365/

Hope it helps..

Fixed – authorizationpermissionmismatch Azure Blob Storage


We got the below error while trying to transfer files to Azure Blob Storage using AzCopy

INFO: Authentication failed, it is either not correct, or expired, or does not have the correct permission -> github.com/Azure/azure-storage-blob-go/azblob.newStorageError, /home/vsts/go/pkg/mod/github.com/!azure/azure-storage-blob-go@v0.10.1-0.20201022074806-8d8fc11be726/azblob/zc_storage_error.go:42

===== RESPONSE ERROR (ServiceCode=AuthorizationPermissionMismatch) =====

Description=This request is not authorized to perform this operation using this permission.

RequestId:43ee21af-501e-0055-30ef-c07ec3000000

Time:2020-11-22T16:51:42.0459952Z, Details:

   Code: AuthorizationPermissionMismatch

   PUT https://storageaccountrg9b58.blob.core.windows.net/mycontainer1/Customers/CIF1/Sample1.txt?timeout=901

Here we were using Azure Active Directory to provide authorization credentials to AzCopy

https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10#option-1-use-azure-active-directory

The account (service principal) we were using had the Owner role.

To fix this issue, we assigned the Storage Blob Data Contributor role to the account.
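For reference, the same role assignment can also be done from the Azure CLI – the assignee and scope values below are placeholders for the service principal's application (client) id and the storage account's resource id:

    az role assignment create --assignee "<app-client-id>" --role "Storage Blob Data Contributor" --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account-name>"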

Retrying after some time fixed the issue (role assignments can take a few minutes to propagate).

Points to keep in mind –

https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-authorize-azure-active-directory#verify-role-assignments

Hope it helps..

Transfer files from local drive to Azure Blob using Premium File Transfer Task – SSIS


The Premium File Transfer Task component of KingswaySoft can be used to easily transfer files from local drive to Azure Blob storage.

https://www.kingswaysoft.com/products/ssis-productivity-pack/help-manual/premium-file-pack/premium-file-transfer-task

Let us take a simple example to see it in action.

Here we will pick the folder Customers and its subfolders along with the files inside them, and transfer the contents to the Azure Blob container.

Create a new SSIS Package and drag the Premium File Transfer Task to the control flow designer.

Double click the task and specify the following Source Properties

  • Action – Send Files – the other options are – Delete files, Create Directory, Remove Directory.
  • Check the option – Include Subdirectories
  • Connection Manager – Local File
  • Directory Path – specify the folder

Similarly, for Destination, we can specify the Azure Blob Storage Connection Manager and the directory path as shown below

Note – Make sure we have already added the connection for it to be available inside the connection manager option of the Premium File Transfer Task. The other connection types supported are FTPS, SFTP, Amazon S3, Azure Data Lake Storage, Box, Dropbox, Hadoop, OneDrive, and SharePoint.

Let us run the package.

We can see the content transferred successfully to Azure Blob Storage

 

 

Also, check out –

Using Azure Blob Storage component with Dynamics 365

https://nishantrana.me/2020/10/20/using-kingswaysoft-azure-blob-storage-component-with-dynamics-365/

https://nishantrana.me/2020/10/16/ssis-kingswaysoft-and-dynamics-365/

Hope it helps..

Dynamic Data Masking (DDM) in SQL Server


Through the Dynamic Data Masking (DDM) feature in SQL Server, we can hide sensitive data by masking it from users who do not have permission to see it. (The underlying data in the database is not changed.)

There are 4 different masking functions to do that –

  • default – the entire column is masked.
  • partial – only works with string types, for masking the starting and/or ending characters of the column data.
  • email – shows only the first character of the column data and masks the rest.
  • random – only works with numeric types, the column data is replaced by random values from a specified range.

  • e.g. Create Table – defining masked columns while creating the table (see the T-SQL sketch below)
  • e.g. Alter Table – adding a mask to an existing column
  • To find the masking details applied on columns – query the sys.masked_columns catalog view
  • Mask permissions – a user with only SELECT permission on the table sees the masked data
  • Unmask and mask permission – GRANT UNMASK / REVOKE UNMASK

Granting the UNMASK permission to a user allows them to see the unmasked data.
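Putting the above together, below is a minimal T-SQL sketch – the table, column, and user names are illustrative, not taken from the original screenshots.

    -- Create Table – define masked columns up front
    CREATE TABLE dbo.Customer
    (
        CustomerId  INT IDENTITY PRIMARY KEY,
        FullName    NVARCHAR(100) MASKED WITH (FUNCTION = 'default()') NULL,
        Email       NVARCHAR(100) MASKED WITH (FUNCTION = 'email()') NULL,
        CreditLimit INT           MASKED WITH (FUNCTION = 'random(1, 1000)') NULL,
        Phone       NVARCHAR(20)  NULL
    );

    -- Alter Table – add a mask to an existing (unmasked) column
    ALTER TABLE dbo.Customer
    ALTER COLUMN Phone ADD MASKED WITH (FUNCTION = 'partial(0,"XXX-XXX-",4)');

    -- Find the masking details applied on columns
    SELECT OBJECT_NAME(object_id) AS TableName, name AS ColumnName, masking_function
    FROM sys.masked_columns;

    -- A user with only SELECT permission sees the masked values
    GRANT SELECT ON dbo.Customer TO TestUser;

    -- Granting UNMASK shows the original data; revoking it masks the data again
    GRANT UNMASK TO TestUser;
    REVOKE UNMASK FROM TestUser;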

  • For Azure SQL Database, we can enable and specify masking through the portal interface itself – select the Dynamic Data Masking option, and click on the Add mask button.

Apply the masking format as needed.

Reference – Pluralsight – SQL Server Course 

Hope it helps..