Use AzCopy to transfer files from local drive to Azure Blob Storage


AzCopy is a command-line utility that can be used for copying data to and from Azure storage accounts.

Download the appropriate version of the tool –

https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10#download-azcopy

We will upload the files to the following container.

Below are the source files.

Let us log in first (here we are using Azure Active Directory to authorize AzCopy; the other option is using a SAS token).

https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10?WT.mc_id=itopstalk-blog-thmaure#authorize-azcopy
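The sign-in itself is a single command (the optional --tenant-id flag is only needed when the account spans multiple tenants):

azcopy login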

It will ask us to open the URL in the browser and enter the code followed by credentials.

After successful sign-in we can close the browser window.

Now let us transfer the directory along with the files inside it using the below syntax

azcopy copy '<local-directory-path>' 'https://<storage-account-name>.<blob or dfs>.core.windows.net/<container-name>' --recursive

In our case:

azcopy copy 'C:\Customers' 'https://storageaccountrg9b58.blob.core.windows.net/mycontainer1' --recursive

Get the URL from the Properties of the container

Make sure the account (service principal) used has the Storage Blob Data Contributor or Storage Blob Data Owner role assigned, which is required for uploading the files.

Run the command.

We can see the folder and files successfully transferred.

Check other posts – 

Transfer files using the Azure Blob Upload Task and the Premium File Transfer Task in SSIS packages –

https://nishantrana.me/2020/11/24/transfer-files-from-local-drive-to-azure-blob-using-azure-blog-upload-task-ssis/

https://nishantrana.me/2020/11/20/transfer-files-from-local-drive-to-azure-blob-using-premium-file-transfer-task-ssis/

Hope it helps..

Transfer files from local drive to Azure Blob using Azure Blob Upload Task – SSIS


Similar to the Premium File Transfer Task, the Azure Blob Upload Task component can be used to easily transfer files from a local drive to Azure Blob Storage.

https://docs.microsoft.com/en-us/sql/integration-services/control-flow/azure-blob-upload-task

Let us take a simple example to see it in action.

Here we will pick the folder Customers and its subfolders, along with the files inside them, and transfer them to the Azure Blob container.

Create a new SSIS Package and drag the Azure Blob Upload Task to the control flow designer.

Double-click the task and specify the following values as shown below:

  • AzureStorageConnection – specify the SSIS connection manager for Azure Storage.
  • Blob Container – the name of the existing blob container.
  • Local Directory – the local directory containing the files to be uploaded.
  • Search Recursively – specify whether to search for files within subdirectories.
  • File Name – specify the pattern for the files to be selected.
  • Time Range From/To – to pick only files modified within that range.

Let us execute the package.

We can see the content transferred successfully to Azure Blob Storage.

Also, check out –

Using Azure Blob Storage component with Dynamics 365

https://nishantrana.me/2020/10/20/using-kingswaysoft-azure-blob-storage-component-with-dynamics-365/

Hope it helps..

Fixed – AuthorizationPermissionMismatch – Azure Blob Storage


We got the below error while trying to transfer files to Azure Blob Storage using AzCopy:

INFO: Authentication failed, it is either not correct, or expired, or does not have the correct permission -> github.com/Azure/azure-storage-blob-go/azblob.newStorageError, /home/vsts/go/pkg/mod/github.com/!azure/azure-storage-blob-go@v0.10.1-0.20201022074806-8d8fc11be726/azblob/zc_storage_error.go:42

===== RESPONSE ERROR (ServiceCode=AuthorizationPermissionMismatch) =====

Description=This request is not authorized to perform this operation using this permission.

RequestId:43ee21af-501e-0055-30ef-c07ec3000000

Time:2020-11-22T16:51:42.0459952Z, Details:

   Code: AuthorizationPermissionMismatch

   PUT https://storageaccountrg9b58.blob.core.windows.net/mycontainer1/Customers/CIF1/Sample1.txt?timeout=901

Here we were using Azure Active Directory to provide authorization credentials to AzCopy

https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10#option-1-use-azure-active-directory

The account (service principal) we were using had the Owner role assigned.

To fix this issue, we assigned the Storage Blob Data Contributor role to the account – the Owner role by itself covers the management plane but does not grant data operations on blobs. Retrying after some time (role assignments can take a few minutes to propagate) fixed the issue.
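For reference, the same role assignment can also be scripted; a minimal sketch using the Azure CLI (the assignee and scope values are placeholders):

az role assignment create --assignee <service-principal-app-id> --role "Storage Blob Data Contributor" --scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/storageaccountrg9b58"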

Points to keep in mind –

https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-authorize-azure-active-directory#verify-role-assignments

Hope it helps..

Transfer files from local drive to Azure Blob using Premium File Transfer Task – SSIS


The Premium File Transfer Task component of KingswaySoft can be used to easily transfer files from a local drive to Azure Blob Storage.

https://www.kingswaysoft.com/products/ssis-productivity-pack/help-manual/premium-file-pack/premium-file-transfer-task

Let us take a simple example to see it in action.

Here we will pick the folder Customers and its subfolders, along with the files inside them, and transfer their content to the Azure Blob container.

Create a new SSIS Package and drag the Premium File Transfer Task to the control flow designer.

Double-click the task and specify the following Source properties:

  • Action – Send files (the other options are Delete files, Create directory, and Remove directory).
  • Check the option Include Subdirectories.
  • Connection Manager – Local File.
  • Directory Path – specify the folder.

Similarly, for Destination, we can specify the Azure Blob Storage connection manager and the directory path as shown below.

Note – Make sure we have already added the connection for it to be available inside the connection manager option of the Premium File Transfer Task. The other connection types supported are FTPS, SFTP, Amazon S3, Azure Data Lake Storage, Box, Dropbox, Hadoop, OneDrive, and SharePoint.

Let us run the package.

We can see the content transferred successfully to Azure Blob Storage.

Also, check out –

Using Azure Blob Storage component with Dynamics 365

https://nishantrana.me/2020/10/20/using-kingswaysoft-azure-blob-storage-component-with-dynamics-365/

https://nishantrana.me/2020/10/16/ssis-kingswaysoft-and-dynamics-365/

Hope it helps..

Dynamic Data Masking (DDM) in SQL Server


Through the Dynamic Data Masking (DDM) feature in SQL Server, we can hide sensitive data by masking it from users who do not have permission to see it. (The data in the database itself is not changed.)

There are 4 different masking functions available –

  • default – the entire column is masked.
  • partial – only works with string types; exposes the specified number of starting and/or ending characters and masks the rest.
  • email – shows only the first character of the column data and masks the rest.
  • random – only works with numeric types; the column data is replaced by a random value within a specified range.

  • e.g. Create Table
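A minimal sketch, assuming an illustrative Customer table (the table, columns, and mask formats are examples, not from the original post):

CREATE TABLE Customer
(
    Id INT IDENTITY PRIMARY KEY,
    FirstName VARCHAR(100) MASKED WITH (FUNCTION = 'default()'),
    LastName VARCHAR(100),
    Email VARCHAR(100) MASKED WITH (FUNCTION = 'email()'),
    Phone VARCHAR(20) MASKED WITH (FUNCTION = 'partial(0, "XXX-XXX-", 4)'),
    CreditScore INT MASKED WITH (FUNCTION = 'random(300, 850)')
);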

  • e.g. Alter Table
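Adding a mask to an existing column of the same illustrative table (using DROP MASKED instead of ADD MASKED removes it):

ALTER TABLE Customer
ALTER COLUMN LastName ADD MASKED WITH (FUNCTION = 'default()');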

  • To find the masking details applied on columns –
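The sys.masked_columns catalog view lists the columns that have a mask and the function applied, for example:

SELECT t.name AS table_name, mc.name AS column_name, mc.masking_function
FROM sys.masked_columns AS mc
INNER JOIN sys.tables AS t ON mc.object_id = t.object_id;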

  • Mask permissions – 
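Applying or changing a mask requires the ALTER ANY MASK permission (along with ALTER on the table); for example, for an illustrative user TestUser:

GRANT ALTER ANY MASK TO TestUser;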

  • Unmask and mask permission –

Granting the UNMASK permission to a user allows them to see the unmasked data; revoking it masks the data for them again.
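For example, again with the illustrative TestUser:

GRANT UNMASK TO TestUser;
-- and to mask the data for the user again
REVOKE UNMASK FROM TestUser;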

  • For Azure SQL Database, we can enable and specify masking through the portal interface itself – select the Dynamic Data Masking option, then click on the Add mask button.

Apply the masking format as needed.

Reference – Pluralsight – SQL Server Course 

Hope it helps..

Azure Architecture and Management – Introduction


Below are a few key points on Azure architecture and management

  • Check availability of Azure products region-wise

https://azure.microsoft.com/en-us/global-infrastructure/services/

Below we have filtered it to see the products available in the UAE.

We can also filter it further if we are looking for a specific product or service.

  • We can refer to the Data residency document to see where the data is being stored

https://azure.microsoft.com/en-us/global-infrastructure/data-residency/

  • We can use the Pricing Calculator for estimation.

https://azure.microsoft.com/en-us/pricing/calculator/

  • A resource in Azure can be thought of as a manageable item, such as a virtual machine, web app, or database.
  • A resource group is a container (holding metadata) for related resources.
  • It is through ARM – Azure Resource Manager – that the resources within Azure are managed.

Azure Portal, Azure PowerShell, Azure CLI, and the REST APIs are the different ways of interacting with Azure Resource Manager.

  • Within Azure Portal, we can use Azure Cloud Shell to run PowerShell or CLI commands.

It will ask us to create a storage account if there isn't one already.
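For instance, a quick check that the shell is wired up, using the Azure CLI (the table output format is just a preference):

az group list --output table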

  • To implement Infrastructure-as-Code, Azure provides Azure Resource Manager (ARM) templates – JSON files that use a declarative syntax to define the infrastructure and configuration of Azure resources and automate deployments.

https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/overview
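A minimal sketch of a template that deploys a storage account (the account name and API version are illustrative):

{
  "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentTemplate.json#",
  "contentVersion": "1.0.0.0",
  "resources": [
    {
      "type": "Microsoft.Storage/storageAccounts",
      "apiVersion": "2021-04-01",
      "name": "mystorageaccount001",
      "location": "[resourceGroup().location]",
      "sku": { "name": "Standard_LRS" },
      "kind": "StorageV2"
    }
  ]
}

It can be deployed with, for example, az deployment group create --resource-group <resource-group> --template-file template.json.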

  • Azure Advisor is a recommendation service that analyzes the configuration and usage of Azure resources and provides recommendations/best practices across different categories to optimize the Azure deployment.

The advisor can be configured to run only on specific resources along with the option to edit the existing rules and set up alerts.

  • We can also use the Azure mobile app

https://azure.microsoft.com/en-us/features/azure-portal/mobile-app/

to monitor the health and status of Azure resources, run commands to manage Azure resources, etc.

Reference – Pluralsight Course – Microsoft Azure Services and Concepts

Hope it helps..