Always Encrypted feature in SQL Server


Always Encrypted encrypts data at the column level rather than the entire database. It protects the data both at rest and in flight / in memory, since SQL Server itself only ever sees the encrypted values.

It differs from cell-level (column) encryption and Transparent Data Encryption (TDE), which use keys and certificates stored in the database. With Always Encrypted, the keys are managed outside the database, so SQL Server cannot decrypt the data on its own. The client encrypts and decrypts on the fly, separating those who own the data from those who manage it.

To enable Always Encrypted we can use either T-SQL or the Always Encrypted wizard in SSMS. It supports two encryption types – Deterministic, which always produces the same ciphertext for a given value and therefore allows equality lookups, grouping, and joins on the column, and Randomized, which is less predictable but does not support those operations.
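As a rough sketch, the column definitions below show how the two encryption types are declared in T-SQL. The MyContact table and its Email Id column come from this walkthrough; the other columns and the CEK name CEK_Auto1 are only examples (the CEK itself is explained just below). Deterministic encryption on string columns additionally requires a BIN2 collation.

CREATE TABLE dbo.MyContact
(
    ContactId INT IDENTITY(1,1) PRIMARY KEY,
    FullName  NVARCHAR(100) NULL,
    -- Deterministic: the same plaintext always produces the same ciphertext,
    -- so equality lookups, grouping, and joins on the column still work
    EmailId   NVARCHAR(100) COLLATE Latin1_General_BIN2
        ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = CEK_Auto1,
                        ENCRYPTION_TYPE = DETERMINISTIC,
                        ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256') NULL,
    -- Randomized: more secure, but the column cannot be filtered, grouped, or joined on
    BirthDate DATE
        ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = CEK_Auto1,
                        ENCRYPTION_TYPE = RANDOMIZED,
                        ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256') NULL
);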

  • Columns are encrypted using a Column Encryption Key (CEK).
  • Only the encrypted version of each CEK is stored in the database.
  • A Column Master Key (CMK) is used to encrypt the CEKs.
  • Only the client has access to the CMK, which is stored in Azure Key Vault, the Windows Certificate Store, or a hardware security module (HSM).

The SSMS wizard generates both the CEK and the CMK. The CMK is stored on the client side, in the Windows Certificate Store or Azure Key Vault.

The CMK is used to encrypt the CEK. Only the encrypted CEK, along with the path to the CMK, is stored in the database, so the database has no way of decrypting the data on its own.
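For illustration, the key metadata that ends up in the database looks roughly like the T-SQL below (normally the wizard generates it; the key names, certificate thumbprint, and encrypted value are placeholders). These definitions are visible in the sys.column_master_keys and sys.column_encryption_keys catalog views.

-- Column master key: only the key store provider and the key path are recorded, never the key itself
CREATE COLUMN MASTER KEY CMK_Auto1
WITH (
    KEY_STORE_PROVIDER_NAME = N'MSSQL_CERTIFICATE_STORE',   -- or N'AZURE_KEY_VAULT'
    KEY_PATH = N'CurrentUser/My/<certificate thumbprint>'
);

-- Column encryption key: stored only as a value already encrypted by the CMK
CREATE COLUMN ENCRYPTION KEY CEK_Auto1
WITH VALUES (
    COLUMN_MASTER_KEY = CMK_Auto1,
    ALGORITHM = 'RSA_OAEP',
    ENCRYPTED_VALUE = 0x016E000001630075   -- truncated ciphertext placeholder
);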

Next, the SSMS wizard begins the encryption process: it creates a new temporary table, transfers the data from the original table while encrypting it on the fly, and then drops the original table and replaces it with the new one.

All of this happens on the client side, which in this case is SSMS.

The client passes ‘Column Encryption Setting=Enabled’ in the connection string; the driver then fetches the encrypted CEK and the path to the CMK from the database.

The client uses the path to the CMK to retrieve the CMK, decrypts the encrypted CEK with it, and then uses the plaintext CEK to decrypt the encrypted data.

To apply Always Encrypted, within SSMS, right-click the database and select Tasks > Encrypt Columns…

Below we have selected the Email Id column of MyContact table.

The wizard step below generates the CMK, which can be stored on the client in either the Windows Certificate Store or Azure Key Vault.

We can either run the encryption at that time or choose to generate a PowerShell Script which we can run later.

Below is the summary of steps that will be performed

  • Before encryption

  • After encryption

To view the decrypted data at the client, which is SSMS in our case, right-click in the query window and select Connection > Change Connection…


Specify column encryption setting=enabled in the Additional Connection Parameters tab.

We can see the actual data, as the client (SSMS in this case) performs the decryption at run time using the path to the CMK and the encrypted CEK received from the database.
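A quick way to see the difference, assuming the walkthrough's table and column names, is to run the same query over two connections:

-- With ‘Column Encryption Setting=Enabled’, EmailId is returned as plaintext;
-- without it, the same column comes back as varbinary ciphertext (0x01...)
SELECT TOP (10) EmailId
FROM dbo.MyContact;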

  • Usage within a console application –

A few points to consider while planning to use Always Encrypted –

https://www.sentryone.com/blog/aaronbertrand/t-sql-tuesday-69-always-encrypted-limitations

Reference – Pluralsight – SQL Server Course 

Hope it helps..

Dynamic Data Masking (DDM) in SQL Server


Through the Dynamic Data Masking feature in SQL Server, we can hide sensitive data by masking it from users who do not have permission to see it. The data in the database itself is not changed; only the results returned to the user are masked.

There are four different masking functions available –

  • default() – the entire value is masked according to the column's data type.
  • partial() – works only with string types; exposes a specified number of starting and ending characters and masks the middle with a custom padding string.
  • email() – exposes the first character of the value and masks the rest in the form aXXX@XXXX.com.
  • random() – works only with numeric types; replaces the value with a random number from a specified range.

  • e.g. Create Table
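A minimal sketch of a table definition using all four functions (the table and column names are made up for illustration):

CREATE TABLE dbo.CustomerInfo
(
    CustomerId  INT IDENTITY(1,1) PRIMARY KEY,
    FirstName   VARCHAR(100) MASKED WITH (FUNCTION = 'default()') NOT NULL,
    LastName    VARCHAR(100) NULL,
    PhoneNumber VARCHAR(20)  MASKED WITH (FUNCTION = 'partial(2,"XXXXXX",2)') NULL,
    Email       VARCHAR(100) MASKED WITH (FUNCTION = 'email()') NULL,
    CreditScore INT          MASKED WITH (FUNCTION = 'random(300, 850)') NULL
);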

  • e.g. Alter Table
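To add or remove a mask on an existing column (again, names are illustrative):

-- Add a mask to an existing column
ALTER TABLE dbo.CustomerInfo
ALTER COLUMN LastName ADD MASKED WITH (FUNCTION = 'default()');

-- Remove the mask from the column
ALTER TABLE dbo.CustomerInfo
ALTER COLUMN LastName DROP MASKED;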

  • To find the masking details applied on columns –
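The masked columns and their masking functions can be listed from the sys.masked_columns catalog view:

SELECT t.name  AS TableName,
       mc.name AS ColumnName,
       mc.masking_function
FROM sys.masked_columns AS mc
JOIN sys.tables AS t ON mc.object_id = t.object_id;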

  • Mask permissions – 
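For example, a test user that has only SELECT permission sees the masked values (the user and table names here are illustrative):

CREATE USER MaskTestUser WITHOUT LOGIN;
GRANT SELECT ON dbo.CustomerInfo TO MaskTestUser;

EXECUTE AS USER = 'MaskTestUser';
SELECT * FROM dbo.CustomerInfo;   -- masked values are returned
REVERT;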

  • Unmask and mask permission –

Granting the UNMASK permission to a user allows them to see the unmasked data.
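Continuing with the same illustrative user:

GRANT UNMASK TO MaskTestUser;

EXECUTE AS USER = 'MaskTestUser';
SELECT * FROM dbo.CustomerInfo;   -- actual values are returned
REVERT;

-- Revoke it to mask the data again for that user
REVOKE UNMASK FROM MaskTestUser;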

  • For Azure SQL Database, we can enable and configure masking through the portal itself: select the Dynamic Data Masking option for the database and click on the Add mask button.

Apply the masking format as needed.

Reference – Pluralsight – SQL Server Course 

Hope it helps..

Azure Architecture and Management – Introduction


Below are a few key points on Azure architecture and management.

  • Check availability of Azure products region-wise

https://azure.microsoft.com/en-us/global-infrastructure/services/

Below we have filtered it to see products available in UAE

We can also filter it further if we are looking for a specific product or service.

  • We can refer to the Data residency document to see where the data is being stored

https://azure.microsoft.com/en-us/global-infrastructure/data-residency/

  • We can use the Pricing Calculator for estimation.

https://azure.microsoft.com/en-us/pricing/calculator/

  • A resource in Azure is a manageable item, such as a virtual machine, web app, or database.
  • A resource group is a container that holds related resources and their metadata.
  • Resources in Azure are managed through Azure Resource Manager (ARM).

The Azure Portal, Azure PowerShell, the Azure CLI, and the REST APIs are the different ways of interacting with Azure Resource Manager.

  • Within Azure Portal, we can use Azure Cloud Shell to run PowerShell or CLI commands.

It will ask us to create a storage account if one does not already exist.

  • To implement Infrastructure as Code, Azure provides Azure Resource Manager (ARM) templates, which are JSON files that use a declarative syntax to define the infrastructure and configuration of Azure resources so that deployments can be automated.

https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/overview

  • Azure Advisor is a recommendation service that analyzes the configuration and usage of Azure resources and provides recommendations and best practices across different categories to optimize the Azure deployment.

Advisor can be configured to run only on specific resources, with the option to edit the existing rules and set up alerts.

  • We can also use Azure Mobile App

https://azure.microsoft.com/en-us/features/azure-portal/mobile-app/

to monitor the health and status of Azure resources, run commands to manage Azure resources, etc.

Reference – Pluralsight Course – Microsoft Azure Services and Concepts

Hope it helps..

Microsoft Azure / Cloud Computing – Introduction


Cloud computing enables companies to consume a computing resource – such as a virtual machine, storage, or an application – as a utility, just like electricity, rather than having to build and maintain computing infrastructure in-house.

The cloud provider provides the cloud infrastructure, which is shared across multiple clients. Clients can select which services to use and pay only for those services, billed on demand.

The cloud provider takes care of high availability (e.g., disk or power supply failure) and disaster recovery (natural or human-made disasters such as fire or flood).

Advantages of using cloud computing

Rapid elasticity, paying only for the services used, reliability, economies of scale, etc.

Azure Regions Interactive Map

https://build5nines.com/map-azure-regions/

Types of Cloud Computing Services.

  • IaaS – Infrastructure as a service – Azure Virtual Machines, Azure Storage.
  • PaaS – Platform as a service – Azure Functions, Logic Apps, Azure Automation.
  • SaaS – Software as a service – Dynamics 365, SharePoint, Power Platform.

Types of Cloud Computing Deployment Models – public, private, and hybrid cloud.

Hope it helps..

Deploy and run SSIS Integration Toolkit for Dynamics 365 on Azure Data Factory (KingswaySoft)


In the previous post, we saw how to deploy and run SSIS packages on the cloud.

Here we take it one step further and deploy and run SSIS packages that use KingswaySoft's SSIS Integration Toolkit components.

For this we will need an Azure subscription, where we will host the SSISDB, followed by provisioning an Azure-SSIS Integration Runtime (IR) instance.

We will also need an Azure Blob Storage account, along with Azure Storage Explorer, to upload the installation files of the SSIS Integration Toolkit.

Let us first start by creating an Azure SQL Server instance.

We have specified the below details.

Next, create the database inside the server.
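As an alternative to the portal, the database can also be created with T-SQL while connected to the master database of the logical server; the database name and service objective below are only examples:

CREATE DATABASE SSISDemoDB
( EDITION = 'Standard', SERVICE_OBJECTIVE = 'S0' );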

Now with Azure SQL Server and Database created, the next step is to create the Storage account.

With the Azure storage account created, let us now connect to it using Azure Storage Explorer.

Create a new blob container in the Azure Storage account created.

For the blob container created, right-click and select Get Shared Access Signature

Specify the expiry time along with Write permission; this is used for logging when the Azure-SSIS IR is being provisioned.

Copy the URL (it will be used in the PowerShell script later)

Now let us get the installation files and programs from the KingswaySoft shared blob container, which we will place in the blob container we just created.

https://kingswaysoftgeneral.blob.core.windows.net/ssis-integration-toolkit-ultimate?st=2019-07-04T16%3A10%3A25Z&se=2059-07-05T16%3A10%3A00Z&sp=rl&sv=2018-03-28&sr=c&sig=LAGvouFpkZHEk%2BH8%2B0pK%2FDNg7B3jPUf%2FJ91%2BJ%2FEeKg0%3D

Right-click Storage Accounts and select Connect to Azure Storage

Select Use a shared access signature (SAS) URI

Paste the KingswaySoft blob container URL.

We can see the contents of the KingswaySoft blob container, as shown below.

Select all and copy all the files.

Paste it in the blob container we had created earlier.

With things now set up, let us get the PowerShell script that provisions the Azure-SSIS Integration Runtime (Initializations.ps1) and update it.

Specify the appropriate values and run the script. Get Azure PowerShell if it is not already installed.

Also, make sure to update the firewall rules to allow the client to connect.
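Apart from the portal, a server-level firewall rule can also be added with T-SQL from the master database of the logical server (the rule name and IP address below are placeholders):

EXECUTE sp_set_firewall_rule
    @name = N'AllowClientIP',
    @start_ip_address = '167.220.0.1',
    @end_ip_address   = '167.220.0.1';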

Update the PowerShell Script appropriately

We can check the status as shown below.

In parallel, we can see our Azure Data Factory created with the integration runtime, which is in Starting status.

After a few minutes, we will have the integration runtime up and running.

Below is the SSIS package that we will be deploying to the cloud.

It uses Data Spawner Component to generate test data for Contacts and the CDS Destination component to create those records inside CDS.

Right-click the integration project and select Deploy

Specify connection details along with Path

After successful deployment, let us create a new pipeline inside the Azure Data Factory.

Drag and drop the Execute SSIS Package and click on the Settings tab.

Connect to the package deployed followed by Validate and Debug to test the pipeline.

The pipeline will be in Queued status

After successful execution, navigate to our Dynamics 365 Sales Hub.

We can see 10 contact records created by the SSIS Package.

Hope it helps..

Deploy and run SSIS package in Azure Data Factory


Before an SSIS package can be deployed to Azure Data Factory, we need to provision the Azure-SSIS Integration Runtime (IR) in Azure Data Factory.

In the previous posts, we had created an Azure Data Factory instance and used Azure SQL Database as the source.

Within Azure Data Factory in the Let’s get started page, select Configure SSIS Integration.

Specify the appropriate values for the integration runtime.

Select the Create SSIS Catalog option to deploy packages to SSISDB, and provide the Azure SQL Database server endpoint and the admin credentials to connect.

Test the connection.

Specify advanced settings as appropriate.

This starts the creation of Azure-SSIS Integration Runtime.

Meanwhile, below is the SSIS package that we will be deploying to Azure Data Factory.

It extracts a text file named contacts.txt from the source blob storage and loads it into the destination blob storage.

Right-click the project  and select Deploy.

(Deploying an individual package is not supported right now.)

Select SSIS in Azure Data Factory.

Specify Server name and credentials and connect.

Click on Browse.

Create a new folder or select an existing folder and click on Ok

Once the validation is successful, click on Deploy and start the deployment.

After successful deployment, create a new pipeline in the Azure Data Factory, and drag the Execute SSIS Package activity

Connect to the package deployed.

Click on debug to trigger and test the pipeline.

On a successful run, we can see the contacts.txt file extracted from mycontainer1 and loaded into mycontainer2.

Hope it helps..