
Move your ‘On-Prem’ SQL Data Warehouse To ‘Azure SQL Data Warehouse’ With Redgate Data Platform Studio (DPS)

Introduction

In this article, we will learn how to migrate an on-premises data warehouse to the cloud, specifically to Azure SQL Data Warehouse. I will also share the lessons learned, along with some do's and don'ts from my experience that may be helpful.

Let's talk about a great tool, the 'Redgate DPS Gateway', which worked perfectly well for a real-world data migration challenge: migrating 1,000 tables holding about 10 GB of data.

We will walk through a real-world example. This article is not referring to,

  • an AdventureWorks sample data warehouse migration
  • a couple of demo table migrations

This is an extensive, real-world migration example.

Prerequisite

  1. SQL Server on-premises installed
  2. Your Enterprise or Demo database available on SQL Server on-premises
  3. An Azure SQL Data Warehouse account; if you don't have one yet, please create an 'Azure SQL Data Warehouse' here

If you still have questions about Azure SQL Data Warehouse, such as 'Why Enterprises Should Now Consider ASDWH Seriously', please read here.

Background

Recently, I got an opportunity to migrate an on-premises data warehouse to the cloud, to 'Azure SQL Data Warehouse (ASDWH)'. This article is written from that standpoint.

Getting Started

I am sure you already have an Azure SQL Data Warehouse (ASDWH), so please log in to the Azure portal and go to the ASDWH.

Azure

Under Common Tasks, please click 'Load Data' as shown below,

Azure

After clicking 'Load Data', you will see the 'Load Data' screen shown below.

Azure

Here we have two options,

  1. Redgate Data Platform Studio
  2. Azure Data Factory

For Azure Data Factory details, please wait for my next article.

So now let's click on Redgate Data Platform Studio. Please log in with your Azure credentials, and a new account will be created for you to log in to the Data Platform Studio (DPS).

Azure

After logging in to DPS, click the 'START NEW IMPORT' button.

I was very happy to see 'Jonathan', an online assistant; however, he was not responding beyond the first default message. It seems he is on vacation.

Azure

Azure

If you are a first-time user of DPS, you need to create a 'Gateway' to connect your on-premises SQL data warehouse to Azure SQL Data Warehouse, like this,

Azure

Click the Download link to download the DPS Gateway installer,

Azure

Azure

Once downloaded, please install the DPS Gateway. After installation, the gateway will detect the local SQL Server on-premises instance. Press Enter to connect to it, then press Next:

Azure

Azure

Select your associated 'Storage Account'. Please remember that whatever storage account you create here remains there afterwards.

In case you need more information on 'Azure Storage', please read my seven-article 'Azure Storage' series from here.

Azure

Provide your Azure server admin credentials.

Azure

And this beautiful screen will appear. In my case, I selected 'DELETE ALL EXISTING OBJECTS BEFORE IMPORT'.

Azure

After clicking Continue, your Import screen will appear.

Azure

Click 'Start Import' and you will land on this screen,

Azure

Please note that I am importing 951 tables and it took almost 40 minutes. Note also that it deletes all the staged records from storage afterwards; I found it quite fast.

Azure

Azure

I hope this article helps you understand how you can move your complete data into Azure SQL Data Warehouse.

Please note the following point,

  1. Redgate DPS is not able to migrate views and stored procedures. It is meant to transfer your whole data set and take away the pain of migration and data conversion; a quick way to script those objects yourself is sketched below.
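
Since DPS moves schema and data but not programmability objects, one workaround is to pull the view, stored procedure, and function definitions out of the source yourself and replay them (after any ASDWH-specific rework) on the target. Below is a minimal sketch, assuming a .NET console project referencing System.Data.SqlClient; the connection string and database name are placeholders.

using System;
using System.Data.SqlClient;

class ScriptModules
{
    static void Main()
    {
        // Placeholder connection string for the on-premises source database (assumption).
        var connectionString = "Server=localhost;Database=MyWarehouse;Integrated Security=true";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // sys.sql_modules holds the T-SQL definition of views, stored procedures and functions.
            var sql = @"SELECT OBJECT_SCHEMA_NAME(m.object_id) AS SchemaName,
                               OBJECT_NAME(m.object_id)        AS ObjectName,
                               m.definition
                        FROM sys.sql_modules AS m";

            using (var command = new SqlCommand(sql, connection))
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Dump each definition so it can be reviewed and replayed on Azure SQL Data Warehouse.
                    Console.WriteLine("-- {0}.{1}", reader.GetString(0), reader.GetString(1));
                    Console.WriteLine(reader.GetString(2));
                    Console.WriteLine("GO");
                }
            }
        }
    }
}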

In case of any doubt or question, feel free to write message or connect by any social media, I will reply ASAP. You may also reach me at My Website from here.

Until next time, keep learning.

Migrate Your On-Premises / Enterprise Data Warehouse Into Azure SQL Data Warehouse

Download: DataWarehouseMigrationUtility

In my previous articles, ‘Introduction To Azure SQL Data Warehouse’ and ‘Azure SQL Data Warehouse – Why Enterprises Should Now Consider It Seriously’, I shared –

  1. What Azure SQL Data Warehouse is
  2. Why Azure SQL Data Warehouse is critical for the enterprise and reasons for the migration

Now, I will share how you can start migrating your data into the Azure SQL Data Warehouse once you have decided to do so. The prerequisites are mentioned below.

Prerequisite

  1. Azure Subscription
  2. Azure SQL Data Warehouse knowledge; refer to ‘Introduction To Azure SQL Data Warehouse’ and ‘Azure SQL Data Warehouse – Why Enterprises Should Now Consider It Seriously’

Once you have decided to migrate your on-premises or existing Data Warehouse, the next thing that comes to mind is ‘What Next,’ or how to import data. Fortunately, we have a couple of options for importing the data.

  1. Data Warehouse Migration Utility (Preview) that can be downloaded from here
  2. Azure Data Factory
  3. Move the data from ADLS to Azure Blob Storage and use PolyBase to import it (see the PolyBase sketch after this list).
  4. SSIS
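
Option 3 deserves a closer look, because PolyBase is the fastest loading path into ASDWH. The sketch below shows the general shape of a PolyBase load driven from C# with System.Data.SqlClient; it assumes a database master key already exists, and every name (storage account, container, key, tables, columns) is a placeholder to adapt to your own environment, so treat it as an illustration rather than a finished implementation.

using System.Data.SqlClient;

class PolyBaseLoad
{
    static void Main()
    {
        // Placeholder connection string for the Azure SQL Data Warehouse (assumption).
        var connectionString =
            "Server=tcp:myserver.database.windows.net;Database=MyDW;User ID=dwadmin;Password=<password>";

        // Rough shape of a PolyBase load: scoped credential -> external data source -> file format
        // -> external table over the staged files -> CTAS into a distributed internal table.
        string[] statements =
        {
            @"CREATE DATABASE SCOPED CREDENTIAL BlobCredential
              WITH IDENTITY = 'user', SECRET = '<storage-account-key>'",

            @"CREATE EXTERNAL DATA SOURCE AzureBlob
              WITH (TYPE = HADOOP,
                    LOCATION = 'wasbs://staging@mystorageaccount.blob.core.windows.net',
                    CREDENTIAL = BlobCredential)",

            @"CREATE EXTERNAL FILE FORMAT CsvFormat
              WITH (FORMAT_TYPE = DELIMITEDTEXT,
                    FORMAT_OPTIONS (FIELD_TERMINATOR = ','))",

            @"CREATE EXTERNAL TABLE dbo.SalesStaging (SaleId INT, Amount DECIMAL(18,2))
              WITH (LOCATION = '/sales/', DATA_SOURCE = AzureBlob, FILE_FORMAT = CsvFormat)",

            @"CREATE TABLE dbo.Sales WITH (DISTRIBUTION = ROUND_ROBIN)
              AS SELECT * FROM dbo.SalesStaging"
        };

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            foreach (var statement in statements)
            {
                using (var command = new SqlCommand(statement, connection))
                {
                    command.CommandTimeout = 0; // large loads can run for a long time
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}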

Consider the reasons from the following scenarios when choosing an option,

Azure SQL Data Warehouse

Now, we will discuss how to import data using the Data Warehouse Migration Utility. Clicking here will start the download; you then need to extract and install it.

Azure SQL Data Warehouse

Click “Next” and check ‘I Accept’.

Azure SQL Data Warehouse

It will install DataWarehouseMigrationUtility on your desktop.

Azure SQL Data Warehouse

Open DataWarehouseMigrationUtility and note that 'SOURCE TYPE' has two options:

  1. SQL Server
  2. Azure SQL Database

‘DESTINATION TYPE’ has ‘Azure SQL Data Warehouse’ as shown below.

Azure SQL Data Warehouse

I am choosing ‘Azure SQL Database’ for this demo.

Azure SQL Data Warehouse

Click "Connect", review the 'Migration Settings', and click 'Save'.

Azure SQL Data Warehouse

Select your database and click 'Check Database Compatibility'. In case of any compatibility issues, it will produce a 'Database Compatibility Report' as an Excel sheet, like this.

Azure SQL Data Warehouse

Types of errors

  1. Code Errors – errors in code such as stored procedures and user-defined functions
  2. Object Errors – errors in schemas, etc.

Fix the related issues and click 'Migrate Selected'. It will show the objects/tables available for migration; select all or some as per your business needs.

Azure SQL Data Warehouse

Round Robin will distribute the data evenly across the compute nodes. I always choose this, as I don't have to define the distribution explicitly. We can also check the skew status, which helps us with the distribution options (see the sketch below).
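
To see whether the data landed evenly, you can query per-distribution space usage on ASDWH. Here is a small hedged sketch; the connection string and table name are placeholders, and uneven row counts across distributions indicate skew.

using System;
using System.Data.SqlClient;

class SkewCheck
{
    static void Main()
    {
        // Placeholder ASDWH connection string (assumption).
        var connectionString =
            "Server=tcp:myserver.database.windows.net;Database=MyDW;User ID=dwadmin;Password=<password>";

        using (var connection = new SqlConnection(connectionString))
        // DBCC PDW_SHOWSPACEUSED returns one row per distribution for the named table.
        using (var command = new SqlCommand(@"DBCC PDW_SHOWSPACEUSED(""dbo.Sales"")", connection))
        {
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Print every column; compare the row counts across distributions to spot skew.
                    for (int i = 0; i < reader.FieldCount; i++)
                    {
                        Console.Write(reader.GetName(i) + "=" + reader[i] + "  ");
                    }
                    Console.WriteLine();
                }
            }
        }
    }
}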

Once you are done, click "Migrate Schema" as shown below.

Azure SQL Data Warehouse

And this will create the database schema.

Azure SQL Data Warehouse

After the Schema gets generated, click “Run Script”.

Azure SQL Data Warehouse

You have to provide your source (i.e., SQL Server) credentials to connect. It will apply the script, like this.

Azure SQL Data Warehouse

Once the schema is applied successfully, the only remaining thing is to migrate the data. So, click “Migrate Data” at the extreme right of the page.

Azure SQL Data Warehouse

Click "Generate". It will generate a BCP package in your specified directory.

Azure SQL Data Warehouse

There are two packages that need to be run in sequence, as shown below.

  1. Run Export Package – exports the data from the SQL instance to the machine where the package is run.
  2. Run Import Package – imports the data into Azure SQL Data Warehouse.

You have just migrated your 'On-Premises/Enterprise Data Warehouse' to 'Azure SQL Data Warehouse' successfully.

Getting Started With Azure Object Blob Storage

In the last article, Fundamentals Of Azure Storage, we went through the nitty-gritty of Azure unstructured storage. If you have not gone through it yet, please take a moment to look over 'Fundamentals of Azure Storage'.

Let's focus on advanced Azure Object (Blob) storage,

Azure Object (Blob) storage

OBJECT STORAGE

  • It is a service that stores unstructured data of any size, from a few bytes to hundreds of gigabytes.
  • Typical use cases are web and mobile application data, so if you have a website with very high-scale services, you would use blob storage.
  • It can also be used for big data scenarios such as IoT and device telemetry. Genomics is another example, where the genome sequencing data for a single person can run from terabytes to hundreds of terabytes.
  • A more canonical use case for our world is backup and archiving of data. It is a growing problem for enterprises because it is not cost-effective to put that data on very expensive on-premises hardware and then have to manage it, or migrate it when you retire the hardware, so this is a great use case.

KEY FEATURES

  • You can store petabytes, tens of petabytes, even hundreds of petabytes; scale is not a problem.
  • I already talked about durability, consistency, and scaling up in the Fundamentals article.
  • The last point I'd like to touch on is cost effectiveness. Services like Cool Storage can be very cost-effective compared with the on-premises cost of the same thing. Counting raw array cost, service and support, under-provisioning, and migration time when buying a new array, the TCO is typically much lower when using a public cloud service.

Azure Object (Blob) storage

WHY PLATFORM AS A SERVICE (PaaS)?

  • We always say that blob storage is ideal for PaaS, but the question is why? If you are building an application, why is that the case?
  • Limitless scale and not having to worry about provisioning. As an application developer, whether you need a terabyte of data, a petabyte, or you simply don't know yet, you will be able to scale to what you need when you need it.
  • It is globally accessible, with the ability to deploy your data where your users or customers are. If you are part of an enterprise that has a regional presence, or you have customers who are globally present, you can store the data locally, reduce round-trip times, and improve the customer experience. For example, we could deploy our data in Canada and New Zealand in no time.
  • Cost efficiency is, of course, an undeniable fact.
  • Storage capability – a petabyte is approximately 1,000 terabytes, or one million gigabytes. It is hard to visualize what a petabyte could hold: roughly 20 million four-drawer filing cabinets full of text, or 500 billion pages of standard printed text. (Just visualize the scale.)

Azure Object (Blob) storage

CONCEPTS

  • Let's look at some key concepts for Azure Storage and, ultimately, blob storage.
  • Every Azure account has a subscription in which you can create one or more storage accounts, which are essentially logical containers for your data.
  • Today you can put up to 500 TB in each storage account, and you can have up to 250 of them, but really you can have as many as you need, even if you need hundreds of petabytes of data.
  • Object store systems typically don't have a traditional directory hierarchy; each object just has a name. So a layer of "folders", called containers, was added. You can put your blob data inside containers, or you can place it in the root without any container if you like.
  • Every blob has a URI (see the example below).

Azure Object (Blob) storage
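
To make the addressing concrete, every blob can be reached over HTTP(S) at a predictable address; the account, container, and blob names below are hypothetical.

using System;

class BlobUriExample
{
    static void Main()
    {
        // Pattern: https://<storage-account>.blob.core.windows.net/<container>/<blob-name>
        // All three names here are placeholders.
        var accountName = "mystorageaccount";
        var containerName = "images";
        var blobName = "photo.jpg";

        var blobUri = new Uri(string.Format("https://{0}.blob.core.windows.net/{1}/{2}",
            accountName, containerName, blobName));

        Console.WriteLine(blobUri); // https://mystorageaccount.blob.core.windows.net/images/photo.jpg
    }
}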

BLOCK BLOBS

  • Probably the one you'll use most often is the block blob, so called because the data is stored in "blocks". You can use it to store all kinds of unstructured data, from small to large: pictures and videos, for example (think YouTube-style video, live traffic recordings, or satellite imagery).

APPEND BLOBS

  • We also have another kind of blob that is used less frequently, but for those who need it, append blobs are essential. They are very much like block blobs, except that they are optimized for multi-writer append scenarios.
  • For example, MapReduce jobs where multiple writers append output data, or log writers on hundreds of machines all trying to write logs. Append blobs work very well for these append scenarios because you don't have to lock the blob and incur concurrency issues (a small sketch follows this list).
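
Here is a minimal C# sketch of the append scenario, assuming the Microsoft.WindowsAzure.Storage client library (the WindowsAzure.Storage NuGet package of that era); the connection string, container, and blob names are placeholders.

using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class AppendBlobLogger
{
    static void Main()
    {
        // Placeholder connection string (assumption); normally this lives in App.config.
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=<key>");

        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("logs");
        container.CreateIfNotExists();

        // Many writers can append to the same blob without locking it first.
        CloudAppendBlob log = container.GetAppendBlobReference("app.log");
        if (!log.Exists())
        {
            log.CreateOrReplace();
        }
        log.AppendText("Worker started at " + DateTime.UtcNow + Environment.NewLine);
    }
}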

PAGE BLOBS

  • Page blobs are big, sparse files in which you can store 512-byte "pages" of data. You can read and write data randomly within a page blob, effectively like a disk. Page blobs were built to underlie the disk implementation for VMs in Azure.
  • There are also some interesting customer use cases and scenarios for page blobs, so they are exposed externally through a REST API. Many internal Azure services use them; for example, the Event Hubs service stores data in page blobs to make some of its access scenarios faster.
  • Microsoft also has external storage partners using page blobs for services such as file systems that de-duplicate data (a small sketch follows this list).

Azure Object (Blob) storage
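
And a comparable sketch for page blobs, again using the Microsoft.WindowsAzure.Storage client library with placeholder names; note that the blob size and every write must be aligned to 512-byte pages.

using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class PageBlobExample
{
    static void Main()
    {
        // Placeholder connection string (assumption).
        var account = CloudStorageAccount.Parse(
            "DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=<key>");

        CloudBlobContainer container = account.CreateCloudBlobClient().GetContainerReference("pages");
        container.CreateIfNotExists();

        // Page blobs are sparse: reserve a size up front, then read/write 512-byte pages at random offsets.
        CloudPageBlob pageBlob = container.GetPageBlobReference("random-access.dat");
        pageBlob.Create(1024 * 1024); // 1 MB; the size must be a multiple of 512 bytes

        // Write a single 512-byte page at offset 0.
        byte[] page = new byte[512];
        using (var stream = new MemoryStream(page))
        {
            pageBlob.WritePages(stream, 0);
        }
    }
}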

Now, let's create our first blob storage account (prerequisite: an Azure portal account).

  1. Log in to the Azure portal and you will land on the dashboard.
  2. Click + New -> Storage -> Storage Account (under featured apps), as shown below.
  3. Fill in the details, as shown below.

You can choose options as per your business needs and, as we discussed earlier, you may click on the '!' sign for more details,

Azure Object (Blob) storage

Azure Object (Blob) storage

Choose a location based on your geography; a data center in the same geographical region is suggested (Canada Central in my case). Then click 'Create'.

It will start deployment as shown below,

Azure Object (Blob) storage

If you don't have a container yet, it will ask you to create one, as shown below,

Azure Object (Blob) storage

Provide the container details, as shown below,

Azure Object (Blob) storage

Choose options as per your business needs,

Azure Object (Blob) storage

And click 'Create'.

You are now good to move ahead.

Azure Object (Blob) storage

Let’s upload some Images, as shown below

Azure Object (Blob) storage

Click 'Upload', and the image will be uploaded.

Azure Object (Blob) storage

Access Blob Storage from Tool and Code

Now, let's access blob storage, first from a tool.

  1. I have attached the 'Microsoft Storage Explorer' tool for you to download and set up.
  2. Please download it, provide your subscription and storage account details, and sign in.

    Azure Object (Blob) storage

You can see the image uploaded to the storage account,

Azure Object (Blob) storage

Now, let's access it by code.

Please find the attached C# running code for the same; you only need to change the App.config setting, as shown below.

Azure Object (Blob) storage

Azure Object (Blob) storage
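
In case you cannot open the attachment, the core of such a program typically looks like the sketch below (using the Microsoft.WindowsAzure.Storage client library); the App.config key name, container, file path, and account details are placeholders, so adjust them to your own settings.

using System;
using System.Configuration;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

class BlobStorageDemo
{
    static void Main()
    {
        // Read the connection string from App.config; the key name is a placeholder (assumption):
        // <appSettings>
        //   <add key="StorageConnectionString"
        //        value="DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>" />
        // </appSettings>
        var connectionString = ConfigurationManager.AppSettings["StorageConnectionString"];
        var account = CloudStorageAccount.Parse(connectionString);

        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("images");
        container.CreateIfNotExists();

        // Upload a local image as a block blob (the path is a placeholder).
        CloudBlockBlob blob = container.GetBlockBlobReference("photo.jpg");
        blob.UploadFromFile(@"C:\temp\photo.jpg");

        // List everything in the container to confirm the upload.
        foreach (IListBlobItem item in container.ListBlobs(useFlatBlobListing: true))
        {
            Console.WriteLine(item.Uri);
        }
    }
}

Running it against the container created above should print the URI of the image you just uploaded.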

I have attached the 'Demo Project' and the 'Microsoft Azure Storage' container for you, to accelerate your understanding with a practical implementation of the storage concepts. I hope my Azure series of articles increases the length and breadth of your cloud knowledge base.

Until next time, happy learning!!