Monthly Archives: May 2017

Overview Of Microsoft Azure Storage – Part Three

This is the third part of the series on Azure Storage. In this article, you’ll see how to perform different operations on storage, such as creating a container, uploading blobs to the container, listing blobs, and deleting blobs through code. For this, you need the prerequisites given below.

  • Microsoft Visual Studio 2015 (.NET Framework 4.5.2).
  • Microsoft Azure Account.

Let’s start and follow the steps in order to work with one of the most useful Azure storage services, Blob Storage. By now, you should know what Blob Storage in Azure is; if you want to learn Azure from the beginning, you can follow this series from the start. To begin, create a storage account by signing in to the Azure Portal.

After creating a storage account on Azure portal, open Visual Studio 2015 and create a new project.


Select MVC template and click OK button.


Here, my AzureCloudStorage project is ready for use. I need some Azure Storage references in the project, along with the Azure storage account connection string. Right-click the project and add a Connected Service.


We are going to work with an Azure storage account, so select Azure Storage and click the Configure button to configure your account.

This will ask you to enter your Azure credentials. After a successful login, your storage account is configured automatically. If you don’t have a storage account, you can create a new one from the given option.


All is set now. Next, write the necessary code in a controller. To create a controller, right-click the Controllers folder and add a controller.


My controller has been created. Here, add a method that returns an ActionResult. In this method, create a container in the existing storage account.


Create a Container in the Storage Account

Create a CloudStorageAccount object, which represents the storage account information, such as the storage account name and key. This setting exists in the Web.config file; to read it, we parse the connection string through CloudConfigurationManager. Then create the service client object, CloudBlobClient, which uses these credentials to access the Blob service. Finally, create a CloudBlobContainer object, which retrieves a reference to a container. Give the container the name you want to create, unless a container with that name already exists.
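The connection string stored in Web.config follows a well-known format: a semicolon-separated list of key=value settings (DefaultEndpointsProtocol, AccountName, AccountKey). As a language-neutral illustration, here is a minimal sketch in Python of how such a string breaks apart; the account name and key below are placeholders, not real credentials.

```python
def parse_connection_string(conn_str):
    """Split an Azure Storage connection string into a dict of its settings."""
    settings = {}
    for segment in conn_str.split(";"):
        if not segment:
            continue
        # Split on the first '=' only: account keys are base64 and may end in '='
        key, _, value = segment.partition("=")
        settings[key] = value
    return settings

# Placeholder values for illustration -- not a real account key
example = ("DefaultEndpointsProtocol=https;"
           "AccountName=mystorageazure01;"
           "AccountKey=abc123==")
print(parse_connection_string(example)["AccountName"])  # mystorageazure01
```

CloudStorageAccount.Parse does this same splitting internally before building the service clients.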



I’ve used ViewBag to show whether the container was created: if it doesn’t exist, it is created; otherwise, the view reports that it already exists. I’ve also used ViewBag to show the container name on the view.

    namespace AzureCloudStorage.Controllers
    {
        public class MyStorageController : Controller
        {
            // Create a container in the existing storage account
            public ActionResult CreateStorageContainer()
            {
                CloudStorageAccount userstorageAccount = CloudStorageAccount.Parse(
                    CloudConfigurationManager.GetSetting("mystorageazure01_AzureStorageConnectionString"));
                CloudBlobClient myBlobClient = userstorageAccount.CreateCloudBlobClient();
                CloudBlobContainer myContainer = myBlobClient.GetContainerReference("mystoragecontainer001");
                ViewBag.UploadSuccess = myContainer.CreateIfNotExists();
                ViewBag.YourBlobContainerName = myContainer.Name;
                // Store the container name in session so the other actions can display it
                Session["getcontainer"] = myContainer.Name;
                return View();
            }
        }
    }

Resolve the references; the main ones are given below.

    using Microsoft.Azure;
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;


As you know, views are used to display the data. Let’s create a view for the CreateStorageContainer method by right-clicking inside it.


Keep the view name as it is, and use this view with the layout page.


Here is a small code snippet on the view to display the result in the browser: if the container was created, show a success message; otherwise, report that a container with the same name already exists. The screenshot given below is for CreateStorageContainer.cshtml.


Open the _Layout.cshtml file and add the Action Link required to create the storage container.


Run the application with CTRL+F5. The Action Link text is available; click it to create a container (mystoragecontainer001) in your storage account.


After clicking it, you’ll be redirected to your controller method, and the container should be created on the Azure portal successfully.


Move to the Azure portal, and you will find the container has been created successfully under the given storage account.


Now, go back to the dashboard and click the Action Link again to check whether it will create a container with the same name or not.


No; it shows that the container already exists in your storage account.

Upload a Block Blob or Page Blob into Selected Container

As explained earlier in this series, Azure Storage supports different kinds of blobs: page blobs, block blobs, and append blobs. In this demo, I am going to cover only block blobs. I need to upload a blob from a local file. Here, I create a CloudBlockBlob object to retrieve a reference to a block blob. Through the reference’s UploadFromStream method, you can upload your data stream, creating the blob if it does not already exist.

Here, I’m using a session value, set when the container was created, to pass the container name between action methods; the view then reads it through ViewBag to display the container name.

Create a method to upload a blob

Now, read the previously stored session value into a string and expose it, along with the blob name, through ViewBag.

    // Upload blobs to the selected container
    public ActionResult UploadYourBlob()
    {
        CloudStorageAccount userstorageAccount = CloudStorageAccount.Parse(
            CloudConfigurationManager.GetSetting("mystorageazure01_AzureStorageConnectionString"));
        CloudBlobClient myBlobClient = userstorageAccount.CreateCloudBlobClient();
        CloudBlobContainer myContainer = myBlobClient.GetContainerReference("mystoragecontainer001");
        // Get a blob reference by entering its name
        CloudBlockBlob myBlob = myContainer.GetBlockBlobReference("lion3.jpg");
        // Local path from which you want to upload your blob
        using (var fileStream = System.IO.File.OpenRead(@"D:\Amit Mishra\lion3.jpg"))
        {
            myBlob.UploadFromStream(fileStream);
            ViewBag.BlobName = myBlob.Name;
        }
        // Read the container name stored in session and pass it to the view
        string getcontainer = Session["getcontainer"].ToString();
        ViewBag.YourBlobContainerName = getcontainer;
        return View();
    }

Right-click inside the ActionResult method and add a view for it.


Create a view to display the data.


Here, I am writing a simple snippet in UploadYourBlob.cshtml.

    @{
        ViewBag.Title = "UploadYourBlob";
        Layout = "~/Views/Shared/_Layout.cshtml";
    }
    <h3>Uploaded…..☺</h3><br />
    <h2>This Blob <b style="color:purple">@ViewBag.BlobName</b> has been uploaded into your <b style="color:red">@ViewBag.YourBlobContainerName</b> container</h2>

Here, I want to show a simple demo with all the information on the CreateStorageContainer.cshtml view. Go to CreateStorageContainer.cshtml and add the snippet shown below.


The code snippet given above shows the result given below.


The blue text is an action link; clicking it redirects to the UploadYourBlob.cshtml view.


If you try to upload it again, the portal shows that the blob named lion3.jpg already exists, as shown.

Get a List of Available Blobs in the Container

Now it’s time to get the list of blobs available in your container. Create an ActionResult that returns the blob names in the container as a list of strings.

    public ActionResult ListingBlobs()
    {
        CloudStorageAccount userStorageAccount = CloudStorageAccount.Parse(
            CloudConfigurationManager.GetSetting("mystorageazure01_AzureStorageConnectionString"));
        CloudBlobClient myBlobClient = userStorageAccount.CreateCloudBlobClient();
        CloudBlobContainer myContainer = myBlobClient.GetContainerReference("mystoragecontainer001");
        List<string> listblob = BlobList(myContainer);
        return View("CreateStorageContainer", listblob);
    }

Here, I am creating a common function that returns the list of blob names as strings, so it can be used from different methods. For the listing, we use the CloudBlobContainer.ListBlobs method, which returns IListBlobItem objects; we cast each item according to its concrete type.

    private static List<string> BlobList(CloudBlobContainer myContainer)
    {
        List<string> listblob = new List<string>();
        // Flat listing is off, so virtual directories appear as CloudBlobDirectory items
        foreach (IListBlobItem item in myContainer.ListBlobs(null, false))
        {
            if (item.GetType() == typeof(CloudBlockBlob))
            {
                CloudBlockBlob myblob = (CloudBlockBlob)item;
                listblob.Add(myblob.Name);
            }
            else if (item.GetType() == typeof(CloudPageBlob))
            {
                CloudPageBlob myblob = (CloudPageBlob)item;
                listblob.Add(myblob.Name);
            }
            else if (item.GetType() == typeof(CloudBlobDirectory))
            {
                CloudBlobDirectory dir = (CloudBlobDirectory)item;
                listblob.Add(dir.Uri.ToString());
            }
        }
        return listblob;
    }

Call the same common function, which returns the list of blob names, from the CreateStorageContainer method as well.

    List<string> listblob = BlobList(myContainer);

Now, right-click inside the ListingBlobs method and add a view.


In the view, write the code snippet given below.


Open the CreateStorageContainer.cshtml file and read the list from the Model. Update your view with the code given below.

    @model List<string>
    @{
        ViewBag.Title = "CreateStorageContainer";
        Layout = "~/Views/Shared/_Layout.cshtml";
    }
    <h2>Create Storage Container Into Storage Account</h2>
    <h3>
        Hey, this container <b style="color:red">@ViewBag.YourBlobContainerName</b>
        @(ViewBag.UploadSuccess == true ?
            "has been successfully created." : "already exists in your storage account.☻")
    </h3><br />
    @if (ViewBag.UploadSuccess == true)
    {
        <h4>
            Your container has been created now ☺ You can upload blobs (photos, videos, music, blogs, etc.)<br /> into your container by clicking below.
            @Html.ActionLink("Click Me To Upload a Blob", "UploadYourBlob", "MyStorage")
        </h4>
    }
    else
    {
        <h4>
            Your container already exists ☺ You can upload blobs (photos, videos, music, blogs, etc.)<br /> into your container by clicking below.
            <b>@Html.ActionLink("Click Me To Upload a Blob", "UploadYourBlob", "MyStorage")</b>
        </h4>
    }
    @Html.Partial("~/Views/MyStorage/ListingBlobs.cshtml", Model.ToList())


Run the application. If there are any blobs available in your container, you can see the list of blobs.


Download Your Blob from Container

Add download and delete links to the ListingBlobs.cshtml view.


When you click a file, the blob should be downloaded.


Let’s write the code for downloading a blob. The myBlob.DownloadToStream method transfers the content of the blob to the stream object. Give a path where you want to save the downloaded blob.

    public ActionResult DownloadYourBlob()
    {
        CloudStorageAccount userstorageAccount = CloudStorageAccount.Parse(
            CloudConfigurationManager.GetSetting("mystorageazure01_AzureStorageConnectionString"));
        CloudBlobClient myBlobClient = userstorageAccount.CreateCloudBlobClient();
        CloudBlobContainer myContainer = myBlobClient.GetContainerReference("mystoragecontainer001");
        CloudBlockBlob myBlob = myContainer.GetBlockBlobReference("lion1.jpg");
        // Download the blob to a local file (the path must include a file name)
        using (var fileStream = System.IO.File.OpenWrite(@"D:\downloads\lion1.jpg"))
        {
            myBlob.DownloadToStream(fileStream);
            ViewBag.Name = myBlob.Name;
        }
        string getcontainer = Session["getcontainer"].ToString();
        ViewBag.YourBlobContainerName = getcontainer;
        return View();
    }

Add a view of the current method.


Click Add.


In DownloadYourBlob.cshtml, write the snippet given below.

    @{
        ViewBag.Title = "DownloadYourBlob";
        Layout = "~/Views/Shared/_Layout.cshtml";
    }
    <h2>Blob has been downloaded</h2>
    <b style="color:purple">@ViewBag.Name</b> downloaded to the folder

Run your application once again and click the download link in the list of blobs.


The blob should be downloaded.


If you want to delete the blob which you downloaded recently, you have to create another ActionResult for deleting blobs.

Write similar code: get the CloudBlockBlob reference and call its Delete method.

    public ActionResult DeleteYourBlob()
    {
        CloudStorageAccount userstorageAccount = CloudStorageAccount.Parse(
            CloudConfigurationManager.GetSetting("mystorageazure01_AzureStorageConnectionString"));
        CloudBlobClient myBlobClient = userstorageAccount.CreateCloudBlobClient();
        CloudBlobContainer myContainer = myBlobClient.GetContainerReference("mystoragecontainer001");
        CloudBlockBlob myBlob = myContainer.GetBlockBlobReference("lion1.jpg");
        myBlob.Delete();
        ViewBag.DeleteBlob = myBlob.Name;
        return View();
    }

Add a view for this ActionResult and write the snippets given below.

    @{
        ViewBag.Title = "DeleteYourBlob";
        Layout = "~/Views/Shared/_Layout.cshtml";
    }
    <h2>Blob has been deleted</h2>
    <b style="color:purple">@ViewBag.DeleteBlob</b> deleted from the container

Click Delete for the lion1.jpg blob.


The downloaded blob has been deleted successfully.


If you want to create more containers and blobs in your Azure storage account, you only need to change the container and blob names in the code.

Overview Of Microsoft Azure Storage – Part Two

In this article of the series, I’m going to show how we can manage our Azure account using Azure PowerShell ISE, through which we’ll see how to perform different operations on storage; the same can also be done with Azure Explorer. There are some commands that help access the Azure account.

Windows Azure PowerShell ISE (Integrated Scripting Environment)

It is a Windows-based GUI and framework from Microsoft. It provides a predefined set of commands along with a scripting language. PowerShell is a command-line shell, similar to the Command Prompt, with an interactive editor built on the .NET Framework. When we write scripts in PowerShell, we have access to the entire .NET Framework: all the classes, methods, and data types.

PowerShell ISE has a Commands Add-on pane, which helps in conveniently creating different scripts without having to type all the commands in the command line.

Along with the Commands Add-on pane, we have a Script pane for writing scripts and a Command pane. You can toggle the Script and Command panes on and off from the View menu. With the help of the Commands Add-on pane, you don’t need to type any command in the Command pane; just select the command you want from the Commands Add-on pane and run it.

The Commands Add-on pane has several options to narrow down your commands. For example, if you want to work only with the Azure storage account, just select Azure.Storage from the Modules dropdown; then only the list of Azure Storage commands is shown on your screen. You can also type in the Name box to filter for whatever you want to work with. If you select the Azure.Storage module and want to run a command from the available list, select it and click Insert; it will show the parameters, and you can then click Run to run the command.


PowerShell ISE is installed by default with Windows 10; the Azure PowerShell module is installed with the Azure SDK. In Modules, select Azure.


You can see the list of Azure-related commands. Here, I need to connect my Azure account credentials with PowerShell; select the highlighted command, as shown in the following screenshot.


Click on Show Details to add parameters.


Click Insert to insert the command into the Command pane, and click Run to run it.


That will open a sign-in page for your Microsoft account. Enter your credentials to add the account to PowerShell ISE.


When your account has been added, PowerShell returns your Azure account ID to confirm that you’re connected.


In my Azure account, I have only one subscription, “Visual Studio Enterprise with MSDN”. If you want to confirm your subscription, run the Get-AzureSubscription command.



If you want to see all the available commands, you can run a command that shows the list with descriptions.

Run: Get-Command


Create a new storage account.

To create a new storage account, give a storage account name (in lowercase letters) and the specific location where you want to create the account.

New-AzureStorageAccount -StorageAccountName "mystoragecontainer002" -Location "East Asia"


Get your storage account details in the PowerShell cmdlet; it returns the full details of your storage account.



Check on the Azure portal to see whether our new storage account has been created.


Yes, it has been created. Now, I want to create a container in this account. If you write the command for creating a container directly, Azure PowerShell doesn’t know in which storage account you want to create the container. So I have to tell Azure PowerShell which account to use by setting a default storage account. To perform all the following operations against your default storage account, write the following command.

Set default storage account to new storage account

Set-AzureSubscription -CurrentStorageAccountName "mystoragecontainer002" -SubscriptionName "Visual Studio Enterprise with MSDN"


Get Azure Storage Account Keys through Storage Account Name

Get-AzureStorageKey -StorageAccountName "mystoragecontainer002"


Create a Container

To create a new container, give it a (lowercase) name. In previous articles, I’ve discussed that, for anonymous users to access data in your containers, you have to change the permission for anonymous read access. If you set your container’s permission to Off, anonymous users can’t access your data. You can set the permission to Off, Blob, or Container, as discussed in one of the series’ articles.

New-AzureStorageContainer -Name "mycontainerforazure" -Permission Off


Check on the Azure portal; the container should be created.


Retrieve a Container by Name

If you want to retrieve a container by its name, you can run the following command.

Get-AzureStorageContainer -Name "mycontainerforazure"


Upload a Blob into Container

You know that every blob resides in a particular container, so I need to pass the name of the container in which I want to upload a blob, along with an appropriate file path, in the following command.

Set-AzureStorageBlobContent -Container "mycontainerforazure" -File "D:\downloads\lion.jpg"

In the following screenshot you can see I’ve uploaded two different blobs into the same container.


Retrieve a list of available blobs from Container

This command retrieves all the blobs that are in your Container.

Get-AzureStorageBlob -Container "mycontainerforazure"


Delete a Selected Blob from Container

To delete a blob, you must know the blob name; use the Remove-AzureStorageBlob cmdlet with the container and blob names, as shown in the following screenshot.


Download Blobs into Local Folder

If you want to download your blobs into a local directory, you have to run a couple of commands.

  1. Get a reference to a list of all blobs in a container:
    $blobs = Get-AzureStorageBlob -Container "mycontainerforazure"
  2. Create the destination directory where you want to save your items:
    New-Item -Path "D:\downloads" -ItemType Directory -Force


  3. This command downloads the blobs into your local destination directory:
    $blobs | Get-AzureStorageBlobContent -Destination "D:\downloads"

The screenshot below shows this kind of confirmation because I already have the same blobs in that folder.


Wait a moment; it retrieves all the blobs in the container and shows them in the PowerShell console.


Just like Blob Storage, you can run commands against Table Storage, Queue Storage, and File Storage in Azure PowerShell ISE.

Overview Of Microsoft Azure Storage – Part One

Azure Storage is one of the strongest cloud storage foundations for modern applications. You can access the scalable Azure Storage services and applications from anywhere in the world. You can store large amounts of data (objects) using an Azure storage account. By creating an Azure storage account, you can access the Azure Blob, Table, Queue, and File Share services. The data in each storage account is available only to its user, so IT professionals, business owners, and developers can take advantage of it across several technologies. Azure Storage also provides storage for Azure VM (Virtual Machine) disks, which you will read about in further articles.

Azure Storage supports a range of operating systems and a number of programming languages for scalable and convenient development. Azure Storage uses simple REST APIs, so anyone can send and receive data via HTTP/HTTPS. The following types of storage are available:

  1. Blob Storage
  2. Table Storage
  3. Queue Storage
  4. File Storage

These are the standard storage types.
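Each of these services is exposed at its own HTTPS endpoint derived from the storage account name (account.blob.core.windows.net, account.table.core.windows.net, and so on). A small sketch in Python, just to make the naming pattern concrete:

```python
def service_endpoint(account, service):
    """Default public endpoint for a standard storage service."""
    if service not in ("blob", "table", "queue", "file"):
        raise ValueError("unknown storage service: " + service)
    return "https://{0}.{1}.core.windows.net".format(account, service)

print(service_endpoint("mystorageazure01", "blob"))
# https://mystorageazure01.blob.core.windows.net
```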


An Azure VM disk is a place to store any kind of data, applications, and operating systems; this is known as Premium Storage within Azure. To use these storage services, we have to create an Azure storage account.

So, let’s create an Azure Storage Account.

Log in to your Microsoft Azure account using your credentials.

Just go to the new Azure portal and follow the steps given here to create a storage account.

Go to New > Storage > Storage Account.


You need to fill in some information for creating your account.

  1. Storage Account Name

    You have to enter a unique name for the account, in lowercase letters. When the name is unique, a green check mark appears.
  2. Deployment Model

    The deployment model determines how you deploy and manage Azure solutions and how Azure resources are grouped. Azure Resource Manager provides a way of managing Azure resources in resource groups. With the Classic model, on the other hand, you can only work with classic resources.

  3. Account Kind

    Choose the kind of your account. There are two options: General Purpose, and Blob Storage, which stores your unstructured data as blobs.

  4. Performance

    Standard Storage Account

    Where you can store unstructured object data (blobs), tables, queues, files, and Azure VM disks under a single account.

    Premium Storage Account

    It is supported only for Azure Virtual Machine disks. This account type is known for high performance.

  5. Replication

    You have to select one of the following replication options in the given dropdown list:

    Locally redundant storage (LRS) – It stores three copies of your data within a single datacenter in the region that you have selected.

    Zone-redundant storage (ZRS) – It stores three copies of your data across multiple datacenters in the region that you’ve selected.

    Geo-redundant storage (GRS) – It stores six copies in total: three copies in your primary region and three copies in a secondary region, so your data survives a regional outage.

    Read-access geo-redundant storage (RA-GRS) – The same as GRS, except that if the primary region goes down, the data in the secondary region is instantly available for read-only access.

  6. Storage Service Encryption

    It encrypts your data when it is written into Azure Storage. This protects your data at rest.
  7. Resource Group

    You can manage your storage under one resource group, either by creating a new resource group or by using an existing one.
  8. Location

    Your preferred location.
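The replication options above differ mainly in how many copies of your data exist and where they live. Summarized as a small lookup (a sketch based on the descriptions above):

```python
# Number of copies kept by each replication option
REPLICA_COUNT = {
    "LRS": 3,     # three copies in a single datacenter
    "ZRS": 3,     # three copies across datacenters in the region
    "GRS": 6,     # three in the primary region plus three in a secondary region
    "RA-GRS": 6,  # same copies as GRS, plus read access to the secondary
}

# Only the geo-redundant options survive the loss of an entire region
GEO_REDUNDANT = {name for name, copies in REPLICA_COUNT.items() if copies == 6}
print(sorted(GEO_REDUNDANT))  # ['GRS', 'RA-GRS']
```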

After completing all the necessary steps, you can click on the “Create” button to create your Storage account.


It’ll take some time to create your account. You can find your Storage account within your resource group by clicking on All Resources.


Click your storage account. You can access its features from the given list, such as properties, access keys, activity log, and many more.


When you click “Overview”, you’ll see all the fundamental storage services: Blobs, Tables, Files, and Queues. In this article, I’m going to explain one of the most useful of these, Blob Storage in Azure.

Blobs in Azure

The Blob service provides scalable and highly available storage for large files in the cloud, within certain limits and constraints. It stores arbitrarily large amounts of unstructured binary and text data using a key-value system. Blob Storage can be used to store and retrieve Binary Large Objects (BLOBs) from anywhere in the world via HTTP or HTTPS. A single blob can be hundreds of gigabytes in size, and a blob can store a file of any type. Typical uses include:

  • Storing files for distributed access whether you want to access from Azure Explorer or directly from any other resource.
  • Streaming video, audio, images, CSS files, and PDF files.
  • You can access Blobs data from any computer.
  • Serving images and documents directly to a browser without authentication tokens.



For Blob Storage, we first create a container. A container provides a grouping for a set of blobs, and every blob resides in a container. A container can store an unlimited number of blobs, and in these blobs we store our unstructured data. In the panel, I don’t have any container yet, so let’s click Container to create a new one.
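Once a blob lives in a container, it is addressable at a predictable HTTPS URI of the form https://&lt;account&gt;.blob.core.windows.net/&lt;container&gt;/&lt;blob&gt;. A tiny Python sketch of that pattern, using this article’s example names:

```python
def blob_uri(account, container, blob_name):
    """Build the public URI at which a blob is addressable."""
    return "https://{0}.blob.core.windows.net/{1}/{2}".format(
        account, container, blob_name)

print(blob_uri("mystorageazure01", "mystoragecontainer001", "lion3.jpg"))
# https://mystorageazure01.blob.core.windows.net/mystoragecontainer001/lion3.jpg
```

This is the URL you copy from the portal later in this article to serve a blob directly to the browser.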


Give a unique name to your container. If you want your blobs to be accessible only by you, select Private access. If you select Blob, the blob data in that container can be read by anonymous request. If you select Container, both the container and its blob data can be read by anonymous users.
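The three access levels form a small truth table: Off (private) blocks all anonymous reads, Blob allows anonymous reads of blob data only, and Container additionally allows reading the container listing. A sketch of that rule (the function name is illustrative, not an SDK call):

```python
def anonymous_can_read(access_level, target):
    """target is 'blob' (blob data) or 'container' (the container listing)."""
    if access_level == "Off":        # private: no anonymous access at all
        return False
    if access_level == "Blob":       # anonymous reads of blob data only
        return target == "blob"
    if access_level == "Container":  # anonymous reads of blobs and the listing
        return True
    raise ValueError("unknown access level: " + access_level)

print(anonymous_can_read("Blob", "container"))  # False
```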

Click on “Create” to create a Container.


My container has been created successfully.


As I said, after creating the container, you can upload your blob. Click “Upload”.


There are three types of blobs that can be stored in Azure Storage: block blobs, page blobs, and append blobs. Most files are block blobs.


Block Blob

You can upload large blobs efficiently by splitting them into blocks. A block can be up to a maximum of 100 MB, and a block blob can include up to 50,000 blocks.
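A quick arithmetic check of those limits: 50,000 blocks of 100 MB each gives a theoretical maximum of about 4.77 TB per block blob. (Older service versions capped blocks at 4 MB, which worked out to roughly 195 GB and is where the commonly quoted “200 GB” figure comes from.)

```python
MAX_BLOCK_MB = 100
MAX_BLOCKS = 50000

max_blob_mb = MAX_BLOCK_MB * MAX_BLOCKS        # 5,000,000 MB
max_blob_tb = max_blob_mb / (1024.0 * 1024.0)  # MB -> TB
print(round(max_blob_tb, 2))  # 4.77

# Older 4 MB block limit: 4 MB * 50,000 blocks, about 195 GB
old_max_gb = 4 * 50000 / 1024.0
print(round(old_max_gb, 1))  # 195.3
```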


I’ve uploaded an MP4 video. Let’s see how it works.


Click on “Access policy” to change your access.


I want anonymous users to be able to access my blob, so I have to change the access type to Blob.


I would like to serve it to the browser, so just click your blob and copy the URL.


Paste the URL in the browser, and you’ll find that it works.


Page Blob

Page blobs are used for random read/write operations and can be up to 1 TB in size. They have an index-based data structure.


Append Blob

Each block of an append blob can be up to a maximum of 4 MB. Append blobs are optimized for fast append operations and are most similar to block blobs.


You can upload a blob using PowerShell, using Azure Explorer, or with Visual Studio by using the storage account access keys. We will cover this in upcoming articles, where I’ll also discuss Table, Queue, and File Storage.

Getting Started With Azure Object Blob Storage

In the last article, Fundamentals of Azure Storage, we went through the nitty-gritty of Azure unstructured storage. If you have not gone through that article, please take a moment to look it over.

Let’s focus on advanced Azure Object (Blob) storage.

Azure Object (Blob) storage


  • It is a service that stores unstructured data of any size, from a few bytes to hundreds of GBs.
  • Typical use cases are web and mobile application data, so if you have a website with very high-scale services, you would use Blob storage.
  • It can also be used for big data scenarios such as IoT and device telemetry. For example, in genomics, the genome sequence for a single person could be TBs to hundreds of TBs of data.
  • A more canonical use case is backup and archiving of data. It’s a growing problem for enterprises because it’s not cost-effective to put that data on very expensive on-premises hardware and then have to manage it, or migrate it when you retire the hardware, so this is a great use case.


  • You can store PBs, tens of PBs, even hundreds of PBs; scale is not a problem.
  • I already talked about durability, consistency, and scaling up in the Fundamentals article.
  • The last point I’d like to touch on is cost-effectiveness. Services like Cool Storage can be very cost-effective compared to the on-premises cost of the same thing. Considering raw array cost, service and support, under-provisioning, and migration time when buying a new array, the TCO is typically much lower when using a public cloud service.

Why Platform as a Service (PaaS)?

  • We always say that Blob storage is ideal for PaaS; the question is why? If you are building an application, why is that the case?
  • Limitless scale, and not having to worry about provisioning. As an application developer, whether you need a TB of data, a PB of data, or you simply don’t know yet, you’re going to be able to scale to what you need when you need it.
  • It’s globally accessible, with the ability to deploy your data where your users or customers are. If you are part of an enterprise with a regional presence, or you have customers who are globally distributed, you can store the data locally, reduce round-trip times, and improve the customer experience; e.g., we could deploy our data in Canada and New Zealand in no time.
  • Cost efficiency is, of course, an undeniable fact.
  • Storage capacity: a petabyte is approximately 1,000 terabytes, or one million gigabytes. It’s hard to visualize what a petabyte could hold: approximately 20 million four-drawer filing cabinets full of text, or 500 billion pages of standard printed text. (Just visualize the scale.)


  • Let’s look at some key concepts for Azure Storage, and ultimately BLOB Storage.
  • Every Azure account has a subscription in which you can create one or more Storage Accounts, which are essentially logical containers for your data.
  • Today you can put up to 500 TB in each Storage Account, and you can have up to 250 of them, but really you can have as many as you want, even if you need 100s of PBs of data.
  • Object store systems don’t typically have a traditional directory hierarchy; each object just has a name. So we added a layer of “folders”, which we call Containers. You can put your BLOB data inside Containers, or you can just put it in the root without any container if you like.
  • Every BLOB has a URI.
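To make the URI point concrete, here is a small Python sketch (the account, container, and blob names are hypothetical) of the standard public blob endpoint format, `https://<account>.blob.core.windows.net/<container>/<blob>`. Blobs placed “in the root” live in the special `$root` container, whose name may be omitted from the URI:

```python
def blob_uri(account: str, blob_name: str, container: str = "$root") -> str:
    """Build the public HTTPS URI for a blob in the standard
    Azure Blob Storage endpoint. Blobs "in the root" live in the
    special container named $root, which is omitted from the URI."""
    if container == "$root":
        return f"https://{account}.blob.core.windows.net/{blob_name}"
    return f"https://{account}.blob.core.windows.net/{container}/{blob_name}"

print(blob_uri("mystorage", "photo.jpg", "images"))
# -> https://mystorage.blob.core.windows.net/images/photo.jpg
```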


  • Probably the one you’ll use most often is the Block BLOB, so called because you store data in “blocks”. You can use it to store all kinds of unstructured data, from small to large: pictures and videos, for example (YouTube-style video, live traffic recordings, satellite imagery).


  • We also have another kind of BLOB that is less frequently used, but for those who need it, Append BLOBs are essential. They are very much like Block BLOBs, except that they are optimized for multi-writer append scenarios.
  • For example, MapReduce jobs where you have multiple writers trying to append output data, or log-file writers from 100s of machines all trying to write logs. Append BLOBs work very well for append scenarios, where you don’t have to lock the BLOB and incur concurrency issues.


  • Page BLOBs are big sparse files in which you can store 512-byte “pages” of data. You can read and write data randomly into a Page BLOB, effectively like a disk. We built Page BLOBs to underlie the Disks implementation for VMs in Azure.
  • We also thought there might be some interesting customer use cases and scenarios for Page BLOBs, so we exposed them externally through a REST API. Many internal Azure services use them; for example, the Event Hubs service stores data in Page BLOBs to make some of its access scenarios faster.
  • Microsoft also has external storage partners using Page BLOBs for services such as file systems that de-duplicate data.
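Because Page BLOBs are addressed in fixed 512-byte pages, reads and writes must land on page boundaries. A minimal sketch of the alignment arithmetic involved (pure Python, not a call into any Azure SDK):

```python
PAGE = 512  # page blobs store data in 512-byte pages

def aligned_range(offset: int, length: int) -> tuple[int, int]:
    """Round an arbitrary byte range outward to 512-byte page
    boundaries, returning (aligned_offset, aligned_length)."""
    start = (offset // PAGE) * PAGE
    end = -(-(offset + length) // PAGE) * PAGE  # ceiling to next page
    return start, end - start

# A 100-byte update at offset 1000 touches two pages: the
# 1024-byte aligned range starting at byte 512.
print(aligned_range(1000, 100))  # (512, 1024)
```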

Now, let’s create our first Blob Storage (prerequisite: an Azure portal account).

  1. Log in to the Azure Portal; you will land on the Dashboard.
  2. Click + New -> Storage -> Storage Account under Featured Apps.
  3. Fill in the details.

You can choose options as per your business needs and, as we discussed earlier, you may click the ! sign for more details.


Choose a location as per your geography (the suggested data center is in the same geographical region, Central Canada in my case) and click ‘Create’.

It will start the deployment.

If you don’t have a container yet, you will be asked to create one.

Enter the container details.

Choose options as per your business needs.

Then click ‘Create’.

You are now good to go.

Let’s upload some images.

Click Upload, and the image will be uploaded.

Access Blob Storage from Tool and Code

Now, let’s access Blob Storage, first from a tool.

  1. I have attached the ‘Microsoft Storage Explorer’ tool for download and setup.
  2. Please download it, enter your subscription and storage account details, and sign in.

You can see the image uploaded to the storage account.

Now, let’s access it through code.

Please find the attached running C# code; you only need to change the App.config setting (the storage connection string).
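The attached sample is C#, but the App.config change is just the storage connection string: a semicolon-separated list of key=value pairs. A language-neutral sketch (shown in Python; the account name and key below are fake) of how such a string is structured and parsed:

```python
def parse_connection_string(conn: str) -> dict:
    """Split an Azure storage connection string of the form
    'Key1=value1;Key2=value2;...' into a dict. Account keys are
    base64 and may contain '=' characters, so split each segment
    on the first '=' only."""
    parts = {}
    for segment in conn.split(";"):
        if not segment:
            continue
        key, _, value = segment.partition("=")
        parts[key] = value
    return parts

cs = ("DefaultEndpointsProtocol=https;"
      "AccountName=mystorage;"
      "AccountKey=abc123==")
print(parse_connection_string(cs)["AccountName"])  # mystorage
```

In the C# project, the Cloud Configuration Manager parses this same string for you; you only paste your real account name and key into the config file.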

I have attached the ‘Demo Project’ and the ‘Microsoft Azure Storage container’ for you, to accelerate your understanding with a practical implementation of the storage concepts. I hope my Azure series of articles will increase the length and breadth of your cloud knowledge base.

Until next time, happy learning!!

Fundamentals Of Azure Storage

In this article, I will illustrate how we can deliver high scale and low cost solutions with Azure tiered Cloud storage by covering the points given below.

  • Introduction to Azure storage.
  • Overview of Azure Object (Blob) Storage / Unstructured storage
  • Working with Blob storage
  • Key concepts of Blob storage
  • Architecture
  • Demo with all the steps (screenshots)


  1. Working VS demo of Blob storage project
  2. Storage tools

This article is divided into two parts, which are given below.

  1. Fundamentals of Azure tiered storage
  2. Advanced Azure Object (Blob) storage

What is Azure storage and why do we need Azure storage?

Let’s consider the current storage challenges.

Scalability Limits

  • Let’s try to predict what storage your business will need over the next 1-2 years. You may need a very long capacity-planning lifecycle:
    • What do you have to buy, and what do you have to deploy?
    • How long would it take you to provision another PB of storage? How quickly can you acquire it, provision it, and make it available?
    • Even on the small end, if your capacity is full, how long would it take you to provision even a TB and make it available? It means a purchasing event, rack-and-stack, and deploying some gear.
    • What are the costs and ramifications if you have to decommission storage?

      Consider the problems you face in scaling up and scaling down under the traditional storage capacity-planning model.


  • Data is one of the most valuable assets of the business, so you can’t put it at risk.
  • Golden rule: you can’t lose it, corrupt it, or mess it up. Make sure it can never be lost or, worse, stolen.
  • Any time you are looking at alternative solutions, consider the trustworthiness of the infrastructure solution to which you will deploy your data and entrust its protection.


  • There is an opportunity cost in constraining your business, preventing the ability to generate revenue, because you didn’t provision enough capacity.
  • Paying the burden of under-utilized capacity.
  • The constant overhead of maintenance, migration, and upkeep that doesn’t really add value or unlock the potential of the data.
  • Big events can happen, such as expansion into a new region or entering a new geographical location. What are the costs, and how can we make it seamless?


  • All of the above activities and risk factors tax your business and prevent you from focusing on innovation and progress.
  • Storage should be an asset that helps you utilize your data, not something that prevents your business from being agile, utilizing its data, and being competitive.

The key pillars of Azure are given below.

  • Limitless scalability
  • Trust
  • Cloud Economics

These pillars reflect the intent of enabling business agility for you and your customers.


  • Storage is a fundamental building block for all of Azure Services. Every Azure Service directly or indirectly stores their data in Azure Storage.
  • Microsoft’s consumer and enterprise Services are betting their business on Azure storage including XBOX, Skype, Office 365 and more.

The storage service has a set of capabilities worth talking about.

Hyper scale

  • We see about 6 billion net new objects added to Azure, and almost 60 billion transactions performed on that data.
  • Add that up over a year and the numbers are truly staggering: 100s of trillions of transactions and almost 22 EB of data coming in and out of the system, with the growth curve still accelerating.
  • Azure continues to invest in the infrastructure and constantly takes on new customers, with over 120K new subscriptions per month.


  • Azure is a durable platform: Azure promises never to lose client data.
  • Azure keeps multiple copies of client data and responds to all forms of hardware (HW) failure.
  • The data is constantly scrubbed to protect against corruption or bit rot, by continually reading the data, computing checksums, and correcting any bad copies.


  • Azure now provides encryption at rest, as well as client libraries for application-level encryption.

Highly available

  • Azure is highly available and able to recover automatically.
  • Automatic dynamic load balancing: if you have a set of BLOBs or Table entities and all of a sudden a million visitors try to download an object, Azure Storage notices the traffic increase, dynamically scales out to serve that object over 100s or 1000s of servers, and then scales back down as the traffic subsides.


  • Azure Storage is focused on being the most open platform for client data, regardless of devices or architectures.
  • The REST APIs that clients code against are open and fully documented, and Azure has also built a comprehensive set of client libraries.
  • Azure Storage’s goal is that it makes no difference where the client’s data is or what type of application the client is building: the client builds on a solid platform offering and depends on it to manage data at scale.


  • Azure is the only cloud provider with a great hybrid story.
  • With the release of Windows Server 2016, the new Azure Stack provides a 100% API-consistent implementation of these services, which allows you to build applications and run them on-premises in a locally hosted service or in the Azure public cloud with zero code changes.
  • In addition, Azure also offers services that bridge your datacenter to the cloud, such as StorSimple, Azure Backup, and Azure Site Recovery.

In addition, Azure Storage is used by every Azure service, directly or indirectly, for data persistence, and by nearly every Microsoft service, including O365 and Skype for Business on the enterprise side and OneDrive, Xbox, and Skype on the consumer front.

The storage services have a number of capabilities that are common across the board. Let’s go through each of them briefly.

Azure Storage Services

  • We can break Azure Storage services broadly into two deployment models: IaaS, where you run your own infrastructure, and PaaS, where you use higher-level services and applications developed against the public cloud infrastructure.
  • On the infrastructure (IaaS) side, Azure provides Disks and Files. Disks are single-instance disks attached to VMs, in two flavors, Standard and Premium. Files are shared storage over SMB. Both are provided as services to your VMs.
  • For PaaS, we have three core services: Objects (Blobs), Tables, and Queues.
  • Blob storage offers exabyte scale and the ability to store trillions of objects across different tiers. The current tiers are Hot storage, for higher performance and frequently accessed data, and Cool storage, for less active and archived data.
  • Tables are massively scalable NoSQL key-value stores for structured and semi-structured data.
  • The Queues service is for reliable intra- and inter-service communication, which is great for decoupling your workflows.

All these Services are built on the common unified distributed storage system. Thus, all of the capabilities which we talked about previously are available to all of these Services.

Storage Durability

  • Data platform durability is critical: if you don’t offer it, you don’t have a platform ready for the enterprise.
  • Azure starts with the Distributed File System, which deals with failures, checksums, and data integrity in the background. Another interesting characteristic is strong consistency, that is, read-after-write consistency.
  • When you write data to the Azure platform, it is written to three different disks on three different racks, and Azure acknowledges the write only when it has been committed to all three copies.
  • If you have an application that is reading and writing, you never have the eventual-consistency problem that some cloud platforms have.


  • LRS, or Locally Redundant Storage, keeps 3 replicas on 3 disks on 3 different racks in a single region, which protects against failures at that level. The client never has to think about traditional forms of protection such as RAID. You may still need backup for other reasons, but for HW failures it is no longer necessary.
  • Stepping up from there, if you need your data to be resilient against major regional disasters (earthquakes, floods), we offer automatic geo-replication to another region 100s of miles away. This replication is asynchronous, unlike LRS, which is synchronous and strongly consistent. Thus, within a couple of minutes the data will be replicated to another datacenter.
  • The client can query the replication lag, so you know what your RPO is for an application, and can use that secondary copy as needed.
  • If you want to use the data for HA, we provide RA-GRS, which adds read access: you get a second DNS endpoint for that second copy.

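The extra DNS endpoint that RA-GRS exposes follows a predictable naming pattern: the account name with a `-secondary` suffix. A small Python sketch (the account name is hypothetical) of which readable blob endpoints each redundancy option gives you:

```python
def blob_endpoints(account: str, redundancy: str) -> list[str]:
    """Readable blob endpoints for a storage account. LRS and GRS
    expose only the primary endpoint; RA-GRS additionally exposes
    a read-only secondary endpoint for the geo-replicated copy."""
    primary = f"https://{account}.blob.core.windows.net"
    if redundancy == "RA-GRS":
        return [primary,
                f"https://{account}-secondary.blob.core.windows.net"]
    return [primary]

print(blob_endpoints("mystorage", "LRS"))
print(blob_endpoints("mystorage", "RA-GRS"))
```

An application built for HA can fall back to reading from the secondary endpoint when the primary region is unavailable, accepting the replication lag mentioned above.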
By going through the above storage fundamentals, you should now understand what Azure storage is and why we need it. This gives us the prerequisite knowledge to advance further in the next article.

Until next time, happy learning.

Microsoft Azure And Azure Machine Learning

Today, we are going to take a look at the business needs of ‘Modern Cloud computing’ and try to understand the reasons developers/programmers and IT professionals need cloud skills.

What is Azure? | Pay-As-You-Go

  • Microsoft Azure is a Cloud computing platform and infrastructure created by Microsoft for building, deploying and managing the Applications and Services through a global network of Microsoft-managed data centers.
  • Microsoft Azure is a growing collection of integrated Cloud Services—analytics, computing, database, mobile, networking, storage, and Web—for moving faster, achieving more and saving money.

As per Microsoft

  • “80% of Microsoft customers have already deployed or fully embraced the cloud. By 2020, the idea that a company has no cloud infrastructure may be as rare as a ‘no internet policy’ is today.”

Cloud considerations

These are some considerations with the cloud, which are given below.

  • You do not own the hardware.
  • Be prepared for failures.
  • Scale is unpredictable. Can you scale efficiently?
  • Managing Services is often harder than building the Services. Do you have the right operational telemetry for visibility?
  • No downtime upgrades.
  • Are the costs understandable and controllable? Do you know the density and capacity of your services?
  • Care about security up front.
  • Developer productivity: build the app, not the platform.

What is Windows Azure?

Azure is a complete package of the following.

Why Use the Cloud?

Windows Azure Services

Name a service, and you will find it within Azure Services.

Battle for The Cloud is shown below.

From this image, it’s evident that Microsoft Azure is capturing the market with significant growth, so keep yourself ready.

Let’s take Microsoft Azure Portal’s overview

First question is about Azure subscription and the details are given below.

Enjoy Azure subscription from the following.

After subscribing, you will come across the screenshot given below.

You can get hands-on here and it will give you real time experience with Azure portal.

    • Create an app Service here and click Create.

    • Give all prerequisite details.

Click Create.

Congratulations, your Azure app is ready — access it and do all the developer activities.

Hence, let me summarize the new learning in simple words.

  1. What is Azure? (This is going to be the future of modern computing).
  2. By 2020, the idea that a company has no Cloud infrastructure may be as rare as a ‘no internet policy’ is today. Hence, catch the bus at the right time.
  3. Why to consider Azure (Business and Technical reasons).
  4. Why to use Cloud.
  5. Market Share for Azure (every year, its growing).
  6. Azure free subscription.
  7. Windows App Services with demo.

In Microsoft Azure & Azure Machine Learning Part -2, we will learn,

  1. Azure VM.
  2. Azure database.
  3. Azure data warehouse.
  4. Azure Machine Learning.

Keep learning.

Walkover to Microsoft Cloud (Azure) Security

Welcome again!!

Today, we will discuss Microsoft Cloud Security from the perspective of a curious customer’s questions, before moving towards a detailed technical understanding.
  1. Could we consider Cloud as a Secure Platform?

    I really can’t say for certain, and neither could I promise that, but what I understood from my learning is that a ‘Cloud Environment’ has better security compared to an ‘On-Premises Data Center’. Some of the reasons for the security of the ‘Microsoft Data Centers’ are –

    1. Controlled access/ Reachability to the Azure Data Centers. So far, no Azure Security breach has been reported.
    2. Technology perspective (Adhere to the Azure Security Development Lifecycle (SDL)).
    3. Authentication is managed by Multi-factor authentication (MFA).

      For more details, you have to navigate down.

  2. Is Owning Cloud Services Cheap (Save Money)?

    I would say ‘Yes’, because Microsoft provides the best infrastructure, and as an individual customer it would probably not be possible to invest the huge amount needed for such infrastructure services.
  3. What are the major reasons which trigger you to choose Microsoft Cloud Security?

    I would say ‘Agility’ or ‘BUSINESS VALUE’. Please consider the real time issue of the ‘System Performance’ or ‘Application Performance’ of your ‘Production Server’.

    If you are an ‘On-Premises’ customer, you may look over the system hardware, server configuration, network speed, etc. Then you would zero in on what exact changes the system needs, and then plan for the changes. It may take at least a couple of weeks to months, as per the business need.

    However, it would take only a couple of hours with the Cloud to ‘Scale Up’ your Servers. It’s a good gain from an organizational perspective. Indeed, it saves a couple of weeks/months and hence we have saved MONEY. Furthermore, it adds value to the business, which is ‘Super Important’.

    So now, let’s have some understanding of how ‘Cloud Security’ works?

As businesses need to be built as securely as we can make them, clients may have some concerns over data security, specifically bank/financial clients. They may think twice about whether ‘credit card’ or commercial data is safe in the cloud. I feel that we, as consultants, should have this knowledge before making any suggestions or commitments.

So, as a customer, you could toss different questions.

  • Is our data in the cloud as secure as on-premises data, or more or less secure?
  • How easily could someone hack the cloud data?
  • What percentage of data would be vulnerable in the cloud?
  • For hackers, couldn’t the cloud be a ‘golden opportunity’ for data theft?

What do you think: does Microsoft really not know about the risk, or did they not plan for this at all?

Certainly, one thing I can say is that the capability, resources, and infrastructure of any cloud provider are much greater than those of an on-premises data warehouse, and security is ensured by many statistical and analytical tools.

Security is ensured by various other means. For example, cloud Active Directory (AD) keeps a check on login locations. If a customer logs in from North America in the morning, say at 10 AM, he/she cannot log in from Africa at 10:15 AM (for example); access would be restricted until further authentication.
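A much-simplified sketch of that “impossible travel” idea (the threshold, coordinates, and logic here are illustrative only, not how Azure AD actually implements it): flag a pair of logins whose implied travel speed is physically implausible.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    R = 6371.0  # mean Earth radius in km
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def impossible_travel(loc1, loc2, minutes_between, max_kmh=1000.0):
    """Flag two logins whose implied travel speed exceeds max_kmh."""
    dist = haversine_km(*loc1, *loc2)
    hours = minutes_between / 60.0
    return dist / hours > max_kmh

# A login near New York followed 15 minutes later by one near Lagos
# implies a speed far beyond any airliner, so it gets flagged.
print(impossible_travel((40.7, -74.0), (6.5, 3.4), 15))  # True
```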

So far, I have shared my way of thinking or my knowledge. Let’s see what security mechanism Microsoft Cloud follows.
  • Microsoft Azure is the cloud platform with many integrated tools, templates, and  services.
  • Azure leverages us to use our existing learning/expertise of the database, database warehouse, storage, web applications, networking, and computing services to build and manage applications aligned with the cloud.
  • Azure Security Development Lifecycle (SDL) ensures that everything from the initial phase to launch/deployment phase is secured.
  • Operational Security Assurance (OSA) provides us a platform to ensure secure operations throughout the lifecycle of the cloud based platform.
  • Azure Security Center (for more details refer to Microsoft Azure website) offers continuous monitoring by
    1. Secure Identity
    2. Secure Infrastructure
    3. Secure Applications and Data

Secure Identity

Azure Active Directory (AAD) ensures access is granted only to authorized users. Azure enables us to manage user credentials to protect sensitive information. Furthermore, AAD handles authentication, authorization, access control, etc.

Secure Infrastructure

Precisely, this is the biggest part of Microsoft Cloud security, and many actors play vital roles in achieving infrastructure security. Among them are Azure Virtual Networks, which provide a safe way to extend an on-premises network to the cloud via VPN or WAN (Azure ExpressRoute).

Unauthorized and unintentional exchange of information between deployments in a multi-tenant architecture is averted by the tactics mentioned below.

  • Using Virtual local area network (VLAN) isolation.
  • Access control lists (ACLs), Load balancers.
  • Network address translation (NAT) separates internal network traffic from external traffic.
  • Regulated Traffic Flow procedures.

Microsoft Antimalware for Azure protects Azure Cloud Services and virtual machines through web application firewalls, network firewalls, antimalware, intrusion detection and prevention systems (IDS/IPS), and more.

Secure apps and data

Azure adheres to industry-best protocols for data encryption in transit: as data travels between devices and Microsoft datacenters, within datacenters, and when the data is at rest in Azure Storage. Security is ensured by encryption of data, files, applications, services, communications, and drives.

Other data security features in Azure

We can also encrypt our data before pushing it into Azure and, in addition, manage the keys securely from on-premises data centers.


Hopefully, you have understood the basics of Microsoft Cloud (Azure) security. These are only the basics; you can gain extensive knowledge by reading the Microsoft Azure website and getting the latest information about Azure/cloud security. I would love to keep on sharing Microsoft technology material with you. Next time, I will discuss ‘Advanced Security with Microsoft Azure’.

Until next time, Happy Coding and Keep Improving!!

Learn Azure Cloud Storage & Data Classification and Prediction using Azure Machine Learning

Sat, May 27, 2017 10:30 AM – 12:00 PM CDT

Please join my meeting from your computer, tablet or smartphone.

You can also dial in using your phone.
United States: +1 (872) 240-3412

Access Code: 260-753-421

First GoToMeeting? Try a test session:

Azure IaaS as a *Starting Point* on your Cloud Journey

In this article, we will discuss how to leverage business with Azure IaaS (Infrastructure as a Service). We will take a comparative look at the Traditional Model vs. the Cloud Model.

If you have questions about the Cloud, please look over the following:

I am sure that by the end of this article, we will have a sound rationale for using cloud services.

Azure As Starting Point

As shown above, in the traditional on-premises service model we have to take care of the application, data, runtime, server, storage, etc.

But we can use IaaS, PaaS, or SaaS based on individual and business usage.

Pay per use


For Platform as a Service (PaaS), we can benefit from one or all of the Security & Management, Platform Services or Infrastructure Services.

Worldwide Azure Regions Availability


As of today, there are a total of 34 Azure regions giving 24/7 support to customers on almost every continent.

Geo Redundancy


Highly promising geo-replication options are available across the globe for disaster recovery.


It would not be wrong to say that the data is highly secure in the Cloud, owing to the Security mechanisms at the levels of:

  1. Physical
  2. Infra
  3. Network and
  4. VM

It is shown below.


Azure Security Center

For details, please refer to this related article:

Azure Security – Solution To Digital Transformation

Azure Security Center is available to keep a constant watch over your cloud resources.


Now, let’s dive deep down into IaaS Core Services.

It’s a combination of the following

  1. Compute
  2. Storage
  3. Networking and
  4. Management

For more details on Azure Storage, please refer to my Azure Storage series, parts 1 to 7.

Introduction To Azure Data Lake


In this blog, we will walk through the Azure Data Lake Store feature. Azure Data Lake Store has been generally available since November 2016 and is among the fastest-emerging Azure services.


In my past Azure articles, we learned how to create virtual machines, a Data Warehouse, and an Azure App Service as a platform-as-a-service (PaaS) subscription from Microsoft Azure. In case you did not get a chance to walk through them, please read the articles mentioned below first.

Read More

Since its public preview availability, Azure Data Lake Store has grown a lot because of its incredible offerings.

What is Azure Data Lake?

The Azure Data Lake offering is not limited or restricted by data size, type, platform, or features. It was introduced with supporting features for batch, streaming, and interactive analytics. Now data scientists, data analysts, and data developers can store data of any size and shape, at considerably faster speeds. Azure Data Lake (ADL) not only makes it possible to store Big Data but also offers the services mentioned below.

  • Azure Data Lake Store
  • Azure Data Lake Analytics
  • Azure HDInsight

Azure Data Lake Store (DLS)

DLS is a no-limits data lake built to power intelligent action. With DLS, we can store trillions of files, so DLS is well suited to security server data, large audio/video collections (e.g., YouTube-scale media), and a whole country’s insurance data.

Data Lake Store scales to any size of data and can provide massive throughput for running analytic jobs, with thousands of concurrent users reading and writing hundreds of terabytes of data efficiently.

Data Lake Store protects the data, as the data is always encrypted, adhering to security and regulatory compliance. For more details on Azure security, read more at:

Now, please log in to your Azure account at:

Click More Services -> Data Storage, and then Data Lake Store.


Furthermore, Microsoft Azure Data Lake Store supports any application that uses the open Apache Hadoop Distributed File System (HDFS) interface. By supporting HDFS, it lets us easily migrate existing Hadoop and Spark data to the cloud.
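Because the store speaks an HDFS-compatible (WebHDFS-style) REST dialect, a path in the account can be addressed with a plain HTTP URL. A sketch of the URL shape for a hypothetical account (the exact endpoint form is stated here as an assumption for illustration; `LISTSTATUS` and `OPEN` are standard WebHDFS operations):

```python
def webhdfs_url(account: str, path: str, op: str = "LISTSTATUS") -> str:
    """Build a WebHDFS-style REST URL for a path in an Azure Data
    Lake Store (Gen1) account. Illustrative only: the endpoint
    shape is an assumption based on the HDFS compatibility
    described above."""
    path = path.lstrip("/")
    return (f"https://{account}.azuredatalakestore.net"
            f"/webhdfs/v1/{path}?op={op}")

print(webhdfs_url("mydatalake", "/clickstream/2017/05"))
```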

Azure Data Lake Analytics (DLA)

DLA is an on-demand analytics job service built to power intelligent action. With DLA, the ease of scaling increases tremendously: capacity can be scaled in minutes, and we pay only for what we use.

DLA has massive support for parallel processing and can easily process petabytes of data across different categories. In addition, we can count on enhanced SSO (Single Sign-On) and multi-factor authentication.

Azure HDInsight

HDInsight offers a Hadoop service for the enterprise. It has reliable open-source analytics, an architecture with full redundancy, and data geo-replication. HDInsight also supports SSO, among many other features. Azure machine types enable efficient utilization of resources, and we pay only for the computing and storage we use.

In addition, HDInsight has good support for integration with independent software vendors (ISVs).
A recent study showed HDInsight delivered 63% lower TCO than deploying Hadoop on-premises, along with the industry’s best 99.9% SLA and 24/7 support. (The figures above are borrowed from the Microsoft Azure site.)


Nowadays, it’s imperative to take the cloud very seriously. Azure Data Lake is among the fastest-evolving services and will be a contributing factor in any cloud-based enterprise solution.