Category Archives: April 2018

Azure Migration: The Path to a Successful Migration

Deepak Kaushik (Microsoft Azure MVP)

When migrating to Azure, you should consider a few factors that define a successful migration.

  1. Lower your TCO (Total Cost of Ownership), potentially by more than 60%
  2. Reduce time to market (scope and release an MVP (Minimum Viable Product))
  3. Support a hybrid environment
  4. Secure your hybrid environment (SSO (single sign-on), Azure IAM solutions)
  5. Reuse or extend your on-premises licensing in Azure
  6. Use the new features in each Azure service and optimize your data within the cloud

Let’s look at some of the ways to migrate your data into Azure.

  1. Storage Services

Azure offers numerous ways to store your on-premises data, but the main question is: what kind of data is it? Is it archive data, transactional data, or BI data? What format is it in: files or databases? How does the data move around (transactional, incremental, …)? As you can see, each set of data has a different nature that needs to be treated differently, and for that Microsoft offers a variety of Azure services, including:

  • Azure Blob Storage, cool/archive tier (archiving data)
  • Azure Blob Storage, hot tier (streaming data)
  • Azure NetApp Files (file storage)
  • Azure SQL Database (transactional data)
  • Azure Cosmos DB (geo-distributed data)
  • Azure Database for PostgreSQL
  • Azure Database for MySQL
  • … and many, many more.
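As a rough illustration of matching a data profile to a storage service, here is a small sketch; the categories and the mapping are simplified assumptions for illustration, not official Microsoft guidance.

```python
# Simplified sketch: map a data profile to a candidate Azure storage service.
# The categories and mapping are illustrative assumptions, not official guidance.
def suggest_service(kind: str) -> str:
    if kind == "archive":
        return "Azure Blob Storage (cool/archive tier)"
    if kind == "streaming":
        return "Azure Blob Storage (hot tier)"
    if kind == "file":
        return "Azure NetApp Files"
    if kind == "transactional":
        return "Azure SQL Database"
    if kind == "geo-distributed":
        return "Azure Cosmos DB"
    return "Review the full Azure storage catalog"

print(suggest_service("transactional"))  # Azure SQL Database
```

In practice the decision also weighs access patterns, latency, and cost, but a first-pass classification like this helps narrow the catalog.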
  2. Transfer Services

One of the best-known services for migrating your on-premises data to the cloud is Azure Database Migration Service. It can migrate most well-known database software/applications to the Azure cloud, and it can migrate your database using either an offline or an online (minimal downtime) approach.

There are also numerous ways to migrate to Azure using different services, such as:

  • Azure Migrate: Server Migration
  • Data Migration Assistant
  • Azure Database Migration Service
  • Web app migration assistant
  • Azure Data Box
  • Azure Data Factory
  • Azure CLI (Command Line).
  • Azure PowerShell.
  • AZCopy utility
  • Third-party migration tools certified by Microsoft.
  • On-premises tools, for example SSIS.
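The incremental idea behind tools like AzCopy and Azure Data Factory can be sketched locally: copy only the files that are missing or changed at the destination. This is a simplified local sketch of that behavior, not the actual AzCopy implementation.

```python
# Minimal sketch of incremental copy, the idea behind "sync" modes in
# tools like AzCopy: skip files that already match at the destination.
import shutil
from pathlib import Path

def incremental_copy(src: Path, dst: Path) -> list:
    """Copy files from src to dst only when missing or changed."""
    copied = []
    dst.mkdir(parents=True, exist_ok=True)
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        rel = f.relative_to(src)
        target = dst / rel
        if (not target.exists()
                or target.stat().st_size != f.stat().st_size
                or target.stat().st_mtime < f.stat().st_mtime):
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(f, target)  # copy2 preserves the modification time
            copied.append(str(rel))
    return copied

# Usage: the second run copies nothing because nothing changed.
import tempfile
with tempfile.TemporaryDirectory() as a, tempfile.TemporaryDirectory() as b:
    src, dst = Path(a), Path(b)
    (src / "docs").mkdir()
    (src / "docs" / "x.txt").write_text("hello")
    first = incremental_copy(src, dst)
    second = incremental_copy(src, dst)
    print(len(first), len(second))
```

The real tools add parallelism, checksums, and retry logic, but the skip-unchanged comparison is the core of an online/minimal-downtime transfer.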
  3. Analytics and Big Data

In the data world, analytics means having the analytics team work with the entire data estate, which leads them to deal with big data and to run, process, and profile massive amounts of data. For that, Microsoft provides a variety of tools depending on the analytics team's needs and on the type and volume of data. Some of the best-known analytics tools within Azure are listed below, and some of them have embedded ETL tools.

  1. Azure Synapse Analytics (formerly known as Azure SQL Data Warehouse)
  2. Azure Data Explorer (also known as ADX)
  3. Azure HDInsight
  4. Power BI
  5. … and many, many more.
  4. Azure Migrate Documentation

You might look at migration from a different angle; it may mean migrating VMs, SQL configurations, and other on-premises services. Take a look at the Azure Migrate appliance under the Azure Migrate documentation.

Choose an Azure solution for data transfer.

Check out some of Microsoft’s data transfer solutions; the linked article walks through a few scenarios that can help you understand the existing data migration approaches.


Migrating to Azure is simple but requires planning, consistency, and basic Azure knowledge. You may have a very successful migration, but you need to make sure that the new features in Azure services are being used as needed. And finally, Microsoft always has a solution for you.

Experience Global Power Platform Bootcamp 2021 on February 19 – 20, 2021

So happy to contribute as a volunteer at ‘Global Power Platform Bootcamp 2021’ with my friends!!


Thanks everyone..

Glad to be C# Corner Speakers Of Year 2020

I am so glad to be featured as one of the top 10 speakers of ‘Year 2020’ on C# Corner.

C# Corner organized 197 events under 46 chapters from around the world. These events included in-person events, webinars, and live shows.

Thank you, chapter leaders, speakers, and members, for the success of these events.

Webinar : KQL: Let’s be Kusto Query Language Ninja

Kusto is a service for storing and running interactive analytics over big data, built on Microsoft Azure infrastructure.

Thanks to everyone who attended the webinar on Dec 12th, 2020.

Date and time: Saturday, December 12, 2020, 9:00 AM – 10:00 AM CST

Here is the recording:

About this Event


Getting Started with Kusto

Kusto Explorer

Entity Types

Query Statements
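As a taste of the query statements covered in the session, a typical KQL pipeline filters, aggregates, and ranks in a few piped steps. The example below runs against the StormEvents sample table available in the public Kusto “help” cluster; adjust the table and columns for your own database.

```kusto
// Count storm events per state and keep the five most affected states.
StormEvents
| where StartTime >= datetime(2007-01-01)
| summarize EventCount = count() by State
| top 5 by EventCount desc
```

Each `|` passes the previous result to the next operator, which is what makes KQL read like a left-to-right data pipeline.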



Deepak Kaushik and Shahriar Nikkhah


Deepak is a Microsoft Azure MVP and C# Corner MVP. He is currently working on architecting and building solutions around Microsoft Azure. He is passionate about technology and comes from a development background. He has also led various projects in Infrastructure as a Service (IaaS) and Platform as a Service (PaaS). Deepak recently worked with a public sector client to migrate an on-prem DWH to Azure SQL Data Warehouse and to migrate reports from SSRS to Power BI. His breadth and depth of knowledge have enabled him to lead the development of various products/solutions around Microsoft Azure. Deepak is a knowledgeable and sought-after speaker within IT circles and is consulted regularly by companies formulating their cloud strategies. Deepak founded the Regina/Saskatchewan ‘C# Corner Chapter/User Group’ in 2016, where he organizes regular meetups and webinars!! You can find his sessions/recordings at Channel9, C# Corner, his blog, and the Deepak Kaushik - Microsoft MVP YouTube channel.


Senior Technical Team Lead / Senior BI Consultant: Big Data, IoT Hub, Azure Stream Analytics, Power BI, SSIS, SQL Tabular DM. Certifications: MCSA: BI Reporting; MCSE: Data Management and Analytics (SQL 2016); MCSA: SQL 2016 Database Development; MCP, Microsoft Certified Professional (SQL 2012); MCITP: Database Developer 2008; MCTS: SQL 2008 Database Development; MCTS: SQL 2008 Implementation and Maintenance; MCTS: SQL 2005.

Career objective: To obtain a senior-level Business Intelligence Designer/Developer/Lead position (SQL Server, big data, Stream Analytics, IoT Hub, SSIS, Power BI, SSAS, SSRS, Datazen, Tableau, Power Pivot, Power View, etc.).

Experience summary: Focused on big data, data warehousing, BI DW design, and business intelligence. Remote team lead and work experience in Canada, the US, and the EU for BI and ETL design teams. Senior-level experience in transactional DW and BI DW design projects. Coaching, mentoring, leading, and providing BI presentations and BI solutions for end users, IT teams, and management teams.

Technical skills: Databases: SQL Server 2017 down to 7.0, Oracle 9.x, Teradata, Azure Tabular Model, SSAS 2008/2012/2014, SQL Tabular (BISM), Access. BI tools: SSAS, SSIS, Power Query, PowerPivot, Power View, Power Map, Power BI, analytic tools, SSRS, SharePoint, Excel pivot tables, Datazen Publisher, Tableau. Languages: T-SQL, M Language, DAX, MDX, PowerShell, VB.Net, C#. Other: VS; cloud-based technologies: SQL Azure, Power BI Desktop/Design, Power BI Portal/Desktop, O365, Azure Data Factory (ADF), Datazen Publisher, Azure Resource Group, Azure Data Gateway, Data Management Gateway, OAuth, Azure PowerShell.

Certifications, awards, publications, and contributions: MCSA: BI Reporting (certified 2018); MCSE: Data Management and Analytics (certified 2018); MCSA: SQL 2016 Database Development (certified 2018); Power BI forum contributor (“SNik”); MCP (Microsoft Certified Professional) SQL 2012; co-author of “SQL MVP Deep Dive 2” (2011, supporting the children of Operation Smile); Microsoft SQL MVP 2010 award (Microsoft Most Valuable Professional); MCITP (MS-Certified IT Professional); MCTS (MS-Certified Technology Specialist); 4-star rating at the “SQL Server Integration Services (SSIS)” forum online.

C# Corner MVP 2020-2021

I’m so thankful, honored, and excited to receive the C# Corner Most Valuable Professional (MVP) award for 2020-2021, my 4th time!!

‪I am sincerely humbled to be part of such an incredible CSharpCorner community!

This year I spoke at 11 conferences and user groups attended by approximately 4000+ people, and wrote many blogs and articles on my website and on different technical portals!

In addition, I spoke on 10+ webcasts, podcasts, and live interviews, reaching 3000+ attendees, made a lot of #Azure videos, and mentored a couple of students/professionals on #Azure, offering suggestions, advice, resources, and #saskatchewan job market insights!

Again, Many Thanks CSharpCorner Team!

Microsoft MVP Renewal 2020-2021

Wohoo 🙌 !!

I’m so thankful, honored, and excited to receive the Microsoft Most Valuable Professional (MVP) award for 2020-2021 in the #Azure category, my 3rd time!!

‪I am sincerely humbled to be part of such an incredible community, celebrating #CanadaDay while being recognized for my contributions!!

This year I spoke at 11 🌎 conferences and user groups attended by approximately 4000+ people, and wrote many blogs and articles on my website and on different technical portals!

In addition, I spoke on 10+ webcasts, podcasts, and live interviews, reaching 3000+ attendees, made a lot of #Azure videos, and mentored a couple of students/professionals on #Azure, offering suggestions, advice, resources, and #saskatchewan job market insights!!

Again Many Thanks 🎉👏


This is my 3rd MVP Award and I am very grateful and appreciative for this honor and for the various opportunities provided to me over time.

Thank you very much to each and every one of you for making me successful in my efforts as an MVP, IT professional, and community contributor, and for providing me with valuable resources and networking opportunities. Thank you!

Azure -Disaster Recovery for SAP HANA Systems

Microsoft Azure provides a trusted path to enterprise-ready innovation with SAP solutions in the cloud. Mission critical applications such as SAP run reliably on Azure, which is an enterprise proven platform offering hyperscale, agility, and cost savings for running a customer’s SAP landscape.

System availability and disaster recovery are crucial for customers who run mission-critical SAP applications on Azure.

RTO and RPO are two key metrics that organizations consider when developing an appropriate disaster recovery plan that can maintain business continuity in the face of an unexpected event.

Recovery point objective (RPO) refers to the amount of data at risk, measured in time, whereas recovery time objective (RTO) refers to the maximum tolerable time that a system can be down after a disaster occurs.
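These two definitions can be made concrete with a little arithmetic. The numbers below are hypothetical examples, not recommendations: worst-case data loss is bounded by the backup interval, and the recovery procedure's duration is checked against the RTO.

```python
# Worked example of RPO/RTO checks; all figures are hypothetical.
from datetime import timedelta

def meets_objectives(backup_interval, recovery_duration, rpo, rto):
    """RPO bounds the data you can lose (worst case: everything since
    the last backup); RTO bounds how long recovery may take."""
    worst_case_data_loss = backup_interval  # data written since last backup
    return worst_case_data_loss <= rpo and recovery_duration <= rto

# Target: lose at most 6 hours of data, be back up within 4 hours.
ok = meets_objectives(
    backup_interval=timedelta(hours=4),
    recovery_duration=timedelta(hours=2),
    rpo=timedelta(hours=6),
    rto=timedelta(hours=4),
)
print(ok)  # True: 4h of at-risk data <= 6h RPO, 2h recovery <= 4h RTO
```

Tightening the RPO in this model means backing up (or replicating) more frequently; tightening the RTO means shortening the recovery procedure itself.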

The diagram below shows RPO and RTO on a timeline in a business-as-usual (BAU) scenario.

Design principles for disaster recovery systems

  • Selection of the DR region based on SAP-certified VMs for SAP HANA: it is important to verify the availability of SAP-certified VM types in the DR region.
  • RPO and RTO values: businesses need to lay out clear expectations for RPO and RTO values, which greatly affect the disaster recovery architecture and the tools and automation required to implement it.
  • Cost of implementing DR, maintenance, and DR drills.
  • Criticality of systems: it is possible to establish a trade-off between the cost of DR implementation and business requirements. While the most critical systems can use state-of-the-art DR architecture, medium and less critical systems may afford higher RPO/RTO values.
  • On-demand resizing of DR instances: it is preferable to use small VMs for DR instances and upsize them during an active DR scenario. It is also possible to reserve the required VM capacity in the DR region so that there is no waiting time to upscale the VMs.
  • Additional considerations include cloud infrastructure costs and the effort of setting up an environment for non-disruptive DR tests. A non-disruptive DR test executes the test without failing over the actual productive systems to the DR systems, thereby avoiding business downtime. This involves additional costs for setting up temporary infrastructure in a completely isolated vNet during the DR tests.
  • Certain components in the SAP system architecture, such as a clustered network file system (NFS), are not recommended for replication with Azure Site Recovery; hence there is a need for additional licensed tools, such as SUSE geo-clustering or SIOS DataKeeper, for NFS-layer DR.
  • Azure offers Azure Site Recovery (ASR), which replicates virtual machines across regions. This technology is used for the non-database components or layers of the system, while database-specific methods such as SAP HANA System Replication (HSR) are used at the database layer to ensure database consistency.

Disaster recovery architecture for SAP systems running on SAP HANA Database

At a very high level, the diagram below depicts the architecture of SAP systems based on SAP HANA, and which systems will remain available in case of local or regional failures.

The diagram below gives next level details of SAP HANA systems components and corresponding technology used for achieving disaster recovery.

Steps for invoking DR or a DR drill

Microsoft Azure Site Recovery (ASR) helps in faster replication of data at the DR region.


  • DNS Changes for VMs to use new IP addresses
  • Bring up iSCSI – single VM from ASR Replicated data
  • Recover Databases and Resize the VMs to required capacity
  • Manually provision NFS – Single VM using snapshot backups
  • Build Application layer VMs from ASR Replicated data
  • Perform cluster changes
  • Bring up applications
  • Validate Applications
  • Release systems
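The drill steps above can be sketched as an ordered runbook that halts on the first failure. The step names mirror the list, but the callables below are hypothetical placeholders, not real Azure Site Recovery APIs; a real drill would invoke ASR, DNS, and database tooling at each step and verify the result.

```python
# Minimal runbook sketch for sequencing the DR-drill steps above.
# The actions are hypothetical placeholders, not real ASR APIs.
def run_drill(steps):
    """Run (name, action) steps in order; stop at the first failure so a
    broken step (e.g. DNS not propagated) is not masked by later ones."""
    completed = []
    for name, action in steps:
        if not action():
            raise RuntimeError("DR drill halted at step: " + name)
        completed.append(name)
    return completed

# Placeholders always succeed here; each would normally verify its step.
steps = [
    ("Update DNS to new IP addresses", lambda: True),
    ("Bring up iSCSI VM from ASR-replicated data", lambda: True),
    ("Recover databases and resize VMs", lambda: True),
    ("Provision NFS VM from snapshot backups", lambda: True),
    ("Build application-layer VMs from ASR-replicated data", lambda: True),
    ("Perform cluster changes", lambda: True),
    ("Bring up applications", lambda: True),
    ("Validate applications", lambda: True),
    ("Release systems", lambda: True),
]
print(len(run_drill(steps)))  # 9 steps completed
```

Encoding the drill this way makes the ordering explicit and gives each run a clear pass/fail record, which is useful evidence after a DR test.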

A screenshot of an example DR drill plan.


Azure keeps your applications up and running and your data available. Azure was among the first cloud platforms to provide built-in backup and disaster recovery solutions.

Resiliency is not about avoiding failures but responding to failures. The objective is to respond to failure in a way that avoids downtime and data loss. Business continuity and data protection are critical issues for today’s organizations, and business continuity is built on the foundation of resilient systems, applications, and data.

Reliability and resiliency are closely related. Reliability is defined as dependability and performing consistently well. Resiliency is defined as the capacity to recover quickly. Together, these two qualities are key to a trustworthy cloud service. Despite best efforts, disasters happen; they are inevitable but mostly unpredictable, and vary in type and magnitude. There is almost never a single root cause of a major issue. Instead, there are several contributing factors, which is the reason an issue is able to circumvent various layers of mitigation/defenses.