
Experience Global Power Platform Bootcamp 2021 on February 19 – 20, 2021

So happy to have contributed as a volunteer to ‘Global Power Platform Bootcamp 2021’ with my friends!!

#GlobalPowerPlatformBootcamp #GPPB2021 #PowerPlatform

Thanks, everyone!

Glad to Be Among the C# Corner Speakers of the Year 2020

I am so glad to be featured as one of the top 10 C# Corner speakers of the year 2020.

https://www.c-sharpcorner.com/article/top-10-c-sharp-corner-speakers-of-year-2020/

C# Corner organized 197 events under 46 chapters from around the world. These events included in-person events, webinars, and live shows.

 Thank you, chapter leaders, speakers, and members, for the success of these events. 

Webinar: KQL – Let’s Be a Kusto Query Language Ninja

Kusto (the engine behind Azure Data Explorer) is a service for storing and running interactive analytics over big data, built on Microsoft Azure infrastructure.

Thanks, everyone, for attending the webinar on December 12th, 2020.


Date and time: Saturday, December 12, 2020, 9:00 AM – 10:00 AM CST

Here is the recording:

About this Event

AGENDA

Getting Started with Kusto

Kusto Explorer

Entity Types

Query Statements
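
To give a flavor of what the session covers, here is a minimal KQL sketch. It runs against the StormEvents sample table from the public help cluster that most Kusto tutorials use; substitute your own table and column names as needed.

  // Count storm events per state and keep the five busiest states.
  StormEvents
  | summarize EventCount = count() by State
  | top 5 by EventCount
  | render barchart

Pipes pass the output of one tabular operator to the next; that pattern is the core of every query statement on the agenda.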

Join here:

https://teams.microsoft.com/l/meetup-join/19%3a1df1264620c04b6084f000557

SPEAKERS

Deepak Kaushik and Shahriar Nikkhah

DEEPAK KAUSHIK @THINKFORDEEPAK

Deepak is a Microsoft Azure MVP and C# Corner MVP. He is currently working on architecting and building solutions around Microsoft Azure. He is passionate about technology and comes from a development background. He has also led various projects in the Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) space. Deepak recently worked with a public sector client to migrate an on-premises data warehouse to Azure SQL Data Warehouse and to migrate reports from SSRS to Power BI. His breadth and depth of knowledge have enabled him to lead the development of various products and solutions around Microsoft Azure. Deepak is a knowledgeable and sought-after speaker within IT circles and is consulted regularly by companies formulating their cloud strategies. Deepak founded the Regina/Saskatchewan C# Corner chapter/user group in 2016, where he organizes regular meetups and webinars!! You can find his sessions and recordings on Channel 9 (https://channel9.msdn.com), C# Corner, his blog, and the Deepak Kaushik – Microsoft MVP YouTube channel.

SHAHRIAR NIKKHAH C# CORNER MVP

Senior Technical Team Lead / Senior BI Consultant: Big Data, IoT Hub, Azure Stream Analytics, Power BI, SSIS, SQL Tabular DM. MCSA: BI Reporting; MCSE: Data Management and Analytics (SQL 2016); MCSA: SQL 2016 Database Development; MCP (SQL 2012); MCPS; MCITP Database Developer 2008; MCTS SQL 2008 Database Development; MCTS SQL 2008 Implementation and Maintenance; MCTS SQL 2005.

Career objective: To obtain a senior-level Business Intelligence Designer/Developer/Lead position (SQL Server, Big Data, Stream Analytics, IoT Hub, SSIS, Power BI, SSAS, SSRS, Datazen, Tableau, Power Pivot, Power View, etc.).

Experience summary: Focused on Big Data, data warehousing, and BI DW design. Remote team-lead and work experience in Canada, the US, and the EU for BI and ETL design teams. Senior-level experience in transactional DW and BI DW design projects. Coaching, mentoring, leading, and providing BI presentations and BI solutions for end users, IT teams, and management teams.

Technical skills: Databases: SQL Server 2017 down to 7.0, Oracle 9.x, Teradata, Azure Tabular Model, SSAS 2008/2012/2014, SQL Tabular (BISM), Access. BI tools: SSAS, SSIS, Power Query, PowerPivot, Power View, Power Map, Power BI, analytic tools, SSRS, SharePoint, Excel pivot tables, Datazen Publisher, Tableau. Languages: T-SQL, M, DAX, MDX, PowerShell, VB.Net, C#. Other: Visual Studio; cloud-based technologies: SQL Azure, Power BI Desktop/Design, Power BI Portal/Desktop, O365, Azure Data Factory (ADF), Datazen Publisher, Azure Resource Groups, Azure Data Gateway, Data Management Gateway, OAuth, Azure PowerShell.

Certifications, awards, publications, contributions: MCSA: BI Reporting (certified 2018); MCSE: Data Management and Analytics (certified 2018); MCSA: SQL 2016 Database Development (certified 2018); Power BI forum contributor at http://www.community.powerbi.com (id “SNik”); MCP (Microsoft Certified Professional, SQL 2012, see attached Microsoft CP transcript); co-author of “SQL MVP Deep Dive 2” (2011, supporting the children of Operation Smile); Microsoft SQL MVP 2010 award (Microsoft Most Valuable Professional); MCITP (MS-Certified IT Professional); MCTS (MS-Certified Technology Specialist); 4-star rating on the “SQL Server Integration Services (SSIS)” forum at msdn.microsoft.com.

C# Corner MVP 2020-2021

I’m so thankful, honored, and excited to have my C# Corner Most Valuable Professional (MVP) Award renewed for 2020-2021, for the 4th time!!

I am sincerely humbled to be part of such an incredible C# Corner community!

This year I spoke at 11 conferences and user groups attended by approximately 4,000+ people, and wrote many blogs and articles on my website and on various technical portals!

In addition, I spoke on 10+ webcasts, podcasts, and live interviews, reaching 3,000+ attendees; made a lot of #Azure videos; and mentored a couple of students and professionals with #Azure suggestions, advice, resources, and #Saskatchewan job-market insights!

Again, many thanks to the C# Corner team!

Microsoft MVP Renewal 2020-2021

Wohoo 🙌 !!

I’m so thankful, honored, and excited to have my Microsoft Most Valuable Professional (MVP) Award renewed for 2020-2021 in the #Azure category, for the 3rd time!!

I am sincerely humbled to be part of such an incredible community, celebrating #CanadaDay while being recognized for my contributions!!

This year I spoke at 11 🌎 conferences and user groups attended by approximately 4,000+ people, and wrote many blogs and articles on my website and on various technical portals!

In addition, I spoke on 10+ webcasts, podcasts, and live interviews, reaching 3,000+ attendees; made a lot of #Azure videos; and mentored a couple of students and professionals with #Azure suggestions, advice, resources, and #Saskatchewan job-market insights!!

Again Many Thanks 🎉👏

#MVPBuzz #MVPAward #CdnMVP #MVP #MicrosoftMVP

This is my 3rd MVP Award, and I am very grateful for this honor and for the various opportunities provided to me over time.

Thank you very much to each and every one of you for making me successful in my efforts as an MVP, IT professional, and community contributor, and for providing me with valuable resources and networking opportunities. Thank you!

Azure – Disaster Recovery for SAP HANA Systems

Microsoft Azure provides a trusted path to enterprise-ready innovation with SAP solutions in the cloud. Mission-critical applications such as SAP run reliably on Azure, an enterprise-proven platform offering hyperscale, agility, and cost savings for running a customer’s SAP landscape.

System availability and disaster recovery are crucial for customers who run mission-critical SAP applications on Azure.

Recovery time objective (RTO) and recovery point objective (RPO) are two key metrics that organizations consider when developing a disaster recovery plan that can maintain business continuity after an unexpected event.

Recovery point objective refers to the amount of data at risk, expressed in time: the maximum tolerable window of data loss. Recovery time objective refers to the maximum tolerable time the system can be down after a disaster occurs. For example, an RPO of 30 minutes means at most 30 minutes of committed data may be lost, while an RTO of 4 hours means the system must be back online within 4 hours of the failure.

The diagram below shows RPO and RTO on a timeline in a business-as-usual (BAU) scenario.

Design principles for disaster recovery systems

  • Selection of DR region based on SAP-certified VMs for SAP HANA – it is important to verify the availability of SAP-certified VM types in the DR region.
  • RPO and RTO values – the business needs to lay out clear expectations for RPO and RTO, since these values greatly affect the disaster recovery architecture and the tools and automation required to implement it.
  • Cost of implementing DR, maintenance, and DR drills.
  • Criticality of systems – a trade-off can be made between the cost of DR implementation and business requirements. While the most critical systems can use state-of-the-art DR architecture, medium- and low-criticality systems may tolerate higher RPO/RTO values.
  • On-demand resizing of DR instances – it is preferable to use small VM sizes for DR instances and upsize them during an active DR scenario. It is also possible to reserve the required VM capacity in the DR region so that there is no waiting time to upscale the VMs.
  • Additional considerations include cloud infrastructure costs and the effort of setting up an environment for non-disruptive DR tests. A non-disruptive DR test executes the DR test without failing the actual production systems over to the DR systems, thereby avoiding business downtime. It involves additional costs for temporary infrastructure in a completely isolated vNet during the tests.
  • Certain components of the SAP system architecture, such as a clustered network file system (NFS), are not recommended for replication with Azure Site Recovery; additional licensed tools such as SUSE geo-clustering or SIOS DataKeeper are therefore needed for DR at the NFS layer.
  • Azure offers Azure Site Recovery (ASR), which replicates virtual machines across regions. ASR is used for the non-database components or layers of the system, while database-specific methods such as SAP HANA System Replication (HSR) are used at the database layer to ensure database consistency.

Disaster recovery architecture for SAP systems running on SAP HANA Database

At a very high level, the diagram below depicts the architecture of SAP systems based on SAP HANA and shows which systems remain available in case of local or regional failures.

The diagram below gives the next level of detail on SAP HANA system components and the corresponding technology used to achieve disaster recovery.

Steps for invoking DR or a DR drill

Microsoft Azure Site Recovery (ASR) helps replicate data quickly to the DR region.

Steps for Invoking DR or a DR drill:

  • DNS Changes for VMs to use new IP addresses
  • Bring up iSCSI – single VM from ASR Replicated data
  • Recover Databases and Resize the VMs to required capacity
  • Manually provision NFS – Single VM using snapshot backups
  • Build Application layer VMs from ASR Replicated data
  • Perform cluster changes
  • Bring up applications
  • Validate Applications
  • Release systems

A screenshot of an example DR drill plan.

Resiliency/Reliability:

Azure keeps your applications up and running and your data available. Azure is the first cloud platform to provide a built-in backup and disaster recovery solution.

Resiliency is not about avoiding failures but responding to failures. The objective is to respond to failure in a way that avoids downtime and data loss. Business continuity and data protection are critical issues for today’s organizations, and business continuity is built on the foundation of resilient systems, applications, and data.

Reliability and resiliency are closely related. Reliability is defined as dependability and performing consistently well. Resiliency is defined as the capacity to recover quickly. Together, these two qualities are key to a trustworthy cloud service. Despite best efforts, disasters happen; they are inevitable but mostly unpredictable, and vary in type and magnitude. There is almost never a single root cause of a major issue. Instead, there are several contributing factors, which is the reason an issue is able to circumvent various layers of mitigation/defenses.

Building an Azure Monitoring, Logging, and Alerting Foundation for SAP Applications

Introduction

By migrating SAP systems to Azure, Microsoft fine-tuned its capacity management processes, minimizing downtime, risk, and costs and improving employee efficiency. Optimizing on Azure allows us to design an SAP environment that is agile, efficient, and flexible enough to grow and change with the business.

Deciding to migrate your SAP systems to Azure is a big move, and taking the right steps can make the transition smooth and manageable. IoTCoast2Coast took a measured approach to moving its most sensitive data and confidential workloads together with its SAP systems.

The right approach makes it possible to migrate mission-critical SAP systems to Azure, gaining maximum cost savings, scalability, and agility, without disrupting business operations. Our horizontal strategy meant moving low-risk environments like our sandboxes first, giving us experience with Azure migration without risking critical business functions in the process. Using a vertical strategy to move entire low-impact systems gave us experience with Azure production processes.

Prerequisites

To configure Azure AD integration with SAP Cloud Platform, you need the following items:

  1. Azure Subscription
  2. Basic Azure knowledge
  3. An Azure AD tenant
  4. SAP Cloud Platform Identity Authentication tenant
  5. A user account in SAP Cloud Platform Identity Authentication with Admin permissions.
  6. An Azure AD subscription. If you don’t have an Azure AD environment, you can get a one-month trial here
  7. SAP Cloud Platform single sign-on enabled subscription

Definitions

Throughout the document, these terms are used:

IaaS: Infrastructure as a service.

PaaS: Platform as a service.

SaaS: Software as a service.

Creating the best SAP environment with Azure

Azure is the preferred platform for SAP. As the top SAP-certified cloud provider, Azure can reliably run mission-critical SAP environments on a trusted cloud platform built for enterprises. Azure meets scalability, flexibility, and compliance needs.

Azure can run the most complete set of SAP applications across dev/test and production scenarios, fully supported. Azure is certified for more SAP solutions than any other cloud provider, including SAP HANA and S/4HANA, SAP Business Suite, SAP NetWeaver, and SAP Business One, to name a few.

Azure also brings a large number of benefits when hosting the SAP platform.

Creating a telemetry solution for SAP on Azure

The distributed nature of our business process environment led us to examine a broader solution, one that would provide comprehensive telemetry and monitoring not only for our SAP landscape but also for any other business processes that make up the end-to-end business landscape at Microsoft. Our implementation was driven by the goals and drivers described below.

Goals and drivers

Microsoft developed a telemetry platform in Azure called the Unified Telemetry Platform (UTP). UTP is a modern, scalable, reliable, and cost-effective telemetry platform that is used in several different business process monitoring scenarios at Microsoft, including our SAP-related business processes.

UTP is built to enable service maturity and business process monitoring across CSEO. It provides a common telemetry taxonomy and integration with core Microsoft data monitoring services. UTP enables compliance and the maintenance of business standards for data integrity and privacy. While UTP is the implementation we chose, there are numerous ways to enable telemetry on Azure.

Capturing telemetry with Azure Monitor

To enable business-driven monitoring and a user-centric approach, UTP captures as many of the critical events within the end-to-end process landscape as possible. Embracing comprehensive telemetry in our systems meant capturing data from all available endpoints to build an understanding of how each process flowed and which of the SAP components were involved. Azure Monitor and its related Azure services serve as the core for our solution.

Azure Application Insights

Application Insights provides an Azure-based solution with which we can dig deep into our Azure-hosted SAP landscape and pull out all necessary telemetry data. Using Application Insights, we can automatically generate alerts and support tickets when our telemetry indicates a potential error situation.
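
As an illustration, an alert like that could be backed by a short query over the standard Application Insights requests table; the 5% failure threshold and one-hour window below are arbitrary assumptions for the sketch, not values from our implementation.

  // Flag operations whose failed-request rate exceeded 5% in the last hour.
  requests
  | where timestamp > ago(1h)
  | summarize Total = count(), Failed = countif(success == false) by operation_Name
  | extend FailureRatePct = 100.0 * Failed / Total
  | where FailureRatePct > 5
  | order by FailureRatePct desc

An Azure Monitor alert rule can run such a query on a schedule and open a ticket whenever it returns rows.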

Azure Log Analytics

Infrastructure telemetry such as CPU usage, disk throughput, and other performance-related data is collected from Azure infrastructure components in the SAP environment using Log Analytics.
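
For example, processor counters collected into the standard Perf table can be charted with a query along these lines (the one-day window and 15-minute grain are arbitrary choices for illustration):

  // Average CPU utilization per computer in 15-minute bins over the last day.
  Perf
  | where TimeGenerated > ago(1d)
  | where ObjectName == "Processor" and CounterName == "% Processor Time"
  | summarize AvgCpu = avg(CounterValue) by bin(TimeGenerated, 15m), Computer
  | render timechart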

Azure Data Explorer

UTP uses Azure Data Explorer as the central repository for all telemetry data sent through Application Insights and Azure Monitor Logs from our application and infrastructure environment. Azure Data Explorer provides enterprise big data interactive analytics; we use the Kusto query language to stitch together the end-to-end transaction flow for our business processes, for both SAP and non-SAP processes.
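
To make the stitching concrete, here is a hedged sketch: the SapEvents and NonSapEvents table names and the CorrelationId column are hypothetical stand-ins for our actual schema, but the join pattern is plain KQL.

  // Stitch SAP and non-SAP telemetry for the same business transaction
  // by joining on a shared correlation identifier.
  SapEvents
  | where EventTime > ago(7d)
  | join kind=inner (NonSapEvents | where EventTime > ago(7d)) on CorrelationId
  | project CorrelationId, SapStep = EventName, DownstreamStep = EventName1,
            SapTime = EventTime, DownstreamTime = EventTime1
  | order by CorrelationId asc, SapTime asc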

Azure Data Lake

UTP uses Azure Data Lake for long-term cold data storage. This data is taken out of the hot and warm streams and kept for reporting and archival purposes in Azure Data Lake to reduce the cost associated with storing large amounts of data in Azure Monitor.

Implementing UTP in SAP on Azure

The first step in enabling our telemetry platform was to create a reusable custom method and configuration table to drive consistent creation of the telemetry payloads. The configuration table defines the fixed structure of the payload according to the UTP standards.

The method then allows the calling application to pass an application-specific payload to populate the dynamic properties section of the telemetry event payload, and then adds SAP standard elements such as the event date and time and the system identifier. This method can then be called directly from any ABAP code, in either synchronous or asynchronous mode.

For example, in most business processes in our ERP, we use SAP business process events to trigger our telemetry events. The business process events share a custom check-routine framework built using SAP Business Rule Framework plus (BRFplus); custom receiver classes then build the dynamic properties of the payload and call the shared telemetry class.

When each event in the workflow is processed in SAP, the JSON payload is passed to Application Insights using an external REST service call, which connects to the UTP framework. The following figure contains an example from our non-delivery order-to-cash process.
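
On the Application Insights side, the dynamic properties of such a payload land in the customDimensions column of the customEvents table and can be unpacked in KQL. The event name and property names below (SapBusinessEvent, SystemId, DocumentNumber) are hypothetical illustrations, not our actual payload schema.

  // Unpack the dynamic properties of SAP telemetry events.
  customEvents
  | where timestamp > ago(1d)
  | where name == "SapBusinessEvent"
  | extend Props = todynamic(tostring(customDimensions))
  | project timestamp, SystemId = tostring(Props.SystemId),
            DocumentNumber = tostring(Props.DocumentNumber)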

Azure Monitor

Azure Monitor maximizes the availability and performance of your applications and services by delivering a comprehensive solution for collecting, analyzing, and acting on telemetry from your cloud and on-premises environments.

The diagram below gives a high-level view of Azure Monitor. At the center of the diagram are the data stores for metrics and logs, the two fundamental types of data used by Azure Monitor.

On the left are the sources of monitoring data that populate these data stores. On the right are the different functions that Azure Monitor performs with this collected data such as analysis, alerting, and streaming to external systems.

Monitoring data platform:

All data collected by Azure Monitor fits into one of two fundamental types, metrics and logs. Metrics are numerical values that describe some aspect of a system at a particular point in time. They are lightweight and capable of supporting near real-time scenarios. Logs contain different kinds of data organized into records with different sets of properties for each type. Telemetry such as events and traces are stored as logs in addition to performance data so that it can all be combined for analysis.

Log data collected by Azure Monitor can be analyzed with queries to quickly retrieve, consolidate, and analyze collected data. You can create and test queries using Log Analytics in the Azure portal and then either directly analyze the data using these tools or save queries for use with visualizations or alert rules.

Azure Monitor uses a version of the Kusto query language used by Azure Data Explorer that is suitable for simple log queries but also includes advanced functionality such as aggregations, joins, and smart analytics. You can quickly learn the query language through the available lessons and tutorials.
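
For example, a short query over the standard Heartbeat table surfaces machines that have stopped reporting (the 10-minute cutoff is an arbitrary assumption):

  // Machines whose last heartbeat is older than 10 minutes.
  Heartbeat
  | summarize LastHeartbeat = max(TimeGenerated) by Computer
  | where LastHeartbeat < ago(10m)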

Data collected by Azure Monitor

Azure Monitor can collect data from a variety of sources. You can think of monitoring data for your applications in tiers ranging from your application, any operating system and services it relies on, down to the platform itself. Azure Monitor collects data from each of the following tiers:

Azure Insights:

Monitoring data is only useful if it increases your visibility into the operation of your computing environment. Azure Monitor includes several features and tools that provide valuable insights into your applications and the other resources they depend on. Monitoring solutions and features such as Application Insights and Azure Monitor for containers provide deep insights into different aspects of your application and specific Azure services.

Application Insights

Application Insights monitors the availability, performance, and usage of your web applications whether they’re hosted in the cloud or on-premises. It leverages the powerful data analysis platform in Azure Monitor to provide you with deep insights into your application’s operations and diagnose errors without waiting for a user to report them. Application Insights includes connection points to a variety of development tools and integrates with Visual Studio to support your DevOps processes.

Azure Lighthouse

Azure Lighthouse offers service providers a single control plane to view and manage Azure across all their customers with higher automation, scale, and enhanced governance. With Azure Lighthouse, service providers can deliver managed services using comprehensive and robust management tooling built into the Azure platform. This offering can also benefit enterprise IT organizations managing resources across multiple tenants.

Benefits

Azure Lighthouse helps you to profitably and efficiently build and deliver managed services for your customers. The benefits include:

  • Management at scale: Customer engagement and life-cycle operations to manage customer resources are easier and more scalable.
  • Greater visibility and precision for customers: Customers whose resources you’re managing will have greater visibility into your actions and precise control over the scope they delegate for management, while your IP is preserved.
  • Comprehensive and unified platform tooling: Our tooling experience addresses key service provider scenarios, including multiple licensing models such as EA, CSP and pay-as-you-go. The new capabilities work with existing tools and APIs, licensing models, and partner programs such as the Cloud Solution Provider program (CSP). The Azure Lighthouse options you choose can be integrated into your existing workflows and applications, and you can track your impact on customer engagements by linking your partner ID.

There are no additional costs associated with using Azure Lighthouse to manage your customers’ Azure resources.

Capabilities

Azure Lighthouse includes multiple ways to help streamline customer engagement and management:

  • Azure delegated resource management: Manage your customers’ Azure resources securely from within your own tenant, without having to switch context and control planes. For more info, see Azure delegated resource management.
  • New Azure portal experiences: View cross-tenant info in the new My customers page in the Azure portal. A corresponding Service providers blade lets your customers view and manage service provider access. For more info, see View and manage customers and View and manage service providers.
  • Azure Resource Manager templates: Perform management tasks more easily, including onboarding customers for Azure delegated resource management. For more info, see our samples repo and Onboard a customer to Azure delegated resource management.
  • Managed Services offers in Azure Marketplace: Offer your services to customers through private or public offers, and have them automatically onboarded to Azure delegated resource management, as an alternative to onboarding using Azure Resource Manager templates. For more info, see Managed services offers in Azure Marketplace.
  • Azure managed applications: Package and ship applications that are easy for your customers to deploy and use in their own subscriptions. The application is deployed into a resource group that you access from your tenant, letting you manage the service as part of the overall Azure Lighthouse experience. For more info, see Azure managed applications overview.

Azure Monitor Logs

Azure Monitor stores log data in a Log Analytics workspace, which is an Azure resource and a container where data is collected and aggregated, and which serves as an administrative boundary. While you can deploy one or more workspaces in your Azure subscription, there are several considerations you should understand to ensure your initial deployment follows our guidelines and provides a cost-effective, manageable, and scalable deployment that meets your organization’s needs.

Data in a workspace is organized into tables, each of which stores different kinds of data and has its own unique set of properties based on the resource generating the data. Most data sources will write to their own tables in a Log Analytics workspace.

A Log Analytics workspace provides:

  • A geographic location for data storage.
  • Data isolation by granting different users access rights following one of our recommended design strategies.
  • Scope for configuration of settings like pricing tier, retention, and data capping.

The points above give an overview of the design and migration considerations, access control, and the design implementations recommended for your IT enterprise.
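
To keep an eye on the pricing-tier, retention, and capping settings mentioned above, a common sketch queries the standard Usage table for billable ingestion per data type (Quantity is reported in MB, hence the division by 1024):

  // Billable data volume ingested per table over the last 31 days, in GB.
  Usage
  | where TimeGenerated > ago(31d)
  | where IsBillable == true
  | summarize IngestedGB = sum(Quantity) / 1024 by DataType
  | order by IngestedGB desc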

Best practices  

We learned several important lessons with our UTP implementation for SAP on Azure. These lessons helped inform our progress of UTP development, and they’ve given us best practices to leverage in future projects, including:

  • Perform a proper inventory of internal processes. You must be aware of events within a process before you can capture them. Performing a complete and informed inventory of your business processes is critical to capturing the data required for end-to-end business-process monitoring.
  • Build for true end-to-end telemetry. Capture all events from all processes and gather telemetry appropriately. Data points from all parts of the business process—including external components—are critical to achieving true end-to-end telemetry.
  • Build for Azure-native SAP. SAP on Azure is easier, and instrumenting SAP processes becomes more efficient and effective when SAP components are built for Azure.
  • Encourage data-usage models and standards across the organization. Data standards are critical for an accurate end-to-end view. If data is stored in different formats or instrumented differently in various parts of the business process, the end reporting results won’t accurately represent the state of the business process.

Conclusion  

Microsoft is continually refining and improving business-process monitoring of SAP on Azure with UTP. It has enabled the enterprise to keep key business users informed of business process flow, provided a complete view of business process health to leadership, and helped our engineering teams create a more robust and efficient SAP environment. Telemetry and business-driven monitoring with UTP have transformed the visibility we have into our SAP on Azure environment, and the continuing journey toward deeper business insight and intelligence is making the entire business better.