
C# Corner MVP 2020-2021

I’m so thankful, honored, and excited to have received/renewed the C# Corner Most Valuable Professional (MVP) Award for 2020-2021, for the 4th time!!

I am sincerely humbled to be part of such an incredible C# Corner community!

This year I spoke at 11 conferences and user groups with approximately 4,000+ attendees, and wrote many blogs and articles on my website and on various technical portals!

In addition, I spoke in 10+ webcasts, podcasts, and live interviews reaching 3,000+ attendees, made a lot of #Azure videos, and mentored a couple of students and professionals with #Azure advice, resources, and #Saskatchewan job-market insights!

Again, many thanks to the C# Corner team!

Microsoft MVP Renewal 2020-2021

Woohoo 🙌!!

I’m so thankful, honored, and excited to have received/renewed the Microsoft Most Valuable Professional (MVP) Award for 2020-2021 in the #Azure category, for the 3rd time!!

I am sincerely humbled to be part of such an incredible community, celebrating #CanadaDay while being recognized for my contributions!!

This year I spoke at 11 🌎 conferences and user groups with approximately 4,000+ attendees, and wrote many blogs and articles on my website and on various technical portals!

In addition, I spoke in 10+ webcasts, podcasts, and live interviews reaching 3,000+ attendees, made a lot of #Azure videos, and mentored a couple of students and professionals with #Azure advice, resources, and #Saskatchewan job-market insights!!

Again, many thanks 🎉👏

#MVPBuzz #MVPAward #CdnMVP #MVP #MicrosoftMVP

This is my 3rd MVP Award and I am very grateful and appreciative for this honor and for the various opportunities provided to me over time.

Thank you very much to each and every one of you for making me successful in my efforts as an MVP, IT professional, and community contributor, and for providing me with valuable resources and networking opportunities. Thank you!

Azure Disaster Recovery for SAP HANA Systems

Microsoft Azure provides a trusted path to enterprise-ready innovation with SAP solutions in the cloud. Mission critical applications such as SAP run reliably on Azure, which is an enterprise proven platform offering hyperscale, agility, and cost savings for running a customer’s SAP landscape.

System availability and disaster recovery are crucial for customers who run mission-critical SAP applications on Azure.

RTO and RPO are two key metrics that organizations consider when developing a disaster recovery plan that can maintain business continuity in the face of an unexpected event.

Recovery point objective (RPO) refers to the amount of data at risk, measured in time, whereas recovery time objective (RTO) is the maximum tolerable time that a system can be down after a disaster occurs.
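To make the two definitions concrete, here is a minimal Python sketch, with hypothetical timestamps and targets, that checks whether a recovery met its RPO and RTO:

```python
from datetime import datetime, timedelta

def meets_objectives(last_backup, disaster, service_restored,
                     rpo: timedelta, rto: timedelta):
    """Return (data_loss_window, downtime, rpo_met, rto_met)."""
    data_loss_window = disaster - last_backup      # data at risk, measured in time
    downtime = service_restored - disaster         # outage duration
    return (data_loss_window, downtime,
            data_loss_window <= rpo, downtime <= rto)

# Hypothetical example: backups every 15 minutes, 4-hour restore target.
loss, down, rpo_ok, rto_ok = meets_objectives(
    last_backup=datetime(2020, 7, 1, 2, 45),
    disaster=datetime(2020, 7, 1, 3, 0),
    service_restored=datetime(2020, 7, 1, 6, 30),
    rpo=timedelta(minutes=30),
    rto=timedelta(hours=4),
)
print(loss, down, rpo_ok, rto_ok)  # → 0:15:00 3:30:00 True True
```

Tightening either target directly drives the DR architecture: a lower RPO demands more frequent replication, and a lower RTO demands standby capacity that is closer to production size.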

The diagram below shows RPO and RTO on a timeline in a business-as-usual (BAU) scenario.

Design principles for disaster recovery systems

  • Selection of DR region based on SAP-certified VMs for SAP HANA – It is important to verify the availability of SAP-certified VM types in the DR region.
  • RPO and RTO values – Businesses need to set clear expectations for RPO and RTO, as these values greatly affect the disaster recovery architecture and the tools and automation required to implement it.
    • Cost of implementing DR, maintenance, and DR drills.
    • Criticality of systems – A trade-off can be made between the cost of DR implementation and business requirements. While the most critical systems can use state-of-the-art DR architecture, systems of medium and lower criticality may tolerate higher RPO/RTO values.
    • On-demand resizing of DR instances – It is preferable to use small VMs for DR instances and upsize them during an actual DR scenario. It is also possible to reserve the required VM capacity in the DR region so that there is no waiting time to scale up.
    • Additional considerations include cloud infrastructure costs and the effort of setting up an environment for non-disruptive DR tests. A non-disruptive DR test is executed without failing over the actual production systems to the DR systems, thereby avoiding business downtime. It involves additional cost for temporary infrastructure placed in a completely isolated vNet during the test.
    • Certain components of the SAP system architecture, such as a clustered network file system (NFS), are not recommended for replication with Azure Site Recovery; additional licensed tools such as SUSE geo-clustering or SIOS DataKeeper are needed for DR at the NFS layer.
  • Azure offers Azure Site Recovery (ASR), which replicates virtual machines across regions. ASR is used for the non-database components or layers of the system, while database-specific methods such as SAP HANA System Replication (HSR) are used at the database layer to ensure database consistency.

Disaster recovery architecture for SAP systems running on SAP HANA Database

At a very high level, the diagram below depicts the architecture of SAP systems based on SAP HANA and shows which systems remain available in case of local or regional failures.

The diagram below gives the next level of detail on SAP HANA system components and the corresponding technology used to achieve disaster recovery.

Steps for invoking DR or a DR drill

Microsoft Azure Site Recovery (ASR) enables fast replication of data to the DR region.


  • DNS changes so VMs use new IP addresses
  • Bring up iSCSI (single VM) from ASR-replicated data
  • Recover databases and resize the VMs to the required capacity
  • Manually provision NFS (single VM) using snapshot backups
  • Build application-layer VMs from ASR-replicated data
  • Perform cluster changes
  • Bring up applications
  • Validate applications
  • Release systems
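A drill runbook like this is usually scripted. The fragment below sketches just the VM-resize step via the Azure CLI invoked from Python; the resource-group, VM, and size names are hypothetical, and the remaining steps (DNS, NFS, cluster changes) are too environment-specific to show generically.

```python
import subprocess

def resize_command(resource_group: str, vm_name: str, target_size: str) -> list:
    """Build the 'az vm resize' call used to upsize a small DR standby VM
    to production capacity during a drill."""
    return ["az", "vm", "resize",
            "--resource-group", resource_group,
            "--name", vm_name,
            "--size", target_size]

def run(cmd: list) -> str:
    """Execute a CLI command; assumes 'az' is installed and logged in."""
    return subprocess.run(cmd, check=True, capture_output=True, text=True).stdout

# During an actual drill (hypothetical names, not executed here):
# run(resize_command("rg-sap-dr", "hana-db-dr", "Standard_M64s"))
```

Keeping the command construction separate from execution makes each drill step easy to log, review, and rehearse without touching live resources.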

A screenshot of an example DR drill plan.

Resiliency/Reliability:

Azure keeps your applications up and running and your data available. Azure is the first cloud platform to provide a built-in backup and disaster recovery solution.

Resiliency is not about avoiding failures but responding to failures. The objective is to respond to failure in a way that avoids downtime and data loss. Business continuity and data protection are critical issues for today’s organizations, and business continuity is built on the foundation of resilient systems, applications, and data.

Reliability and resiliency are closely related. Reliability is defined as dependability and performing consistently well. Resiliency is defined as the capacity to recover quickly. Together, these two qualities are key to a trustworthy cloud service. Despite best efforts, disasters happen; they are inevitable but mostly unpredictable, and vary in type and magnitude. There is almost never a single root cause of a major issue. Instead, there are several contributing factors, which is the reason an issue is able to circumvent various layers of mitigation/defenses.

Building Azure Monitoring, Logging and Alerting Foundation for SAP application

Introduction

By migrating SAP systems to Azure, Microsoft fine-tuned its capacity management processes, minimizing downtime, risk, and costs while improving employee efficiency. Optimizing on Azure allows us to design an SAP environment that is agile, efficient, and flexible enough to grow and change with the business.

Deciding to migrate SAP systems to Azure is a big move, and taking the right steps can make the transition smooth and manageable. IoTCoast2Coast took a measured approach to moving its most sensitive data and confidential workloads along with the SAP systems.

The right approach makes it possible to migrate mission-critical SAP systems to Azure, gaining maximum cost savings, scalability, and agility, without disrupting business operations. Our horizontal strategy meant moving low-risk environments like our sandboxes first, giving us experience with Azure migration without risking critical business functions in the process. Using a vertical strategy to move entire low-impact systems gave us experience with Azure production processes.

Prerequisite

To configure Azure AD integration with SAP Cloud Platform, you need the following items:

  1. Azure Subscription
  2. Basic Azure knowledge
  3. An Azure AD tenant
  4. SAP Cloud Platform Identity Authentication tenant
  5. A user account in SAP Cloud Platform Identity Authentication with Admin permissions.
  6. An Azure AD subscription. If you don’t have an Azure AD environment, you can get a one-month trial here
  7. SAP Cloud Platform single sign-on enabled subscription

Definition

Throughout the document, these terms are used:

IaaS: Infrastructure as a service.

PaaS: Platform as a service.

SaaS: Software as a service.

Creating the best SAP environment with Azure

Azure is the preferred platform for SAP. As the top SAP-certified cloud provider, Azure can reliably run mission-critical SAP environments on a trusted cloud platform built for enterprises. Azure meets scalability, flexibility, and compliance needs.

Azure can run the most complete set of SAP applications across dev-test and production scenarios, fully supported. Azure is certified for more SAP solutions than any other cloud provider, including SAP HANA and S/4HANA, SAP Business Suite, SAP NetWeaver, and SAP Business One, to name a few.

Azure also carries a large number of benefits when hosting the SAP platform, including:

Creating a telemetry solution for SAP on Azure

The distributed nature of our business process environment led us to examine a broader solution—one that would provide comprehensive telemetry and monitoring for our SAP landscape, but also for any other business processes that comprised the end-to-end business landscape at Microsoft. Our implementation was driven by the following important goals:

Goals and drivers

Microsoft developed a telemetry platform in Azure called the Unified Telemetry Platform (UTP). UTP is a modern, scalable, reliable, and cost-effective telemetry platform that is used in several different business-process monitoring scenarios at Microsoft, including our SAP-related business processes.

UTP is built to enable service maturity and business process monitoring across CSEO. It provides a common telemetry taxonomy and integration with core Microsoft data monitoring services. UTP enables compliance and the maintenance of business standards for data integrity and privacy. While UTP is the implementation we chose, there are numerous ways to enable telemetry on Azure.

Capturing telemetry with Azure Monitor

To enable business-driven monitoring and a user-centric approach, UTP captures as many of the critical events within the end-to-end process landscape as possible. Embracing comprehensive telemetry in our systems meant capturing data from all available endpoints to build an understanding of how each process flowed and which of the SAP components were involved. Azure Monitor and its related Azure services serve as the core for our solution.

Azure Application Insights

Application Insights provides an Azure-based solution with which we can dig deep into our Azure-hosted SAP landscape and pull out all necessary telemetry data. Using Application Insights, we can automatically generate alerts and support tickets when our telemetry indicates a potential error situation.

Azure Log Analytics

Infrastructure telemetry such as CPU usage, disk throughput and other performance-related data is collected from Azure infrastructure components in the SAP environment using Log Analytics.

Azure Data Explorer

UTP uses Azure Data Explorer as the central repository for all telemetry data sent through Application Insights and Azure Monitor Logs from our application and infrastructure environment. Azure Data Explorer provides enterprise big data interactive analytics; we use the Kusto query language to stitch together the end-to-end transaction flow for our business processes, for both SAP process and non-SAP processes.

Azure Data Lake

UTP uses Azure Data Lake for long-term cold data storage. This data is taken out of the hot and warm streams and kept for reporting and archival purposes in Azure Data Lake to reduce the cost associated with storing large amounts of data in Azure Monitor.

Implementing UTP in SAP on Azure

The first step in enabling our telemetry platform was to create a reusable custom method and configuration table to drive consistent creation of the telemetry payloads. The configuration table defines the fixed structure of the payload according to the UTP standards.

The method then allows the calling application to pass an application-specific payload to populate the dynamic-properties section of the telemetry event payload, and then adds SAP standard elements such as the event date and time and the system identifier. This method can then be called directly from any ABAP code, in either synchronous or asynchronous mode.

For example, in most business processes in our ERP, we use SAP business process events to trigger our telemetry events. The business process events share a custom check routine framework built using SAP Business Rule Framework plus; then custom receiver classes build the dynamic properties of the payload and call the shared telemetry class.

When each event in the workflow is processed in SAP, the JSON payload is passed to Application Insights using an external REST service call, which connects to the UTP framework. The following figure contains an example from our non-delivery order-to-cash process.
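The flow described above can be sketched as follows. This is an illustrative reconstruction, not the actual UTP code: the envelope follows the public Application Insights `v2/track` ingestion format, while the event name, instrumentation key, and property names are hypothetical stand-ins for the internal UTP schema.

```python
import json
from datetime import datetime, timezone
from urllib import request

# Classic Application Insights ingestion endpoint.
INGEST_URL = "https://dc.services.visualstudio.com/v2/track"

def build_event(ikey: str, system_id: str, properties: dict) -> dict:
    """Fixed envelope (per the configuration table) plus the
    application-specific dynamic properties."""
    return {
        "name": "Microsoft.ApplicationInsights.Event",
        "time": datetime.now(timezone.utc).isoformat(),  # SAP standard element
        "iKey": ikey,
        "data": {
            "baseType": "EventData",
            "baseData": {
                "name": "SapBusinessProcessEvent",   # hypothetical event name
                "properties": {"systemId": system_id, **properties},
            },
        },
    }

def send(event: dict) -> int:
    """POST the JSON payload via an external REST call (needs a real iKey)."""
    req = request.Request(INGEST_URL, data=json.dumps(event).encode(),
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return resp.status

# Hypothetical order-to-cash event (not sent here):
evt = build_event("00000000-0000-0000-0000-000000000000", "PRD",
                  {"process": "order-to-cash", "step": "delivery-created"})
```

The fixed envelope gives every team the same taxonomy, while the `properties` dictionary carries whatever the individual business process needs.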

Azure Monitor

Azure Monitor maximizes the availability and performance of your applications and services by delivering a comprehensive solution for collecting, analyzing, and acting on telemetry from your cloud and on-premises environments.

Azure Monitor includes:

The diagram below gives a high-level view of Azure Monitor. At the center of the diagram are the data stores for metrics and logs, the two fundamental types of data used by Azure Monitor.

On the left are the sources of monitoring data that populate these data stores. On the right are the different functions that Azure Monitor performs with this collected data such as analysis, alerting, and streaming to external systems.

Monitoring data platform:

All data collected by Azure Monitor fits into one of two fundamental types, metrics and logs. Metrics are numerical values that describe some aspect of a system at a particular point in time. They are lightweight and capable of supporting near real-time scenarios. Logs contain different kinds of data organized into records with different sets of properties for each type. Telemetry such as events and traces are stored as logs in addition to performance data so that it can all be combined for analysis.
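To make the metrics-versus-logs distinction concrete, here are two toy records in Python; the field names are illustrative, not the exact Azure Monitor schema:

```python
from datetime import datetime, timezone

now = datetime.now(timezone.utc).isoformat()

# A metric: one lightweight numeric value describing one aspect of a
# resource at a particular point in time.
metric_sample = {
    "timestamp": now,
    "resource": "hana-db-vm",       # hypothetical VM name
    "name": "Percentage CPU",
    "value": 72.5,
}

# A log record: a structured record whose properties vary by record type,
# suited to events and traces rather than near-real-time charting.
log_record = {
    "TimeGenerated": now,
    "Computer": "hana-db-vm",
    "EventLevel": "Warning",
    "Message": "Disk queue length above threshold",
}
```

The metric is cheap to store and chart in near real time; the log record carries richer context that can later be joined with other records for analysis.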

Log data collected by Azure Monitor can be analyzed with queries to quickly retrieve, consolidate, and analyze collected data. You can create and test queries using Log Analytics in the Azure portal and then either directly analyze the data using these tools or save queries for use with visualizations or alert rules.

Azure Monitor uses a version of the Kusto query language used by Azure Data Explorer that is suitable for simple log queries but also includes advanced functionality such as aggregations, joins, and smart analytics. You can learn the query language quickly through the available lessons.
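As an example of the kind of query described above, the sketch below holds a Kusto query against the standard `Perf` table as a Python string (the one-hour window is arbitrary), the way one would submit it through the Log Analytics query API; the SDK usage in the comments and the workspace ID are assumptions, not taken from this article.

```python
from textwrap import dedent

# Average CPU per computer over the last hour, highest first.
cpu_query = dedent("""\
    Perf
    | where TimeGenerated > ago(1h)
    | where ObjectName == "Processor" and CounterName == "% Processor Time"
    | summarize AvgCpu = avg(CounterValue) by Computer
    | order by AvgCpu desc
    """)

# With the azure-monitor-query package (hypothetical workspace ID):
# from azure.identity import DefaultAzureCredential
# from azure.monitor.query import LogsQueryClient
# client = LogsQueryClient(DefaultAzureCredential())
# response = client.query_workspace("<workspace-id>", cpu_query, timespan=None)
print(cpu_query)
```

Queries like this can be saved and reused in workbooks, visualizations, or alert rules, as the preceding paragraph describes.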

What data does Azure Monitor collect?

Azure Monitor can collect data from a variety of sources. You can think of monitoring data for your applications in tiers ranging from your application, any operating system and services it relies on, down to the platform itself. Azure Monitor collects data from each of the following tiers:

Azure Insights:

Monitoring data is only useful if it increases your visibility into the operation of your computing environment. Azure Monitor includes several features and tools that provide valuable insights into your applications and the resources they depend on. Monitoring solutions and features such as Application Insights and Azure Monitor for containers provide deep insights into different aspects of your application and specific Azure services.

Application Insights

Application Insights monitors the availability, performance, and usage of your web applications whether they’re hosted in the cloud or on-premises. It leverages the powerful data analysis platform in Azure Monitor to provide you with deep insights into your application’s operations and diagnose errors without waiting for a user to report them. Application Insights includes connection points to a variety of development tools and integrates with Visual Studio to support your DevOps processes.

Azure Lighthouse

Azure Lighthouse offers service providers a single control plane to view and manage Azure across all their customers with higher automation, scale, and enhanced governance. With Azure Lighthouse, service providers can deliver managed services using comprehensive and robust management tooling built into the Azure platform. This offering can also benefit enterprise IT organizations managing resources across multiple tenants.

Benefits

Azure Lighthouse helps you to profitably and efficiently build and deliver managed services for your customers. The benefits include:

  • Management at scale: Customer engagement and life-cycle operations to manage customer resources are easier and more scalable.
  • Greater visibility and precision for customers: Customers whose resources you’re managing will have greater visibility into your actions and precise control over the scope they delegate for management, while your IP is preserved.
  • Comprehensive and unified platform tooling: Our tooling experience addresses key service provider scenarios, including multiple licensing models such as EA, CSP and pay-as-you-go. The new capabilities work with existing tools and APIs, licensing models, and partner programs such as the Cloud Solution Provider program (CSP). The Azure Lighthouse options you choose can be integrated into your existing workflows and applications, and you can track your impact on customer engagements by linking your partner ID.

There are no additional costs associated with using Azure Lighthouse to manage your customers’ Azure resources.

Capabilities

Azure Lighthouse includes multiple ways to help streamline customer engagement and management:

  • Azure delegated resource management: Manage your customers’ Azure resources securely from within your own tenant, without having to switch context and control planes. For more info, see Azure delegated resource management.
  • New Azure portal experiences: View cross-tenant info in the new My customers page in the Azure portal. A corresponding Service providers blade lets your customers view and manage service provider access. For more info, see View and manage customers and View and manage service providers.
  • Azure Resource Manager templates: Perform management tasks more easily, including onboarding customers for Azure delegated resource management. For more info, see our samples repo and Onboard a customer to Azure delegated resource management.
  • Managed Services offers in Azure Marketplace: Offer your services to customers through private or public offers, and have them automatically onboarded to Azure delegated resource management, as an alternate to onboarding using Azure Resource Manager templates. For more info, see Managed services offers in Azure Marketplace.
  • Azure managed applications: Package and ship applications that are easy for your customers to deploy and use in their own subscriptions. The application is deployed into a resource group that you access from your tenant, letting you manage the service as part of the overall Azure Lighthouse experience. For more info, see Azure managed applications overview.

Azure Monitor Logs

Azure Monitor stores log data in a Log Analytics workspace, an Azure resource that serves as a container where data is collected and aggregated, and as an administrative boundary. While you can deploy one or more workspaces in your Azure subscription, there are several considerations to understand to ensure your initial deployment follows our guidelines and gives you a cost-effective, manageable, and scalable deployment that meets your organization’s needs.

Data in a workspace is organized into tables, each of which stores different kinds of data and has its own unique set of properties based on the resource generating the data. Most data sources will write to their own tables in a Log Analytics workspace.

A Log Analytics workspace provides:

  • A geographic location for data storage.
  • Data isolation by granting different users access rights following one of our recommended design strategies.
  • Scope for configuration of settings like pricing tier, retention, and data capping.

The sections above provide an overview of design and migration considerations, access control, and the design implementations recommended for your IT enterprise.

Best practices  

We learned several important lessons with our UTP implementation for SAP on Azure. These lessons helped inform our progress of UTP development, and they’ve given us best practices to leverage in future projects, including:

  • Perform a proper inventory of internal processes. You must be aware of events within a process before you can capture them. Performing a complete and informed inventory of your business processes is critical to capturing the data required for end-to-end business-process monitoring.
  • Build for true end-to-end telemetry. Capture all events from all processes and gather telemetry appropriately. Data points from all parts of the business process—including external components—are critical to achieving true end-to-end telemetry.
  • Build for Azure-native SAP. SAP on Azure is easier, and instrumenting SAP processes becomes more efficient and effective when SAP components are built for Azure.
  • Encourage data-usage models and standards across the organization. Data standards are critical for an accurate end-to-end view. If data is stored in different formats or instrumentation in various parts of the business process, the end reporting results won’t accurately represent the state of the business process.

Conclusion  

Microsoft is continually refining and improving business-process monitoring of SAP on Azure with UTP. It has enabled the enterprise to keep key business users informed of business-process flow, provided a complete view of business-process health to leadership, and helped engineering teams create a more robust and efficient SAP environment. Telemetry and business-driven monitoring with UTP have transformed our visibility into the SAP on Azure environment, and the continuing journey toward deeper business insight and intelligence is making the entire business better.

Join us in Regina and learn ‘Leveraging IoT Device in Hydroelectric, Wind Power, Transformers, Heat Recovery & Power Stations and Azure Security Insight’


“Azure IOT Coast 2 Coast” First Tour

Stream Big Data and Secure Your Data

Join Deepak and Nik for two presentations focusing on:

  1. Leveraging IoT Devices in Hydroelectric, Wind Power, Transformers, Heat Recovery & Power Stations – Nik (Shahriar Nikkhah)
Agenda / Topics:
  • IoT solutions: Azure IoT Central (SaaS)
  • Azure IoT Solution Accelerator (PaaS)
  • PaaS services & IoT services
  • Azure IoT Central portal: setting up a real IoT device in Azure IoT Central (demo)
  • Hydro/power transformers in Azure IoT Central (demo)
  2. Azure security defenses you ought to know – Deepak Kaushik
Agenda / Topics:
  • How cloud security is different and better
  • Demo: Azure Advisor, Azure Security Center
  • Demo: Identity and access management
  • Advanced Threat Protection for your data

Venue: Sunrise Branch Library

  3130 Woodhams Dr, Regina, SK S4V 2P9

  Regina Saskatchewan CANADA

Time:               1 PM – 4 PM

Price:               Free of cost

Parking:           Free of cost 

Speaker Details:


Nik Shahriar:


Nik is a consultant, data engineer, tech lead, mentor, founder of “SQL Data Side Inc”, and co-founder of “Azure IoT Coast 2 Coast”, focusing on Microsoft Azure technologies.

Nik has over 25 years of experience in the data field. He began his career as a software developer and programmer, quickly focusing on backend products such as SQL Server and business intelligence; after the advent of cloud/Azure technologies, he added Azure IoT products to his portfolio.

He is also a C# Corner MVP.

You can find out more about him and his presentation here: https://www.linkedin.com/in/nnikkhah/

Deepak Kaushik


Deepak is a Microsoft Azure MVP. He is the founder and Chapter Lead at C# Corner Regina Chapter and Co-Founder of “Azure IoT Coast 2 Coast” focusing on Microsoft Azure technologies.

He is also a C# Corner MVP. Find more about Deepak at

https://deepak-kaushik.com/

https://www.linkedin.com/in/davekaushik/


Learn IoT Device Translator & Azure Security Insight at Saskatoon

The ‘Azure IoT and Azure Defenses: Coast to Coast Tour’ is announcing sessions in 3 different cities. We will be in:

Calgary on April 27th.

Saskatoon on May 3rd.

Regina on May 4th.

Join us for these great sessions on ‘IoT and Azure Coast to Coast’ in Saskatoon, Saskatchewan. Sessions will be held at the Cliff Wright Branch Library on May 3rd.

“Learn IoT Device Translator & Azure Security Insight”.

Venue: Cliff Wright Branch Library

Address: 1635 McKercher Dr, Saskatoon, SK S7H 5J9

Free Parking

Time: 5:30 PM – 8 PM

IoT Device Translator – Nik Shahriar

6 PM – 7 PM
  • Environmental settings (Visual Studio Code, Arduino, …)
  • IoT Hub translator architecture
  • IoT Hub, Speech Service, Azure Function.
  • Chinese to English, French to English, real demo

Nik Shahriar is joining us from Toronto and presenting in both Saskatoon and Regina.

Azure Security Insight – Deepak Kaushik

7:15 PM – 8:15 PM

  • How Cloud Security is different & Better
  • Demo: Azure Advisor, Azure Security Center
  • Demo: Security Playbook & Identity and access management
  • Advanced Threat Protection for your data

We will serve pizza, snacks, and beverages.

Learn Transfer Learning for AI & Azure Defenses

Join us for a new webinar, “Learn Transfer Learning for AI & Azure Defenses”. Transfer learning embraces the concept of artificial general intelligence, which is leading us toward intelligent machines that need less computation and data. Through transfer learning we can use one pre-trained network for multiple tasks and get better accuracy with less data. In this session, we will talk about different approaches to transfer learning and ways to use it.

Event URL: TBD

Time:
March 09, 2019 9:00 AM (CT) 

Agenda:

Protect yourself by using Awesome Azure defenses

  • How Cloud Security is different & Better
  • Demo: Azure Advisor, Azure Security Center
  • Demo: Security Playbook & Identity and access management
  • Advanced Threat Protection for your data

Transfer Learning for Artificial General Intelligence

  • Artificial general intelligence is leading us to build intelligent machines with less computation and data
  • Use one pre-trained network for multiple tasks through transfer learning
  • Different approaches to transfer learning and ways to use it
  • Demo

Session details are as follows:

Protect yourself by using Awesome Azure defenses – Deepak Kaushik, 09:00 AM – 09:45 AM

Transfer Learning for Artificial General Intelligence – Rahat Yasir, 09:45 AM – 10:30 AM

My Journey as C# Corner MVP (Renewed July 2018 – Jun 2019)

July 1st is Canada Day, and I was fortunate to receive the official notification earlier today that I’ve been re-awarded for the next year.

Woohoo! It’s a great way to start Canada Day!

2018 First Half of the Year MVPs Announced

C# Corner MVP Renewed

 

C# Corner MVP 2018

What does it mean to be a C# Corner MVP?

It is a recognition for contributors on the C# Corner platform – website, forums, chapter events and webinars. It is an appreciation of hard work and contributions by members who make C# Corner a successful community.

Why is it important to me?

C# Corner MVP was the first recognition/award I received after I started contributing to the technical community. This is my second C# Corner MVP award. It gives a sense of fulfilment that your efforts and contributions are recognized and don’t go unnoticed. Receiving the C# Corner MVP award mentally prepared me to take up the challenge of becoming a Microsoft MVP.

Why did I get the C# Corner MVP award?

The C# Corner MVP award is not limited to any technology or specific domain. It is given to individuals for their contributions on the C# Corner platform, as well as to those who help spread awareness of C# Corner by hosting public events, either online or in person.

What’s next?

This award is a boost for me. I will continue to contribute to C# Corner through articles, videos, and the forums section, along with many more C# Corner chapter events and webinars.

 

Thanks!