
Azure - Disaster Recovery for SAP HANA Systems

Microsoft Azure provides a trusted path to enterprise-ready innovation with SAP solutions in the cloud. Mission-critical applications such as SAP run reliably on Azure, an enterprise-proven platform offering hyperscale, agility, and cost savings for running a customer's SAP landscape.

System availability and disaster recovery are crucial for customers who run mission-critical SAP applications on Azure.

RTO and RPO are two key metrics that organizations consider when developing a disaster recovery plan that can maintain business continuity after an unexpected event.

Recovery Point Objective (RPO) refers to the amount of data at risk, measured in time, whereas Recovery Time Objective (RTO) refers to the maximum tolerable time that a system can be down after a disaster occurs.
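As a simple illustration of how the two metrics differ, the sketch below computes the RPO achieved as the age of the last usable recovery point at the moment of the disaster, and the RTO achieved as the time taken to bring the system back online. The timestamps are hypothetical, illustrative values only.

    from datetime import datetime

    # Hypothetical timestamps for a single incident (illustrative values only)
    last_recovery_point = datetime(2020, 3, 15, 3, 0)   # last successful replication/backup
    disaster_time = datetime(2020, 3, 15, 4, 30)        # outage begins
    service_restored = datetime(2020, 3, 15, 8, 30)     # system available again at DR site

    rpo_achieved = disaster_time - last_recovery_point  # data written in this window is lost
    rto_achieved = service_restored - disaster_time     # duration the business was without the system

    print(f"RPO achieved: {rpo_achieved}")  # 1:30:00 -> up to 90 minutes of data at risk
    print(f"RTO achieved: {rto_achieved}")  # 4:00:00 -> 4 hours of downtime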

The diagram below shows RPO and RTO on a timeline in a business-as-usual (BAU) scenario.

Design principles for disaster recovery systems

  • Selection of DR region based on SAP-certified VMs for SAP HANA – it is important to verify the availability of SAP-certified VM types in the DR region.
  • RPO and RTO values – the business needs to lay out clear expectations for RPO and RTO, which greatly affect the disaster recovery architecture and the tools and automation required to implement it.
  • Cost of implementing DR, maintenance, and DR drills.
  • Criticality of systems – a trade-off can be established between the cost of DR implementation and business requirements. While the most critical systems can use a state-of-the-art DR architecture, medium and less critical systems may tolerate higher RPO/RTO values.
  • On-demand resizing of DR instances – it is preferable to use small VMs for DR instances and upsize them during an actual DR event (a resize sketch using the Azure SDK follows this list). It is also possible to reserve the required VM capacity in the DR region so that there is no waiting time to scale up.
  • Additional considerations include cloud infrastructure costs and the effort of setting up an environment for non-disruptive DR tests. A non-disruptive DR test is executed without failing over the actual productive systems to the DR systems, thereby avoiding business downtime. This involves additional cost for temporary infrastructure that runs in a completely isolated vNet for the duration of the test.
  • Certain components of the SAP system architecture, such as a clustered network file system (NFS), are not recommended for replication with Azure Site Recovery, so additional tools with license costs, such as SUSE geo-clustering or SIOS DataKeeper, are needed for NFS-layer DR.
  • Azure offers Azure Site Recovery (ASR), which replicates virtual machines across regions. This technology is used for the non-database components or layers of the system, while database-specific methods such as SAP HANA System Replication (HSR) are used at the database layer to ensure database consistency.
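As an illustration of the on-demand resizing principle above, the following minimal sketch upsizes a DR virtual machine with the Azure SDK for Python. It assumes the azure-identity and azure-mgmt-compute packages; the subscription, resource group, VM name, and target size are hypothetical placeholders, not values from this article.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.compute import ComputeManagementClient

    # Hypothetical identifiers - replace with your own subscription, group, VM, and size
    subscription_id = "<subscription-id>"
    resource_group = "rg-sap-dr"
    vm_name = "sap-hana-dr-vm"
    target_size = "Standard_M64s"  # an SAP HANA certified size reserved in the DR region

    compute = ComputeManagementClient(DefaultAzureCredential(), subscription_id)

    # Read the current VM definition, change only the size, and apply the update
    vm = compute.virtual_machines.get(resource_group, vm_name)
    vm.hardware_profile.vm_size = target_size
    poller = compute.virtual_machines.begin_create_or_update(resource_group, vm_name, vm)
    poller.result()  # blocks until the resize operation completes

Depending on the size change, Azure may need to deallocate and restart the VM as part of the resize, which should be factored into the RTO.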

Disaster recovery architecture for SAP systems running on SAP HANA Database

At a very high level, the diagram below depicts the architecture of SAP systems based on SAP HANA and shows which systems remain available in case of local or regional failures.

The diagram below gives the next level of detail on SAP HANA system components and the corresponding technology used to achieve disaster recovery.

Steps for invoking DR or a DR drill

Microsoft Azure Site Recovery (ASR) continuously replicates data to the DR region, which speeds up recovery.

Steps for Invoking DR or a DR drill:

  • DNS changes for VMs to use the new IP addresses (see the DNS sketch after this list)
  • Bring up the iSCSI single VM from ASR-replicated data
  • Recover databases and resize the VMs to the required capacity
  • Manually provision the NFS single VM using snapshot backups
  • Build application-layer VMs from ASR-replicated data
  • Perform cluster changes
  • Bring up applications
  • Validate applications
  • Release systems
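As a hedged illustration of the DNS step above, the sketch below repoints an A record in an Azure DNS zone to the IP address of the recovered VM using the Azure SDK for Python (azure-identity and azure-mgmt-dns). The zone, record name, and address are hypothetical; for private zones the azure-mgmt-privatedns package offers an equivalent operation.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.dns import DnsManagementClient

    # Hypothetical values - substitute your own subscription, zone, and DR address
    subscription_id = "<subscription-id>"
    resource_group = "rg-dns"
    zone_name = "sap.contoso.internal"
    record_name = "hanadb"          # resolves as hanadb.sap.contoso.internal
    dr_ip_address = "10.2.0.10"     # IP of the recovered database VM in the DR region

    dns = DnsManagementClient(DefaultAzureCredential(), subscription_id)

    # Overwrite the existing A record so SAP clients resolve to the DR VM
    dns.record_sets.create_or_update(
        resource_group,
        zone_name,
        record_name,
        "A",
        {"ttl": 300, "arecords": [{"ipv4_address": dr_ip_address}]},
    )

Keeping the TTL low on records that change during failover shortens the window in which clients still cache the old address.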

A screenshot of an example DR drill plan.

Resiliency/Reliability:

Azure keeps your applications up and running and your data available. Azure is the first cloud platform to provide a built-in backup and disaster recovery solution.

Resiliency is not about avoiding failures but responding to failures. The objective is to respond to failure in a way that avoids downtime and data loss. Business continuity and data protection are critical issues for today’s organizations, and business continuity is built on the foundation of resilient systems, applications, and data.

Reliability and resiliency are closely related. Reliability is defined as dependability and performing consistently well. Resiliency is defined as the capacity to recover quickly. Together, these two qualities are key to a trustworthy cloud service. Despite best efforts, disasters happen; they are inevitable but mostly unpredictable, and vary in type and magnitude. There is almost never a single root cause of a major issue. Instead, there are several contributing factors, which is the reason an issue is able to circumvent various layers of mitigation/defenses.

Building Azure Monitoring, Logging and Alerting Foundation for SAP application

Introduction

When migrating its SAP systems to Azure, Microsoft fine-tuned its capacity management processes, minimizing downtime, risk, and costs and improving employee efficiency. Optimizing on Azure allows us to design an SAP environment that is agile, efficient, and flexible enough to grow and change with the business.

Deciding to migrate your SAP systems to Azure is a big move, and taking the right steps can make the transition smooth and manageable. IoTCoast2Coast took a measured approach to moving its most sensitive data and confidential workloads along with SAP systems.

The right approach makes it possible to migrate mission-critical SAP systems to Azure, gaining maximum cost savings, scalability, and agility, without disrupting business operations. Our horizontal strategy meant moving low-risk environments like our sandboxes first, giving us experience with Azure migration without risking critical business functions in the process. Using a vertical strategy to move entire low-impact systems gave us experience with Azure production processes.

Prerequisite

To configure Azure AD integration with SAP Cloud Platform, you need the following items:

  1. Azure Subscription
  2. Basic Azure knowledge
  3. An Azure AD tenant
  4. SAP Cloud Platform Identity Authentication tenant
  5. A user account in SAP Cloud Platform Identity Authentication with Admin permissions.
  6. An Azure AD subscription. If you don't have an Azure AD environment, you can get a one-month trial here
  7. SAP Cloud Platform single sign-on enabled subscription

Definition

Throughout the document, these terms are used:

IaaS: Infrastructure as a service.

PaaS: Platform as a service.

SaaS: Software as a service.

Creating the best SAP environment with Azure

Azure is the preferred platform for SAP. As the top SAP-certified cloud provider, Azure can reliably run mission-critical SAP environments on a trusted cloud platform built for enterprises. Azure meets scalability, flexibility, and compliance needs.

Azure can run the most complete set of SAP applications across dev-test and production scenarios, fully supported. Azure is certified for more SAP solutions than any other cloud provider, including SAP HANA and S/4HANA, SAP Business Suite, SAP NetWeaver, and SAP Business One, to name a few.

Azure also carries a large number of benefits when hosting the SAP platform, including:

Creating a telemetry solution for SAP on Azure

The distributed nature of our business process environment led us to examine a broader solution: one that would provide comprehensive telemetry and monitoring not only for our SAP landscape but also for any other business processes that comprise the end-to-end business landscape at Microsoft. Our implementation was driven by the goals and drivers described below.

Goals and drivers

Microsoft developed a telemetry platform in Azure called the Unified Telemetry Platform (UTP). UTP is a modern, scalable, reliable, and cost-effective telemetry platform that is used in several business process monitoring scenarios at Microsoft, including our SAP-related business processes.

UTP is built to enable service maturity and business process monitoring across CSEO. It provides a common telemetry taxonomy and integration with core Microsoft data monitoring services. UTP enables compliance and the maintenance of business standards for data integrity and privacy. While UTP is the implementation we chose, there are numerous ways to enable telemetry on Azure.

Capturing telemetry with Azure Monitor

To enable business-driven monitoring and a user-centric approach, UTP captures as many of the critical events within the end-to-end process landscape as possible. Embracing comprehensive telemetry in our systems meant capturing data from all available endpoints to build an understanding of how each process flowed and which of the SAP components were involved. Azure Monitor and its related Azure services serve as the core for our solution.

Azure Application Insights

Application Insights provides an Azure-based solution with which we can dig deep into our Azure-hosted SAP landscape and pull out all necessary telemetry data. Using Application Insights, we can automatically generate alerts and support tickets when our telemetry indicates a potential error situation.

Azure Log Analytics

Infrastructure telemetry such as CPU usage, disk throughput and other performance-related data is collected from Azure infrastructure components in the SAP environment using Log Analytics.

Azure Data Explorer

UTP uses Azure Data Explorer as the central repository for all telemetry data sent through Application Insights and Azure Monitor Logs from our application and infrastructure environment. Azure Data Explorer provides enterprise big data interactive analytics; we use the Kusto query language to stitch together the end-to-end transaction flow for our business processes, for both SAP and non-SAP processes.

Azure Data Lake

UTP uses Azure Data Lake for long-term cold data storage. This data is taken out of the hot and warm streams and kept for reporting and archival purposes in Azure Data Lake to reduce the cost associated with storing large amounts of data in Azure Monitor.

Implementing UTP in SAP on Azure

The first step in enabling our telemetry platform was to create a reusable custom method and configuration table to drive consistent creation of the telemetry payloads. The configuration table defines the fixed structure of the payload according to the UTP standards.

The method then allows the calling application to pass an application-specific payload to populate the dynamic properties section of the telemetry events payload, and then adds SAP standard elements such as the event date and time, and system identifier. This method can then be called directly from any ABAP code, in either synchronous or asynchronous modes.

For example, in most business processes in our ERP, we use SAP business process events to trigger our telemetry events. The business process events share a custom check routine framework built using SAP Business Rule Framework plus; then custom receiver classes build the dynamic properties of the payload and call the shared telemetry class.

When each event in the workflow is processed in SAP, the JSON payload is passed to Application Insights using an external REST service call, which connects to the UTP framework. The following figure contains an example from our non-delivery order-to-cash process.
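Outside ABAP, an equivalent telemetry event can be sent to Application Insights in a few lines of Python. The minimal sketch below uses the applicationinsights client library; the instrumentation key, event name, and properties are hypothetical placeholders rather than the actual UTP payload structure.

    from applicationinsights import TelemetryClient

    # Hypothetical instrumentation key taken from the Application Insights resource
    tc = TelemetryClient("<instrumentation-key>")

    # Fixed UTP-style fields plus application-specific dynamic properties
    tc.track_event(
        "OrderToCash.DeliveryCreated",
        properties={
            "sapSystemId": "PRD",
            "eventTimestamp": "2020-03-15T04:30:00Z",
            "businessProcess": "non-delivery order-to-cash",
        },
    )
    tc.flush()  # push the buffered telemetry to the ingestion endpoint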

Azure Monitor

Azure Monitor maximizes the availability and performance of your applications and services by delivering a comprehensive solution for collecting, analyzing, and acting on telemetry from your cloud and on-premises environments.

Azure Monitor includes:

The diagram below gives a high-level view of Azure Monitor. At the center of the diagram are the data stores for metrics and logs, the two fundamental types of data used by Azure Monitor.

On the left are the sources of monitoring data that populate these data stores. On the right are the different functions that Azure Monitor performs with this collected data such as analysis, alerting, and streaming to external systems.

Monitoring data platform:

All data collected by Azure Monitor fits into one of two fundamental types, metrics and logs. Metrics are numerical values that describe some aspect of a system at a particular point in time. They are lightweight and capable of supporting near real-time scenarios. Logs contain different kinds of data organized into records with different sets of properties for each type. Telemetry such as events and traces are stored as logs in addition to performance data so that it can all be combined for analysis.

Log data collected by Azure Monitor can be analyzed with queries to quickly retrieve, consolidate, and analyze collected data. You can create and test queries using Log Analytics in the Azure portal and then either directly analyze the data using these tools or save queries for use with visualizations or alert rules.

Azure Monitor uses a version of the Kusto query language used by Azure Data Explorer that is suitable for simple log queries but also includes advanced functionality such as aggregations, joins, and smart analytics. You can quickly learn the query language using multiple lessons.
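For example, a minimal sketch that runs a Kusto query against a Log Analytics workspace from Python using the azure-monitor-query and azure-identity packages might look like the following. The workspace ID is a hypothetical placeholder, and the Perf table is only populated once performance counters are being collected.

    from datetime import timedelta

    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import LogsQueryClient

    client = LogsQueryClient(DefaultAzureCredential())

    # Average CPU per computer over the last hour, bucketed into 5-minute intervals
    query = """
    Perf
    | where ObjectName == "Processor" and CounterName == "% Processor Time"
    | summarize AvgCPU = avg(CounterValue) by bin(TimeGenerated, 5m), Computer
    | order by TimeGenerated asc
    """

    response = client.query_workspace(
        workspace_id="<log-analytics-workspace-id>",  # hypothetical workspace GUID
        query=query,
        timespan=timedelta(hours=1),
    )

    for table in response.tables:
        for row in table.rows:
            print(row)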

Data collected by Azure Monitor

Azure Monitor can collect data from a variety of sources. You can think of monitoring data for your applications in tiers ranging from your application, any operating system and services it relies on, down to the platform itself. Azure Monitor collects data from each of the following tiers:

Azure Insights:

Monitoring data is only useful if it increases your visibility into the operation of your computing environment. Azure Monitor includes several features and tools that provide valuable insights into your applications and the other resources they depend on. Monitoring solutions and features such as Application Insights and Azure Monitor for containers provide deep insights into different aspects of your application and specific Azure services.

Application Insights

Application Insights monitors the availability, performance, and usage of your web applications whether they’re hosted in the cloud or on-premises. It leverages the powerful data analysis platform in Azure Monitor to provide you with deep insights into your application’s operations and diagnose errors without waiting for a user to report them. Application Insights includes connection points to a variety of development tools and integrates with Visual Studio to support your DevOps processes.

Azure Lighthouse

Azure Lighthouse offers service providers a single control plane to view and manage Azure across all their customers with higher automation, scale, and enhanced governance. With Azure Lighthouse, service providers can deliver managed services using comprehensive and robust management tooling built into the Azure platform. This offering can also benefit enterprise IT organizations managing resources across multiple tenants.

Benefits

Azure Lighthouse helps you to profitably and efficiently build and deliver managed services for your customers. The benefits include:

  • Management at scale: Customer engagement and life-cycle operations to manage customer resources are easier and more scalable.
  • Greater visibility and precision for customers: Customers whose resources you’re managing will have greater visibility into your actions and precise control over the scope they delegate for management, while your IP is preserved.
  • Comprehensive and unified platform tooling: Our tooling experience addresses key service provider scenarios, including multiple licensing models such as EA, CSP and pay-as-you-go. The new capabilities work with existing tools and APIs, licensing models, and partner programs such as the Cloud Solution Provider program (CSP). The Azure Lighthouse options you choose can be integrated into your existing workflows and applications, and you can track your impact on customer engagements by linking your partner ID.

There are no additional costs associated with using Azure Lighthouse to manage your customers’ Azure resources.

Capabilities

Azure Lighthouse includes multiple ways to help streamline customer engagement and management:

  • Azure delegated resource management: Manage your customers’ Azure resources securely from within your own tenant, without having to switch context and control planes. For more info, see Azure delegated resource management.
  • New Azure portal experiences: View cross-tenant info in the new My customers page in the Azure portal. A corresponding Service providers blade lets your customers view and manage service provider access. For more info, see View and manage customers and View and manage service providers.
  • Azure Resource Manager templates: Perform management tasks more easily, including onboarding customers for Azure delegated resource management. For more info, see our samples repo and Onboard a customer to Azure delegated resource management.
  • Managed Services offers in Azure Marketplace: Offer your services to customers through private or public offers, and have them automatically onboarded to Azure delegated resource management, as an alternate to onboarding using Azure Resource Manager templates. For more info, see Managed services offers in Azure Marketplace.
  • Azure managed applications: Package and ship applications that are easy for your customers to deploy and use in their own subscriptions. The application is deployed into a resource group that you access from your tenant, letting you manage the service as part of the overall Azure Lighthouse experience. For more info, see Azure managed applications overview.

Azure Monitor Logs

Azure Monitor stores log data in a Log Analytics workspace, an Azure resource that serves as a container where data is collected and aggregated, and as an administrative boundary. While you can deploy one or more workspaces in your Azure subscription, there are several considerations you should understand to ensure your initial deployment follows the guidelines for a cost-effective, manageable, and scalable deployment that meets your organization's needs.

Data in a workspace is organized into tables, each of which stores different kinds of data and has its own unique set of properties based on the resource generating the data. Most data sources will write to their own tables in a Log Analytics workspace.

A Log Analytics workspace provides:

  • A geographic location for data storage.
  • Data isolation by granting different users access rights following one of our recommended design strategies.
  • Scope for configuration of settings like pricing tier, retention, and data capping.

The points above give an overview of the design and migration considerations, access control, and the design implementations recommended for your IT enterprise.
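As a hedged sketch of how the settings above (location, pricing tier, retention) come together when provisioning a workspace with the Azure SDK for Python: the azure-mgmt-loganalytics and azure-identity packages are assumed, and the subscription, resource group, workspace name, and values are hypothetical.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.loganalytics import LogAnalyticsManagementClient

    # Hypothetical subscription, resource group, and workspace names
    subscription_id = "<subscription-id>"
    resource_group = "rg-monitoring"
    workspace_name = "la-sap-telemetry"

    client = LogAnalyticsManagementClient(DefaultAzureCredential(), subscription_id)

    # Location fixes where data is stored; sku and retention set the pricing tier and retention period
    poller = client.workspaces.begin_create_or_update(
        resource_group,
        workspace_name,
        {
            "location": "westeurope",
            "sku": {"name": "PerGB2018"},
            "retention_in_days": 30,
        },
    )
    workspace = poller.result()
    print(workspace.customer_id)  # the workspace ID used by agents and query clients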

Best practices  

We learned several important lessons with our UTP implementation for SAP on Azure. These lessons helped inform our progress of UTP development, and they’ve given us best practices to leverage in future projects, including:

  • Perform a proper inventory of internal processes. You must be aware of events within a process before you can capture them. Performing a complete and informed inventory of your business processes is critical to capturing the data required for end-to-end business-process monitoring.
  • Build for true end-to-end telemetry. Capture all events from all processes and gather telemetry appropriately. Data points from all parts of the business process—including external components—are critical to achieving true end-to-end telemetry.
  • Build for Azure-native SAP. Running SAP on Azure is easier, and instrumenting SAP processes becomes more efficient and effective, when SAP components are built for Azure.
  • Encourage data-usage models and standards across the organization. Data standards are critical for an accurate end-to-end view. If data is stored in different formats, or instrumented differently, in various parts of the business process, the end reporting results won't accurately represent the state of the business process.

Conclusion  

Microsoft is continually refining and improving business-process monitoring of SAP on Azure with UTP. It has enabled the enterprise to keep key business users informed of business process flow, provided a complete view of business process health to leadership, and helped our engineering teams create a more robust and efficient SAP environment. Telemetry and business-driven monitoring with UTP have transformed the visibility we have into our SAP on Azure environment, and the continuing journey toward deeper business insight and intelligence is making the entire business better.

Build Azure Domain And AD integration Foundation

Introduction

Azure manages and controls identity and user access to enterprise environments, data, and applications by federating user identities to Azure Active Directory and enabling multifactor authentication for more secure sign-in. Microsoft uses stringent identity management and access controls to limit data and systems access to those with a genuine business need (least-privileged). Supporting logs include Azure Active Directory reporting, Azure Key Vault logs, Azure Storage Analytics, and more, and logs from your Azure resources can be integrated with your on-premises security information and event management (SIEM) system.

Identity management is the process of authenticating and authorizing security principals, and it also involves controlling information about those principals (identities). Security principals may include services, applications, users, and groups. Microsoft identity and access management solutions help IT protect access to applications and resources across the corporate datacenter and into the cloud. Such protection enables additional levels of validation, such as Multi-Factor Authentication and Conditional Access policies, and monitoring suspicious activity through advanced security reporting, auditing, and alerting helps mitigate potential security issues.

Azure Active Directory Premium provides single sign-on (SSO) to thousands of cloud software as a service (SaaS) apps and access to web apps that you run on-premises. By taking advantage of the security benefits of Azure Active Directory (Azure AD), you can:

  • Create and manage a single identity for each user across your hybrid enterprise, keeping users, groups, and devices in sync.
  • Provide SSO access to your applications, including thousands of pre-integrated SaaS apps.
  • Enable application access security by enforcing rules-based Multi-Factor Authentication for both on-premises and cloud applications.
  • Provision secure remote access to on-premises web applications through Azure AD Application Proxy.

Azure Active Directory integration with SAP Cloud Platform

 Let’s integrate SAP Cloud Platform with Azure Active Directory (Azure AD). Integrating SAP Cloud Platform with Azure AD provides you with the following benefits:

  • You can control in Azure AD who has access to SAP Cloud Platform.
  • You can enable your users to be automatically signed-in to SAP Cloud Platform (Single Sign-On) with their Azure AD accounts.
  • You can manage your accounts in one central location – the Azure portal.

Prerequisite

To configure Azure AD integration with SAP Cloud Platform, you need the following items:

  1. Azure Subscription
  2. Basic Azure knowledge
  3. An Azure AD tenant
  4. A SAP Cloud Platform Identity Authentication tenant
  5. A user account in SAP Cloud Platform Identity Authentication with Admin permissions.
  6. An Azure AD subscription. If you don't have an Azure AD environment, you can get a one-month trial here
  7. SAP Cloud Platform single sign-on enabled subscription

Definition

Throughout the document, these terms are used:

  • IaaS: Infrastructure as a service.
  • PaaS: Platform as a service.
  • SaaS: Software as a service.

Abstract

This response document helps address standard Requests for Information (RFI) that IoTCoast2Coast uses to help customers evaluate different offerings in the marketplace today. Through the mappings available in the CCM, we can illustrate how Azure has implemented security and privacy controls aligned to other international standards such as ISO/IEC 27001, US Government frameworks including FedRAMP, and industry certifications such as PCI DSS.

Complexity

A cloud-specific controls framework such as the Cloud Control Matrix (CCM) reduces the risk of an organization failing to consider important factors when selecting a cloud provider. The risk is further mitigated by relying on the cumulative knowledge of the industry experts who created the framework and taking advantage of the review effort already invested across many offerings.

Comparison

For organizations that do not have detailed knowledge about the different ways that cloud providers can develop or configure their offerings, reviewing a fully developed framework can provide insight into how to compare similar offerings and distinguish between providers. A framework can also help determine whether a specific service offering meets or exceeds compliance requirements and/or relevant standards.

Authorize access to Azure AD web applications using the OAuth 2.0 code grant flow

Azure Active Directory (Azure AD) uses OAuth 2.0 to enable you to authorize access to web applications and web APIs in your Azure AD tenant.

Register your application with your AD tenant

First, register your application with your Azure Active Directory (Azure AD) tenant. This will give you an Application ID for your application, as well as enable it to receive tokens.

  • Sign in to the Azure portal.
  • Choose your Azure AD tenant by selecting your account in the top right corner of the page, followed by selecting the Switch Directory navigation and then selecting the appropriate tenant.
  • Skip this step if you only have one Azure AD tenant under your account, or if you’ve already selected the appropriate Azure AD tenant.
  • In the Azure portal, search for and select Azure Active Directory.
  • In the Azure Active Directory left menu, select App Registrations, and then select New registration.
  • Follow the prompts and create a new application. It doesn’t matter if it is a web application or a public client (mobile & desktop) application for this tutorial, but if you’d like specific examples for web applications or public client applications, check out our quickstarts.
    • Name is the application name and describes your application to end users.
    • Under Supported account types, select Accounts in any organizational directory and personal Microsoft accounts.
    • Provide the Redirect URI. For web applications, this is the base URL of your app where users can sign in. For example, http://localhost:12345. For public client (mobile & desktop), Azure AD uses it to return token responses. Enter a value specific to your application. For example, http://MyFirstAADApp.
  • Once you’ve completed registration, Azure AD will assign your application a unique client identifier (the Application ID). You need this value in the next sections, so copy it from the application page.
  • To find your application in the Azure portal, select App registrations, and then select View all applications.

OAuth 2.0 authorization flow

At a high level, the authorization flow redirects the user to the Azure AD authorization endpoint, returns an authorization code to your redirect URI, and then exchanges that code for an access token and refresh token.
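A minimal sketch of the two application-side steps using the MSAL library for Python follows; the client ID, secret, tenant, redirect URI, and scope are hypothetical placeholders, and a real web app would run these calls inside its request handlers.

    import msal

    # Hypothetical registration values taken from the app registration above
    client_id = "<application-id>"
    client_secret = "<client-secret>"
    authority = "https://login.microsoftonline.com/<tenant-id>"
    redirect_uri = "http://localhost:12345"
    scopes = ["User.Read"]

    app = msal.ConfidentialClientApplication(
        client_id, authority=authority, client_credential=client_secret
    )

    # Step 1: build the URL the user's browser is sent to for sign-in and consent
    auth_url = app.get_authorization_request_url(scopes, redirect_uri=redirect_uri)
    print("Send the user to:", auth_url)

    # Step 2: after Azure AD redirects back with ?code=..., exchange the code for tokens
    auth_code = "<authorization-code-from-redirect>"
    result = app.acquire_token_by_authorization_code(
        auth_code, scopes=scopes, redirect_uri=redirect_uri
    )

    if "access_token" in result:
        print("Access token acquired; expires in", result["expires_in"], "seconds")
    else:
        print("Token request failed:", result.get("error_description"))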


Azure Advanced Threat Protection (ATP)

 Azure Advanced Threat Protection (ATP) is a cloud-based security solution that leverages your on-premises Active Directory signals to identify, detect, and investigate advanced threats, compromised identities, and malicious insider actions directed at your organization. Azure ATP enables SecOp (Security Operation) analysts and security professionals to detect advanced attacks in hybrid environments in the following ways,

  • Monitors users, entity behavior, and activities with learning-based analytics.
  • Protects user identities and credentials stored in Active Directory.
  • Identifies and investigates suspicious user activities and advanced attacks throughout all phases of a cyberattack.
  • Provides clear incident information on a simple timeline for fast triage

Enterprise cloud directory

Azure Active Directory is a comprehensive identity and access management solution in the cloud. It combines core directory services, advanced identity governance, security, and application access management. Azure AD makes it easy for your developers to build policy-based identity management into your organization's applications. Azure AD Premium editions include additional features to meet the advanced identity and access needs of enterprise organizations, such as:

  • The ability for someone to sign in to thousands of applications, including on-premises business applications as well as cloud-based and consumer apps.
  • Multifactor authentication.
  • Conditional access based on group and location, or device state.
  • Azure IoT device-level authentication.
  • Access monitoring and logging.
  • Cloud App Discovery.
  • Self-Service Password Reset (SSPR).

Azure AD enables a single identity management capability across on-premises, cloud, and mobile solutions. 


The Azure AD Premium P2 edition offers three important features:

  • Azure AD Identity Protection leverages the anomaly detection of Azure AD to detect anomalies in real time. It uses adaptive machine-learning algorithms and heuristics to detect indications that an identity has been compromised. With Azure AD Identity Protection, you can detect potential vulnerabilities affecting your organization’s identities, configure automated responses to detected suspicious actions that are related to your organization’s identities, investigate suspicious incidents, and take appropriate action to resolve them.
  • Azure AD Privileged Identity Management helps you manage, control, and monitor access within your organization, by identifying Azure AD administrators, enabling just-in-time administrative access to online services, and providing reports and alerts about administrative access.
  • Access reviews provide governance of identities to ensure users and administrators have the correct access to apps and resources over time. Access reviews enable IT organizations to review access to groups or resources and confirm they still need access to perform their tasks.

Multifactor authentication

The use of multiple authentication factors reduces the risk of unauthorized user access, such as through phishing attacks. Azure MFA works for both on-premises and cloud applications, and across both in a hybrid configuration, helping to safeguard access to data and applications. It delivers strong authentication through a range of easy verification options (phone call, text message, or mobile app notification), allowing users to choose the method they prefer.

Conditional access

Users can access your organization's resources by using a variety of devices and apps from anywhere, so focusing only on who can access a resource is no longer sufficient. You also need to make sure that these devices meet your standards for security and compliance. With Azure AD conditional access, you can make automated access-control decisions for accessing your cloud apps based on conditions such as device state, location, client application, and sign-in risk.

Azure IoT device-level authentication

Authentication applies to devices as well as users, especially in today's Internet of Things (IoT). Azure IoT supports X.509 certificates for enhanced authentication at the device level. Device identity can be transmitted safely and securely from the edge to the cloud. You can use the IoT Hub device identity registry to configure per-device security credentials and access control using tokens. Azure IoT Hub grants access to endpoints by verifying a token against the shared access policies and identity registry security credentials. Security credentials, such as symmetric keys, are never sent over the wire.
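As an illustration of the token-based device authentication described above, here is a minimal sketch of generating an IoT Hub shared access signature for a single device from its symmetric key, using only the Python standard library. The hub name, device ID, and key are hypothetical; note that only the derived signature, never the key itself, goes over the wire.

    import base64
    import hashlib
    import hmac
    import time
    import urllib.parse

    # Hypothetical hub, device, and key - replace with values from your identity registry
    hub_name = "myhub"
    device_id = "hana-edge-sensor-01"
    device_key = base64.b64encode(b"hypothetical-demo-key").decode()

    def generate_sas_token(resource_uri: str, key: str, ttl_seconds: int = 3600) -> str:
        """Build a SharedAccessSignature for the given resource using HMAC-SHA256."""
        expiry = int(time.time()) + ttl_seconds
        encoded_uri = urllib.parse.quote_plus(resource_uri)
        string_to_sign = f"{encoded_uri}\n{expiry}".encode("utf-8")
        signature = hmac.new(base64.b64decode(key), string_to_sign, hashlib.sha256).digest()
        encoded_sig = urllib.parse.quote_plus(base64.b64encode(signature))
        return f"SharedAccessSignature sr={encoded_uri}&sig={encoded_sig}&se={expiry}"

    resource = f"{hub_name}.azure-devices.net/devices/{device_id}"
    print(generate_sas_token(resource, device_key))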

Architecture diagrams for AAD

 The following diagrams outline the high-level architecture components required for each authentication method you can use with your Azure AD hybrid identity solution. They provide an overview to help you compare the differences between the solutions.

  • Simplicity of a password hash synchronization solution
  • Agent requirements of pass-through authentication, using two agents for redundancy
  • Components required for federation in your perimeter and internal network of your organization

Conclusion

This article outlines various authentication options that organizations can configure and deploy to support access to cloud apps. To meet various business, security, and technical requirements, organizations can choose between password hash synchronization, Pass-through Authentication, and federation. For each authentication method, consider whether the effort to deploy the solution and the user's experience of the sign-in process address your business requirements, and evaluate whether your organization needs the advanced scenarios and business continuity features of that method. Finally, IoTCoast2Coast evaluates each authentication method against business requirements and is committed to implementing the best solution, as discussed above.

Azure deployment and Disaster Recovery for SAP workload

Introduction

Microsoft Azure provides a trusted path to enterprise-ready innovation with SAP solutions in the cloud. Mission-critical applications such as SAP run reliably on Azure, an enterprise-proven platform offering hyperscale, high availability, agility, and cost savings for running a customer's SAP landscape. As an organization, you need to adopt a business continuity and disaster recovery (BCDR) strategy that keeps your data safe, and your apps and workloads up and running, when planned and unplanned outages occur.

IoTCoast2Coast helps customers build their SAP on Azure landscapes, and very often we discuss the easiest way to get started. We always recommend improving the disaster recovery process by implementing Azure Site Recovery, Microsoft's cloud service that replicates on-premises servers and creates a recovery plan to provision resources in the cloud in case of an unexpected event. During normal operations the client pays only for the storage; target virtual machines are created during the failover process. A reliable and inexpensive disaster recovery solution is today's reality.

Azure Site Recovery protects your workload by replicating its disks. The process is compatible with SAP NetWeaver products and supported by Microsoft. Azure Site Recovery performs the replication; however, for the database it cannot ensure consistency between the data and log areas. The recommended solution is to set up system replication between the on-premises and cloud instances. It ensures the lowest RPO and RTO, but it requires a constantly running server in the cloud environment.

As a best practice for protecting the HANA instance, we present an alternative solution based on automatically shipping data backups to Azure Blob storage. It requires some extra actions, but the total cost of the solution is much lower: the client only pays for the space being used.
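As a hedged sketch of the backup-shipping approach just described, the snippet below copies completed HANA backup files from the backup directory to an Azure Blob storage container using the azure-storage-blob package. The connection string, container name, and paths are hypothetical, and in practice the script would be scheduled (for example via cron) to run after each backup completes.

    import os

    from azure.storage.blob import BlobServiceClient

    # Hypothetical values - use your own storage account connection string and container
    connection_string = "<storage-account-connection-string>"
    container_name = "hana-backups"
    backup_directory = "/hana/backup/data/PRD"  # where HANA writes completed data backups

    service = BlobServiceClient.from_connection_string(connection_string)
    container = service.get_container_client(container_name)

    # Upload every backup file that is not yet present in the container
    existing = {blob.name for blob in container.list_blobs()}
    for file_name in sorted(os.listdir(backup_directory)):
        if file_name in existing:
            continue
        path = os.path.join(backup_directory, file_name)
        with open(path, "rb") as data:
            container.upload_blob(name=file_name, data=data)
            print(f"Shipped {file_name} to {container_name}")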

Setup And Configure SAP Backups And Disaster Recovery

Azure VM DBMS deployment for SAP workload

Azure has two different deployment models we can use to create and work with resources:

  • Azure Resource Manager
  • Classic

IoTCoast2Coast recommends the Resource Manager deployment model for new deployments instead of the classic deployment model.
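With the Resource Manager model, every resource lives in a resource group and is managed through the ARM control plane. The following minimal sketch creates such a group with the Azure SDK for Python (azure-identity and azure-mgmt-resource); the subscription, group name, location, and tags are hypothetical placeholders.

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.resource import ResourceManagementClient

    # Hypothetical subscription and resource group for the SAP DBMS landscape
    subscription_id = "<subscription-id>"

    resources = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

    group = resources.resource_groups.create_or_update(
        "rg-sap-hana-prd",
        {
            "location": "westeurope",
            "tags": {"workload": "sap-hana", "environment": "production"},
        },
    )
    print(f"Resource group {group.name} is {group.properties.provisioning_state}")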


Prerequisite

  1. Azure Subscription
  2. Basic Azure knowledge
  3. SAP knowledge
  4. Administrator Access
  5. PowerShell (Good to have)
  6. Understanding of SAP HANA administration

Definition

Throughout the document, these terms are used:

  • IaaS: Infrastructure as a service.
  • PaaS: Platform as a service.
  • SaaS: Software as a service.

SAP Azure Disaster Recovery Solution (Scenario)

Let's implement an Azure disaster recovery solution. It's important to establish a rough vision of the end state before taking the first step; it's not the company's starting point, but it shows the potential destination.

Assumptions / Challenge

In the previous environment, SAP was running on-premises with no DR capability beyond the on-premises data center. This exposed the company to business risks:

  • Complete production shutdown on a global level if the SAP instance in the headquarters goes down.
  • Loss of data/other transactional information in case of data corruption.
  • Long business recovery times leading to production losses globally.
  • High risk of both infrastructure failures (servers, network, storage, backup) and application issues leading to unplanned business downtime.

The Solution

Let's set up a recovery strategy for SAP using cloud services on Microsoft Azure. This is intended to give the client an RTO (Recovery Time Objective) of 4 hours and an RPO (Recovery Point Objective) of 2 hours. The same setup can be configured as an on-demand DR solution, allowing the customer to pay for DR services on an as-needed basis.

Highlights of the Solution


The Key Benefit

Through IoTCoast2Coast's implementation of architectural best practices and the business logic on Azure, the customer realized the following benefits:

  • Higher Agility
  • Reduced RPO/RTO for DR
  • On demand provisioning of non-production environments from production backup on need basis
  • Consumption based pricing for Disaster Recovery
  • Improved Monitoring
  • Dynamic Scaling


Considerations for Azure VM DBMS deployment for SAP workload

Let's cover the generic deployment aspects of SAP-related DBMS systems on Microsoft Azure virtual machines (VMs), using the Azure infrastructure as a service (IaaS) capabilities. This complements the SAP installation documentation and SAP Notes, which remain the primary resources for installations and deployments of SAP software on a given platform. Considerations for running SAP-related DBMS systems in Azure VMs are introduced here, with few references to specific DBMS systems; the specific DBMS systems are handled separately. The following terms are used:

SAP component: An individual SAP application such as ERP Central Component (ECC), Business Warehouse (BW), Solution Manager, or Enterprise Portal (EP). SAP components can be based on traditional ABAP or Java technologies or on a non-NetWeaver-based application such as Business Objects.

SAP environment: One or more SAP components logically grouped to perform a business function such as development, quality assurance, training, disaster recovery, or production.

SAP landscape: The entire set of SAP assets in a customer's IT landscape. The SAP landscape includes all production and non-production environments.

SAP system: The combination of a DBMS layer and an application layer of, for example, an SAP ERP development system, an SAP Business Warehouse test system, or an SAP CRM production system. In Azure deployments, dividing these two layers between on-premises and Azure isn't supported; an SAP system is either deployed on-premises or in Azure. You can, however, deploy the different systems of an SAP landscape in Azure or on-premises. For example, you could deploy the SAP CRM development and test systems in Azure but deploy the SAP CRM production system on-premises.

Cross-premises: Describes a scenario where VMs are deployed to an Azure subscription that has site-to-site, multisite, or Azure ExpressRoute connectivity between the on-premises data centers and Azure. In common Azure documentation, these kinds of deployments are also described as cross-premises scenarios.

Azure Site Recovery for SAP

Site Recovery provides a disaster recovery solution for on-premises machines and for Azure VMs. You replicate machines from a primary location to a secondary location. When disaster strikes, you fail machines over to the secondary location and access them from there. When everything is up and running normally again, you fail machines back to the primary site.

Azure Recovery Services contribute to your BCDR strategy

Site Recovery service: Site Recovery helps ensure business continuity by keeping business apps and workloads running during outages. It replicates workloads running on physical and virtual machines (VMs) from a primary site to a secondary location. When an outage occurs at your primary site, you fail over to the secondary location and access apps from there. After the primary location is running again, you can fail back to it. Site Recovery can manage replication for:

  • Azure VMs replicating between Azure regions.
  • On-premises VMs, Azure Stack VMs, and physical servers.

Backup service: The Azure Backup service keeps your data safe and recoverable by backing it up to Azure.

Resiliency/Reliability

Azure keeps your applications up and running and your data available. Azure is the first cloud platform to provide a built-in backup and disaster recovery solution.

Resiliency is not about avoiding failures but responding to failures. The objective is to respond to failure in a way that avoids downtime and data loss. Business continuity and data protection are critical issues for today's organizations, and business continuity is built on the foundation of resilient systems, applications, and data.

Reliability and resiliency are closely related. Reliability is defined as dependability and performing consistently well. Resiliency is defined as the capacity to recover quickly. Together, these two qualities are key to a trustworthy cloud service. Despite best efforts, disasters happen; they are inevitable but mostly unpredictable, and vary in type and magnitude. There is almost never a single root cause of a major issue. Instead, there are several contributing factors, which is the reason an issue is able to circumvent various layers of mitigations/defenses.

Disaster Recovery

A disaster recovery strategy is key to business continuity. Site recovery and data backup are elements of a disaster recovery plan. Organizations using the cloud tend to take the reliability of the public cloud for granted, not recognizing that they may be responsible for choosing and implementing backup and recovery mechanisms. As a cloud customer, you will confront more opportunities to spend extra time and money on optional backup than you can ever take advantage of, so you need to make explicit and careful choices about what you will and will not do. Your disaster recovery plan should:

  1. Identify and classify the threats and risks that may lead to disasters.
  2. Define the resources and processes that ensure business continuity during the disaster.
  3. Define the reconstitution mechanism to get the business back to normal from the disaster recovery state, after the effects of the disaster are mitigated.

An effective disaster recovery plan plays its role in all stages of operations and it is continuously improved by disaster recovery mock drills and feedback capture processes. Disaster recovery happens in the following sequential phases,

  1. Activation Phase
    In this phase, the disaster effects are assessed and announced.
  2. Execution Phase
    In this phase, the actual procedures to recover each of the disaster-affected Azure services are executed. Business operations are restored into the Azure paired region.
  3. Reconstitution Phase
    In this phase the original Azure region hosted system/service is restored, and execution phase procedures are stopped.