AZ-120 Microsoft Azure SAP – Migrate SAP Workloads to Azure
1. Introduction
Hello and welcome back to the Microsoft Azure for SAP Workloads course. In this section, we will be looking at how we can migrate existing SAP workloads to Azure. So let’s get started. I’m your host, Ricolai Capitalisco, founder of Reteam Labs. This course is brought to you in partnership with Sam Kanjar, senior Azure cloud solution architect at Microsoft. There are six sections in total in this course, covering all of the AZ-120 exam objectives. The course focuses on the Azure and SAP related topics; any fundamental or prior knowledge will be assumed and referenced where appropriate. This section, the fourth of the course, covers the migration steps and the necessary considerations when running an inventory of an environment in scope of a migration.
We will also be looking at how that translates to Azure for both SAP NetWeaver and SAP HANA by looking at the different strategies for the migration. So let’s proceed. This section builds on previous sections of this course, so please make sure you understand the previous ones before continuing with this one. During this section, we will be looking at migrating an existing SAP landscape to Azure, what you need to look for, and how to plan your migration approach. Every SAP system is different, so understanding how the SAP landscape and scope was designed by running an inventory will help you plan your migration route.
2. SAP Sizing
The most important part of any migration is understanding what you are planning to migrate and accounting for dependencies, limitations, or even blockers that might stop your migration in its tracks. Following an appropriate inventory process will ensure that your migration completes successfully. We recommend initially leveraging in-house tools to understand the current SAP landscape that is in scope of your migration, or perhaps looking at your ServiceNow or CMDB catalog, which might reveal some of the data that describes your SAP systems, then taking that information to start drawing out your sizing in Azure. You should look at SAP Note 1928533 for even more details.
It’s very important to ensure that we have a record of the current environment configuration, such as which servers are present, their names and numbers, what roles they’re running, and data about CPU and memory. It is also important to pick up their disk sizes, disk configuration, and throughput to ensure that you design a system that performs at least as well in Azure. It’s also essential to understand database replication and throughput requirements. Alongside replication, please make sure to have a detailed inventory of OS, database, kernel, and SAP support pack versions. SAP support for a given configuration in on-premises scenarios does not imply that the same configuration is supported on Azure VMs. Depending on the outcome, you might have to upgrade some of the software components. For details regarding supported configurations, refer to the following SAP notes: 1928533 for SAP sizing and the supported Azure VM SKUs, 2039619 for Oracle support on Azure, and 2235581 for the support matrix of SAP HANA on different OS releases.
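To make this more concrete, here is a minimal sketch, assuming a hypothetical inventory and an illustrative shortlist of Azure VM sizes, of how the recorded CPU and memory figures could be mapped to a candidate SKU. The server names, roles, and shortlist below are made up for the example; always validate the final choice against SAP Note 1928533.

from dataclasses import dataclass

@dataclass
class SapServer:
    name: str                  # hostname from the on-premises inventory
    role: str                  # e.g. "ASCS", "app server", "HANA DB"
    vcpus: int
    memory_gib: int
    disk_throughput_mbps: int

# Hypothetical shortlist of Azure VM sizes (name, vCPUs, memory GiB).
# Confirm the currently certified list against SAP Note 1928533.
CANDIDATE_SKUS = [
    ("E16ds_v5", 16, 128),
    ("E32ds_v5", 32, 256),
    ("M64s",     64, 1024),
    ("M128s",   128, 2048),
]

def smallest_fit(server: SapServer) -> str:
    """Return the first candidate SKU that covers CPU and memory needs."""
    for sku, vcpus, mem in CANDIDATE_SKUS:
        if vcpus >= server.vcpus and mem >= server.memory_gib:
            return sku
    return "no candidate in shortlist - size manually"

inventory = [
    SapServer("sapapp01", "app server", 16, 96, 250),
    SapServer("saphdb01", "HANA DB", 72, 1536, 400),
]

for srv in inventory:
    print(f"{srv.name} ({srv.role}): {smallest_fit(srv)}")

This only covers the CPU and memory dimensions; disk configuration and throughput still need to be checked separately against the chosen storage design.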
Please note that all of the SAP notes above are absolutely critical when sizing your VMs correctly, so please do take the time to go through them when performing a migration. Sizing for HANA Large Instances is no different from sizing for HANA in general. For existing and deployed systems that you want to move from another RDBMS to HANA, SAP provides a number of reports that run on your existing SAP systems. If the database is moved to HANA, these reports check the data and calculate memory requirements for the HANA instance.
For more information on how to run these reports, and to obtain their most recent patches or versions, make sure to read the following SAP notes: 1793345 (Sizing for SAP Suite on HANA), 1872170 (Suite on HANA and S/4HANA sizing report), and 1736976 (Sizing Report for BW on HANA). Please note that memory requirements increase as the data volume grows, so understanding the current memory consumption will put you on the right track for predicting what you need for Azure and for the future.
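As a simple illustration of that last point, the following sketch projects a HANA memory requirement forward using a compound annual growth assumption plus a safety margin. The growth rate and headroom figures are assumptions for the example, not SAP guidance; the starting value should come from the sizing reports above.

def projected_hana_memory_gib(current_gib: float,
                              annual_growth_rate: float,
                              years: int,
                              headroom: float = 0.2) -> float:
    """Project HANA memory demand with compound data growth plus headroom.

    current_gib        memory requirement reported by the SAP sizing report
    annual_growth_rate e.g. 0.15 for 15 % data growth per year
    headroom           safety margin on top of the projected figure
    """
    projected = current_gib * (1 + annual_growth_rate) ** years
    return projected * (1 + headroom)

# Example: 1228 GiB today, 15 % yearly growth, sized for 3 years ahead (~2241 GiB).
print(round(projected_hana_memory_gib(1228, 0.15, 3)))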
3. Network Inventory
From a networking perspective, you also need to understand what your SAP landscape looks like on-premises in order to translate it accordingly to the cloud. Please note that this isn’t a one-to-one translation of networking and configuration, but it does give you a clear view of how your systems interact together. This will give you an even better approach to segmenting your Azure networking and better ways to design your security controls, such as NSGs, to control your traffic flow. The aim is to ensure no lateral movement is allowed between systems in case they become compromised. One other important piece is the interconnectivity to on-premises.
An SAP system interacts with many other applications, and it sits at the core of a lot of businesses. Hence, as we run through our audit, we need to understand how it is interacting with other systems in order to ensure that communication is still available after migration. Of course, it is recommended to use ExpressRoute for hybrid connectivity to ensure enough throughput is assigned to the service, along with the SLA that Microsoft provides for ExpressRoute. Like any other core service, SAP relies heavily on Active Directory and DNS. It’s the backbone that pulls everything together and allows people and services to authenticate and access services across the SAP landscape. DNS is also important because it ensures that services can resolve each other.
So, in light of what we have just said, both Active Directory and DNS should be extended to Azure to minimize response times and to ensure SAP functions in an optimal way without relying on on-premises services. Another important process in planning your SAP landscape on Azure, especially if you want to make sure connectivity is not impacted through ExpressRoute, is the appropriate design of your IP addressing. First, you need to set aside a /29 IP range for your ExpressRoute connection, and possibly a /16 for Azure and/or a /24 CIDR block for the server IP pool. In the case of HLI, please do bear in mind that the first 30 addresses will be reserved for internal HANA usage. Also for HLI, please make sure you opt for ExpressRoute Global Reach. Please make a note of the subscription ID used to assign the HLI, the region of deployment, the server IP address range, the fully qualified host names, and the SAP HANA SID, which will be used for the NFS volumes, one for each HANA instance.
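To show what such an addressing plan might look like, here is a small sketch using Python’s standard ipaddress module. The ranges are hypothetical placeholders; the point is to reserve the /29, carve subnets for the SAP tiers out of the /16, verify nothing overlaps, and skip the first 30 addresses of the HLI server pool.

import ipaddress

# Hypothetical address plan - substitute your own, non-overlapping ranges.
azure_space = ipaddress.ip_network("10.20.0.0/16")      # overall Azure range
er_gateway  = ipaddress.ip_network("10.20.255.248/29")  # ExpressRoute /29
server_pool = ipaddress.ip_network("10.20.1.0/24")      # HLI server IP pool

# Carve application subnets for the SAP tiers out of the /16.
subnets = list(azure_space.subnets(new_prefix=24))
plan = {
    "sap-app-subnet": subnets[10],
    "sap-db-subnet":  subnets[11],
    "mgmt-subnet":    subnets[12],
}

# Sanity checks: ranges must sit inside the /16 and must not collide.
assert er_gateway.subnet_of(azure_space) and server_pool.subnet_of(azure_space)
assert not er_gateway.overlaps(server_pool)

# For HANA Large Instances the first 30 addresses of the server pool
# are reserved for internal HANA usage, so start allocating after them.
usable = list(server_pool.hosts())[30:]
print("first assignable HLI address:", usable[0])
for name, net in plan.items():
    print(name, net)

The NSG rules that enforce the segmentation between these subnets would then be defined per subnet, in line with the no-lateral-movement goal described earlier.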
4. HLI Validation
For Type II units, only the SLES 12 SP2 OS version is currently supported. The swap space of the delivered OS image is set to 2 GB according to SAP support note 1999997. Every HANA Large Instance unit comes with two or three IP addresses that are assigned to two or three NIC ports. Three IP addresses are used in HANA scale-out configurations and in the HANA system replication scenario. For deployment cases of HANA system replication or HANA scale-out, a blade configuration with only two IP addresses assigned is not suitable. If you have only two IP addresses assigned and want to deploy such a configuration, contact SAP HANA on Azure Service Management to get a third IP address in a third VLAN assigned.
The storage layout for SAP HANA on Azure Large Instances is configured by SAP HANA on Azure Service Management according to SAP recommended guidelines. The /hana/shared and /usr/sap directories share the same volume. The naming convention of the mount points includes the system ID of the HANA instance as well as the mount number. In scale-up deployments there is only one mount, such as mnt00001; in scale-out deployments, on the other hand, you see as many mounts as you have worker and master nodes. For scale-out environments, data, log, and log backup volumes are shared and attached to each node in the scale-out configuration. For configurations with multiple SAP HANA instances, a different set of volumes is created and attached to the HANA Large Instance unit. When you look at a HANA Large Instance unit, you will realize that the units come with a generous disk volume for HANA data, and that there is a volume for HANA log backups. The reason the HANA data volume is so large is that the storage snapshots offered to customers use the same disk volume.
The more storage snapshots you perform, the more space is consumed by snapshots in your assigned storage volumes. The HANA log backup volume is not supposed to be the volume for database backups; it is sized to be used as the backup volume for the HANA transaction log backups. For more information, please see the SAP HANA Large Instances high availability and disaster recovery documentation on Azure. In addition to the storage that is provided, you can purchase additional storage capacity in 1-TB increments. This additional storage can be added as new volumes to a HANA Large Instance.
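As a rough planning aid, the sketch below estimates how much extra space a snapshot retention schedule might consume. The change rate and retention figures are assumptions for the example, not an HLI sizing formula.

def snapshot_space_tb(database_size_tb: float,
                      daily_change_rate: float,
                      snapshots_retained: int) -> float:
    """Rough estimate of extra space consumed by storage snapshots.

    Each retained snapshot keeps roughly the blocks changed since the
    previous one, approximated here as daily_change_rate * database size.
    This is a planning approximation only.
    """
    return database_size_tb * daily_change_rate * snapshots_retained

# 4 TB database, 5 % daily change, 14 daily snapshots retained: about 2.8 TB extra.
print(snapshot_space_tb(4.0, 0.05, 14))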
SAP applications that are built on the SAP NetWeaver architecture are sensitive to time differences between the various components that make up the overall SAP system. For SAP HANA on Azure Large Instances, the time synchronization that’s done in Azure doesn’t apply to the compute units in the Large Instance stamps. This isn’t an issue for SAP applications running in native Azure VMs, because Azure ensures that a system’s time is properly synchronized. As a result, you must set up a separate time server that can be used by the SAP application servers that are running on Azure VMs and by the SAP HANA database instances that are running on HANA Large Instances. The storage infrastructure in Large Instance stamps is time-synchronized with NTP servers.
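If you want to verify that your SAP application servers and HANA Large Instance units are actually aligned with that dedicated time server, a quick check like the one below can help. It uses the third-party ntplib package, and the server name is a placeholder for whatever time source you stand up.

# pip install ntplib
import ntplib

# Placeholder: replace with the dedicated time server you set up for
# the SAP application servers and the HANA Large Instance units.
TIME_SERVER = "time.contoso.internal"

client = ntplib.NTPClient()
response = client.request(TIME_SERVER, version=3)

# offset is the difference between the local clock and the time server;
# SAP NetWeaver components are sensitive to drift between nodes.
print(f"clock offset vs {TIME_SERVER}: {response.offset:.3f} seconds")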
5. Network Security and OS Validation
For network security, we recommend that you consider implementing a perimeter network with a managed or hosted firewall in front of the Web Dispatcher subnet. For storage security, we recommend that you ensure that data is encrypted in transit and at rest. To encrypt Azure VM disks, you can use Azure Disk Encryption. This feature uses the BitLocker feature of Windows and dm-crypt for Linux to provide volume encryption for the operating system and the data disks. The solution also works with Azure Key Vault to help you control and manage the disk encryption keys and secrets. Data on the virtual machine disks is encrypted at rest in Azure Storage.
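For reference, here is a minimal sketch, assuming a hypothetical Key Vault name, of creating a key-encryption key in Azure Key Vault that Azure Disk Encryption can then use to wrap the disk encryption secrets. It uses the azure-identity and azure-keyvault-keys Python packages.

# pip install azure-identity azure-keyvault-keys
from azure.identity import DefaultAzureCredential
from azure.keyvault.keys import KeyClient

# Hypothetical vault URL - replace with your own Key Vault.
vault_url = "https://contoso-sap-kv.vault.azure.net"
client = KeyClient(vault_url=vault_url, credential=DefaultAzureCredential())

# Create (or rotate) an RSA key to use as the ADE key-encryption key.
kek = client.create_rsa_key("sap-ade-kek", size=2048)
print(kek.id)  # pass this key ID when enabling Azure Disk Encryption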
For SAP HANA data-at-rest encryption, we recommend using the SAP HANA native encryption technology. Important: please do not use HANA data-at-rest encryption together with Azure Disk Encryption on the same server; for HANA, use only HANA data encryption. A jump box is simply a secure virtual machine on the network which administrators use to connect to other virtual machines on the network. Azure Bastion is a service you deploy that lets you connect to a virtual machine using your browser and the Azure Portal.
The Azure Bastion service is a fully platform-managed PaaS service that you provision inside your virtual network. It provides secure and seamless RDP or SSH connectivity to your virtual machines directly from the Azure Portal over TLS. When you connect via Azure Bastion, your virtual machines do not need a public IP address, agent, or special client software. A traditional self-deployed jump box is another alternative if you require access to management tools such as SQL Server Management Studio or the SAP front end. Let’s further consolidate your just-in-time (JIT) access knowledge. When a user requests access to a VM, Security Center checks that the user has Azure RBAC permissions for that VM. If the request is approved, Security Center configures the NSGs and Azure Firewall to allow inbound traffic to the selected ports from the relevant IP address or range for the amount of time that was specified. After the time has expired, Security Center restores the NSGs to their previous states. Connections that are already established are not interrupted.
Let’s now spend some time looking at the supported deployment options for SAP on Azure. Azure VMs: there is a growing number of Azure VM SKUs certified for hosting SAP HANA. This includes GS5 and a number of M-family VM sizes, with M208ms_v2 featuring 5.7 terabytes of memory and M128s supporting scale-out configurations. There is also a much larger selection of Azure VM SKUs that support non-HANA workloads, both NetWeaver and non-NetWeaver products. SAP HANA on Azure Large Instances: there are several SKUs, ranging from 2 terabytes per node (S192, with scale-out support) to 20 terabytes per node (S960m), split across two different classes of hardware. SAP Cloud Appliance Library (CAL): SAP CAL lets you deploy preconfigured software appliances on different public clouds, including Azure.
The primary benefit of SAP CAL is that it provides an easy way to deploy and test preconfigured SAP solutions offered by SAP without having to provision the underlying infrastructure. SAP CAL on Azure includes support for SAP S/4HANA and BW/4HANA. SAP Note 1928533 provides an up-to-date listing of Azure VM SKUs supported for non-HANA RDBMS platforms serving SAP workloads. It’s also very important to ensure that the operating system is set up correctly by utilizing sapconf, a minimalistic tool to prepare your system for an SAP workload. It helps you calculate and set the Linux kernel parameters, and as long as you use sapconf, sysstat will be installed and started with it, as well as both the tuned and uuidd tools. You can adjust the sapconf settings by modifying the configuration file /etc/sysconfig/sapconf, and to alter the tuned configuration you need to modify the corresponding tuned.conf profile under /usr/lib/tuned.
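As a final illustrative check, the short script below reads a couple of kernel parameters that tools like sapconf typically manage and compares them against example target values. The parameter list and thresholds are an illustrative subset only; the authoritative values come from sapconf/saptune and the relevant SAP notes for your OS release.

from pathlib import Path

# A few kernel parameters commonly tuned for SAP workloads (illustrative
# subset - confirm the required values for your OS release and SAP notes).
EXPECTED = {
    "vm/max_map_count": 2147483647,
    "kernel/shmmni": 32768,
}

for param, expected in EXPECTED.items():
    current = int(Path("/proc/sys", param).read_text().split()[0])
    status = "OK" if current >= expected else "REVIEW"
    print(f"{param.replace('/', '.')}: current={current} "
          f"expected>={expected} [{status}]")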