AZ-120 Planning and Administering Microsoft Azure for SAP Workloads — Training Course Transcript
Design an Azure Solution to Support SAP Workloads
6. Security Recommendations
Security is another core pillar of any design. When we talk about security, this usually involves identity, network, and data security. We will start our discussion with role-based access control, or RBAC, which is managed through the Azure portal. RBAC is backed by Azure AD and works with cloud-only or synchronised identities; the latter is the case when you synchronise on-premises Active Directory to Azure AD using Azure AD Connect. RBAC ties those cloud or synchronised identities to your Azure tenant so you can give your IT personnel access to Azure for operational purposes.
RBAC governs your staff's access, so make sure you grant these privileges only where appropriate in order to tighten security around your resources. Also ensure appropriate monitoring and auditing are in place to enforce governance and compliance, alongside continual assessment of your RBAC group membership. Network security groups (NSGs) are a vital piece for securing your intra- and inter-network traffic. NSGs are stateful firewalls, which means they preserve session information. You can have a single NSG per subnet, and multiple subnets can share the same NSG. It is important to understand how traffic flows between each SAP component in order to design those NSGs appropriately. Closely related is the application security group, or ASG.
Think of application security groups as groupings of virtual machines that perform the same function, such as web servers, application servers, or backend database servers running a meaningful service. This way you can simplify your network security policies and make them part of your design thinking: instead of reasoning about network traffic in terms of IPs and subnets, you start thinking in terms of roles and target ports. NSGs and ASGs are both supported, but avoid placing network virtual appliances such as firewalls between the application and DBMS servers. Resource encryption is another powerful piece of the puzzle, bringing encryption in transit and at rest into the design thinking.
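To make that "roles and target ports instead of IPs" idea concrete, here is a minimal illustrative model of an ASG-style rule set. This is not the Azure SDK; the role names and ports (3200 for SAP GUI/dispatcher traffic, 30015 for a HANA instance-00 SQL port) are assumptions for the sketch.

```python
# Illustrative model (not the Azure SDK): express firewall intent as
# role -> role on a port, the way ASGs let you, instead of IP ranges.
from dataclasses import dataclass

@dataclass(frozen=True)
class RoleRule:
    source_role: str   # e.g. "app-servers"
    dest_role: str     # e.g. "db-servers"
    port: int
    allow: bool = True

# SAP-style tiers: web dispatchers -> app servers -> DBMS.
# Ports are illustrative (32NN dispatcher, 3NN15 HANA SQL, instance 00).
rules = [
    RoleRule("web-dispatchers", "app-servers", 3200),
    RoleRule("app-servers", "db-servers", 30015),
]

def is_allowed(src: str, dst: str, port: int) -> bool:
    """Default-deny: traffic passes only if a matching allow rule exists."""
    return any(r.source_role == src and r.dest_role == dst
               and r.port == port and r.allow for r in rules)
```

With this model, web dispatchers cannot reach the database tier directly, which mirrors the guidance in this lecture: every flow between SAP components must be an explicit design decision.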
The behaviour and responsiveness of your application differ in both scenarios. SAP recommends using encryption at rest, so for Azure Storage accounts we can use Storage Service Encryption, which uses either Microsoft-managed or customer-managed keys. Azure Storage accounts also provide encryption in transit with TLS/SSL over HTTPS. For certificate and key protection, it is recommended to use Azure Key Vault and to ensure the keys are backed up using Azure Backup; this protects your resources and the keys to your kingdom. Azure Disk Encryption (ADE) is recommended wherever possible. However, if you are already using a DBMS encryption method, it does not make sense to encrypt twice. So, on the SQL side of things, use ADE for the OS disk and the relevant DB encryption, such as TDE, for the database.
7. Compare and Contrast: VM vs. bare-metal
Going bare metal as opposed to virtual gives you dedicated hardware to run your system on, configured to TDI specifications and purpose-built for the most demanding SAP HANA production workloads.
However, irrespective of which way you go, bare metal or VM, there are things you need to think about. For certified Azure VMs, ensure that you use proximity placement groups, as discussed previously, together with accelerated networking. Accelerated networking enables single root I/O virtualization (SR-IOV) for your virtual machine, which improves networking performance by bypassing the host on the data path. With bare-metal servers, you have a dedicated network interface with no virtualization layer. Also, pay careful attention to your VM SKUs, as they dictate the number of NICs you can attach and the expected network performance in Mbps. The bandwidth limit is cumulative across all outbound traffic from the VM. Bearing this in mind, accelerated networking will help you reach the published limit, but it won't help you exceed it.
VM SKUs also dictate the total IOPS and throughput you can drive from that VM. For SAP systems, make sure you choose a SKU certified by SAP and Microsoft to ensure it delivers the performance metrics you're after, which brings me nicely to the SAP Quicksizer.
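A simple way to apply this advice is to check a candidate workload against every published cap of a SKU at once. The sketch below does exactly that; the `M128s` numbers are placeholders, not real published limits, so always confirm against the current Azure VM sizes documentation and the SAP-certified VM list.

```python
# Placeholder SKU caps -- NOT real published numbers; check the Azure VM
# sizes documentation and the SAP/Microsoft certified list before sizing.
SKU_LIMITS = {
    "M128s": {"max_nics": 8, "network_mbps": 30000,
              "max_disk_iops": 80000, "max_disk_mbps": 2000},
}

def fits(sku: str, nics: int, iops: int, mbps: int) -> bool:
    """True only if the workload stays within every cap of the SKU."""
    lim = SKU_LIMITS[sku]
    return (nics <= lim["max_nics"]
            and iops <= lim["max_disk_iops"]
            and mbps <= lim["max_disk_mbps"])
```

The point of checking all caps together is the one made above: a SKU that satisfies your IOPS need can still fail you on NIC count or network bandwidth.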
If you don't yet have systems or workloads in Azure, we recommend that you start with the SAP Quicksizer. It's an online app that guides you through sizing requirements based on your business needs. The SAP Quicksizer is good for capacity and budget planning. The app has a questionnaire where you indicate the number of SAP users, how many transactions you expect, and other relevant details.
The Quicksizer recommends a number of SAPS (SAP Application Performance Standard), a measurement of the processing requirements you need for each component, such as the database server. For example, if the recommended number is 80,000, you choose servers whose SAPS ratings add up to at least 80,000. More information about SAPS for Azure VMs can be found in SAP Note 1928533.
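The arithmetic behind that example is just a ceiling division, sketched below. The 20,000-SAPS-per-VM figure is an assumed rating for illustration; actual per-VM SAPS ratings come from SAP Note 1928533.

```python
import math

def servers_needed(required_saps: int, saps_per_server: int) -> int:
    """How many identical servers cover a Quicksizer SAPS recommendation."""
    return math.ceil(required_saps / saps_per_server)

# e.g. an 80,000 SAPS requirement on VMs rated ~20,000 SAPS each
# (assumed rating; see SAP Note 1928533 for real per-VM figures)
servers_needed(80000, 20000)  # -> 4
```

Rounding up matters: a requirement even one SAPS over a multiple of the per-server rating means provisioning one more server.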
However, there are a few points to keep in mind with the Quicksizer. Customizations and variations of SAP systems, depending on business processes, can change system behavior. Or you may have capabilities enabled for new SAP deployments, or custom code, for which there is no Quicksizer questionnaire.
Also, in the past, hardware vendors guided customers on what servers they needed and how to install them. With Azure, customers make their own decisions: for example, how to increase storage capacity as data volume grows, or how to adjust CPU and compute resources. This brings us to the next point in this discussion.
As previously noted, Azure VM sizes have a hard IOPS limit, which means that no matter how fast your storage is on the back end, the VM dictates your maximum IOPS, so choose wisely. The table on the screen shows IOPS and throughput for an Ultra Disk; those two numbers can be set independently. However, remember that, as previously stated, an Ultra Disk cannot be attached to every VM type.
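The "VM limit wins" rule reduces to a single `min`, shown below. The 25,600 cap and the per-disk figures are illustrative numbers, not real SKU limits.

```python
def effective_iops(vm_iops_cap: int, disk_iops: list[int]) -> int:
    """The VM's own cap wins, no matter how much disk IOPS is provisioned."""
    return min(vm_iops_cap, sum(disk_iops))

# Two disks provisioned at 20,000 IOPS each behind an (illustrative)
# 25,600 IOPS VM cap still deliver only 25,600 IOPS.
effective_iops(25600, [20000, 20000])  # -> 25600
```

This is why paying for faster Ultra Disk settings without re-checking the VM size is wasted money: the smaller of the two limits is what the database actually sees.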
8. Further Networking Considerations
We talked about VM SKUs and sizes in previous slides. However, it bears repeating that in order to derive the required IOPS from storage, you need to select a VM size that supports those IOPS. There is an extra aspect to consider for SAP: the VMs need greater network throughput because the application is very chatty with the DBMS, so make sure you have enough throughput. The way to get the best throughput out of an Azure VM is to select an SAP-certified VM with accelerated networking turned on, since SAP applications are very sensitive to network errors and latency.
So, to bring things together, bear in mind the following when considering your network design: because SAP is a very chatty application, keep everything in a single VNet, segmented appropriately with subnets and NSGs. Having an SAP landscape scattered over multiple VNets that are not peered is not supported.
Do not place an NVA or firewall between the application and DB servers, because you need to keep latency to a minimum. A proximity placement group (PPG) is an important design consideration for an SAP landscape, as it places everything as close together as possible to avoid passing traffic between different buildings in a single region. And finally, remember to use accelerated networking.
9. Scaling and Versions
When it comes to scaling an application, Central Services (SCS), or DB server, there are two options: vertical scaling (also known as scaling up and down) and horizontal scaling (also known as scaling out and in). Vertical scaling is accomplished by increasing or decreasing the VM size as needed, that is, adding more cores and RAM to an existing VM, and is mostly appropriate for SAP databases. This kind of scaling requires a machine reboot and hence affects your service. Vertical scaling is supported on application, SCS, and DB servers. The next type of scaling is horizontal, which is achieved by adding or removing virtual machine instances.
This approach is better suited for SAP application servers. To enable this type of scaling, ensure that any session data is stored outside the VM; otherwise, if you scale in, in the case of an Azure scale set for instance, you would lose any data on that VM. Configuring host auto-failover via an SAP HANA scale-out configuration is one method for achieving HANA high availability. To enable host auto-failover, add one or more virtual machines to the HANA system and set them to standby mode; in this type of configuration, when an active node fails, a standby node automatically takes over. With Azure virtual machines, you achieve automatic failover by using NFS on Azure NetApp Files. A couple of things to remember with an SAP HANA scale-out configuration: the standby node needs access to all database volumes, and the HANA volumes must be mounted as NFSv4.1 volumes.
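The standby-promotion behaviour described above can be pictured with a toy model. This is purely illustrative of the failover logic, not how HANA's nameserver actually implements it; hostnames and roles are invented.

```python
# Toy model of HANA host auto-failover: when an active worker fails,
# a standby node is promoted to take over (illustrative only).
def failover(nodes: dict[str, str], failed: str) -> dict[str, str]:
    """nodes maps hostname -> role ('worker' or 'standby')."""
    new = dict(nodes)
    if new.get(failed) == "worker":
        new[failed] = "failed"
        for host, role in new.items():
            if role == "standby":
                new[host] = "worker"   # standby is promoted
                break
    return new

cluster = {"hana1": "worker", "hana2": "worker", "hana3": "standby"}
failover(cluster, "hana1")
# -> {'hana1': 'failed', 'hana2': 'worker', 'hana3': 'worker'}
```

The model also shows why the standby must see all database volumes: at promotion time it immediately becomes a worker and must be able to serve any partition the failed host owned.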
The improved file-lease-based locking in the NFSv4 protocol is used for I/O fencing. Furthermore, both the data and log volumes must be mounted using the NFSv4.1 protocol, as NFSv3 is not supported in this configuration. In the case of AnyDB, the database can scale up and down but does not scale out.
The SAP database container for AnyDB does not support sharding. Let's look at the SAP-supported platforms and versions of SAP that you can run on these OS types. For SQL Server and Oracle on the Windows OS platform, the same minimum releases apply as for SAP NetWeaver; see SAP Note 1928533 for more details. For SAP HANA on Red Hat and SUSE Linux, SAP HANA-certified VMs are required. As noted earlier, you can also install SAP NetWeaver on Oracle Linux.
10. Storage Considerations (VM)
SAP has specific requirements for its backend disk storage. For an Azure VM, both Premium and Ultra SSD disks are supported, which give greater IOPS and throughput. For NFS storage, there are two ways to achieve such a solution. The first, particularly for a cost-conscious organization, is to use native Azure NFS shares based on Azure Storage.
Make sure that you have configured Azure Write Accelerator for the volumes that contain the DBMS transaction logs or redo logs when you are using M-series or Mv2-series VMs, and be aware of the limitations of Write Accelerator on Azure Premium SSD disks: caching should be set to None or Read-only, and snapshots are not supported on these disks. Azure NetApp Files provides native NFS shares that can be used for the /hana/shared, /hana/data, and /hana/log volumes. Using ANF-based NFS shares for the /hana/data and /hana/log volumes requires the NFSv4.1 protocol.
NFSv3 is not supported for the /hana/data and /hana/log volumes when the shares are based on ANF. In the case of SMB storage, we can use a Scale-Out File Server cluster with Storage Spaces Direct running on top of Windows Server, or ANF using SMB. As for HANA Large Instances (HLI), the storage tier is provisioned as part of the HLI provisioning process, and NetApp is also an option there.
Now let's look at specific Azure VM requirements. Paging or swap files can be placed on the D: drive on Windows or the /mnt resource disk on Linux, respectively. It is recommended to use managed disks for all SAP workloads. Write Accelerator can be switched on for managed disks on M-series Azure VMs. Always go for Premium storage as a minimum. You can also stripe across multiple disks to get the storage size and throughput you need.
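Striping aggregates the capacity, IOPS, and throughput of the member disks, which is why it is a common way to hit a target that no single disk tier offers. The sketch below assumes identical disks and even I/O distribution; the P30-class figures (1 TiB, 5,000 IOPS, 200 MBps) are quoted from memory, so verify them against the current Premium SSD documentation.

```python
def striped_volume(disks: list[dict]) -> dict:
    """Aggregate a striped (RAID-0 style) set of identical disks.
    Assumes even I/O distribution across members (illustrative)."""
    return {
        "size_gib": sum(d["size_gib"] for d in disks),
        "iops": sum(d["iops"] for d in disks),
        "mbps": sum(d["mbps"] for d in disks),
    }

# e.g. four P30-class disks (figures assumed: 1 TiB, 5000 IOPS, 200 MBps each)
p30 = {"size_gib": 1024, "iops": 5000, "mbps": 200}
striped_volume([p30] * 4)
# -> {'size_gib': 4096, 'iops': 20000, 'mbps': 800}
```

Remember from the earlier lecture that the aggregate is still subject to the VM-level IOPS and throughput caps, so size the stripe set and the VM SKU together.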
When it comes to Azure VM disk caching, there are some requirements that must be met:
- HANA data: no caching or read-only caching.
- HANA log: no caching, except on M and Mv2-series VMs, where Azure Write Accelerator should be enabled.
- HANA shared: read caching.
- OS disk: do not change the default caching that Azure sets when the VM is created.
See SAP Note 2015553 for more information on VM storage requirements.
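Since these caching rules come up repeatedly in deployment checklists, they can be captured as a small lookup. The volume keys and returned strings are this sketch's own naming, not Azure API values.

```python
# The per-volume caching rules from this lecture as a lookup.
# Keys and return strings are illustrative labels, not Azure API values.
def recommended_caching(volume: str, m_series: bool = False) -> str:
    if volume == "hana-log":
        # Log volumes: no caching; on M/Mv2-series, enable Write Accelerator.
        return "None + Write Accelerator" if m_series else "None"
    return {
        "hana-data": "None or ReadOnly",
        "hana-shared": "ReadOnly",
        "os-disk": "keep Azure default",
    }[volume]
```

Encoding the M-series nuance as a parameter keeps the log-volume exception explicit instead of buried in prose.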