
AZ-304 Exam - Free Actual Q&As, Page 1 | ExamTopics (4th Feb)

The document provides example questions and answers for the Microsoft Azure Solutions Architect Expert certification (AZ-304) exam. It includes topics related to Azure services, security, networking and identity.

The Azure Activity Log should be included in the recommendation as it provides information about operations taken on resources in a subscription including when the operation occurred and its status.

Query Performance Insight should be included in the recommendation as it helps identify the top resource consuming and long-running queries in a workload to optimize performance.

2/5/2021 AZ-304 Exam – Free Actual Q&As, Page 1 | ExamTopics

- Expert Verified, Online, Free.


Topic 1 - Question Set 1

Question #1 Topic 1

You need to recommend a solution to generate a monthly report of all the new Azure Resource Manager resource deployments in your
subscription.
What should you include in the recommendation?

A. the Change Tracking management solution

B. Application Insights

C. Azure Monitor action groups

D. Azure Activity Log

Correct Answer: D
Activity logs are kept for 90 days. You can query for any range of dates, as long as the starting date isn't more than 90 days in the past.
Through activity logs, you can determine:
✑ what operations were taken on the resources in your subscription
✑ who started the operation
✑ when the operation occurred
✑ the status of the operation
✑ the values of other properties that might help you research the operation
Reference:
https://docs.microsoft.com/en-us/azure/azure-resource-manager/management/view-activity-logs
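For context, the monthly report itself is not built into the Activity Log; one way to produce it on a schedule is the Azure CLI. The sketch below is a minimal illustration: it computes last month's date window and builds an `az monitor activity-log list` call filtered to ARM deployment writes. The JMESPath filter and the `Microsoft.Resources/deployments/write` operation name are believed correct but should be verified against your CLI version.

```python
from datetime import date, timedelta

def last_month_window(today: date) -> tuple[str, str]:
    """Return ISO start/end dates covering the previous calendar month."""
    first_of_this_month = today.replace(day=1)
    last_month_end = first_of_this_month - timedelta(days=1)
    last_month_start = last_month_end.replace(day=1)
    return last_month_start.isoformat(), first_of_this_month.isoformat()

def deployment_report_command(today: date) -> str:
    """Build an 'az monitor activity-log list' call filtered to
    Azure Resource Manager deployment operations."""
    start, end = last_month_window(today)
    return (
        "az monitor activity-log list "
        f"--start-time {start} --end-time {end} "
        "--query \"[?operationName.value=='Microsoft.Resources/deployments/write']\" "
        "--output table"
    )

print(deployment_report_command(date(2021, 2, 5)))
```

Because Activity Log entries are retained for 90 days, a monthly run of a command like this never falls outside the queryable window.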

  airairo 2 months ago


Verified from Whizlabs.
upvoted 6 times

  Grudo 1 month ago


why are mods continually allowing these shill comments? does this site own whizlabs or something?
upvoted 2 times

  sanketshah 1 month ago


given answer is correct.
upvoted 1 times

  DatBroNZ 3 weeks, 1 day ago


Option "D"

A. the Change Tracking management solution - used on Automation


B. Application Insights - monitor live APPs
C. Azure Monitor action groups - notification preferences
D. Azure Activity Log - subscription-level events
upvoted 1 times

  glam 2 weeks ago


D. Azure Activity Log
upvoted 1 times

https://www.examtopics.com/exams/microsoft/az-304/custom-view/ 1/197

Question #2 Topic 1

You have an Azure subscription that contains an Azure SQL database named DB1.
Several queries that query the data in DB1 take a long time to execute.
You need to recommend a solution to identify the queries that take the longest to execute.
What should you include in the recommendation?

A. SQL Database Advisor

B. Azure Monitor

C. Performance Recommendations

D. Query Performance Insight

Correct Answer: D
Query Performance Insight provides intelligent query analysis for single and pooled databases. It helps identify the top resource consuming and
long-running queries in your workload. This helps you find the queries to optimize to improve overall workload performance and efficiently use
the resource that you are paying for.
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/database/query-performance-insight-use
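Query Performance Insight is built on the database's Query Store. As a rough sketch of what it surfaces, the helper below emits T-SQL over the documented `sys.query_store_*` catalog views to list the longest-running queries on average; treat the exact aggregation as illustrative, not QPI's actual implementation.

```python
def top_long_running_queries_sql(top_n: int = 5) -> str:
    """Build T-SQL over the Query Store catalog views that (roughly)
    underlie Query Performance Insight's long-running-queries view."""
    return f"""SELECT TOP {top_n}
    qt.query_sql_text,
    AVG(rs.avg_duration) AS avg_duration_us,   -- microseconds
    SUM(rs.count_executions) AS executions
FROM sys.query_store_query_text AS qt
JOIN sys.query_store_query AS q  ON qt.query_text_id = q.query_text_id
JOIN sys.query_store_plan AS p   ON q.query_id = p.query_id
JOIN sys.query_store_runtime_stats AS rs ON p.plan_id = rs.plan_id
GROUP BY qt.query_sql_text
ORDER BY avg_duration_us DESC;"""

print(top_long_running_queries_sql(10))
```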

  speedminer 5 months ago


Per the provided link:
"Query Performance Insight provides intelligent query analysis for single and pooled databases. It helps identify the top resource consuming and
long-running queries in your workload. This helps you find the queries to optimize to improve overall workload performance and efficiently use the
resource that you are paying for."
upvoted 9 times

  sanketshah 1 month ago


D is correct
upvoted 1 times

  SHABS78 3 months, 3 weeks ago


Correct Answer
upvoted 3 times

  walidibrahim 3 months, 3 weeks ago


D is correct
upvoted 1 times

  glam 3 months, 1 week ago


D. Query Performance Insight
upvoted 1 times

  MRBEAST 3 months ago


D is correct
upvoted 1 times

  David_986969 2 months, 3 weeks ago


In the exam 11/11/2020
upvoted 4 times

  China_Men 1 month, 2 weeks ago


D is Correct
upvoted 1 times

  Farooqz 2 weeks, 6 days ago


Answer seems to be correct
upvoted 1 times

  Farooqz 2 weeks, 6 days ago


D is correct
https://docs.microsoft.com/en-us/azure/azure-sql/database/query-performance-insight-use
upvoted 1 times

  glam 2 weeks ago


D. Query Performance Insight
upvoted 1 times


  suryareddy 1 week, 1 day ago


D is correct
upvoted 1 times


Question #3 Topic 1

HOTSPOT -
You have an Azure App Service Web App that includes Azure Blob storage and an Azure SQL Database instance. The application is instrumented
by using the
Application Insights SDK.
You need to design a monitoring solution for the web app.
Which Azure monitoring services should you use? To answer, select the appropriate Azure monitoring services in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:


Correct Answer:

Note: You can select Logs from either the Azure Monitor menu or the Log Analytics workspaces menu.
Reference:
https://docs.microsoft.com/en-us/azure/azure-monitor/log-query/log-query-overview

  airairo 2 months ago


1- Azure Log Analytics: You can send data about the application and resource usage to azure log analytics. You can then build queries on the
stored data.
https://docs.microsoft.com/en-us/azure/azure-monitor/log-query/log-analytics-tutorial

2- Azure Application Insights:


- Give the ability to visualize the relationships between application components. https://docs.microsoft.com/en-us/azure/azure-monitor/app/app-map?tabs=net

- Give the ability to track requests and exceptions to specific lines of code from within the application. https://docs.microsoft.com/en-us/azure/azure-monitor/app/visual-studio-codelens

- Give the ability to analyse how users return to an application and see how often they only select a particular drop-down value.
https://docs.microsoft.com/en-us/azure/azure-monitor/app/usage-retention
upvoted 6 times

  sanketshah 1 month ago


correct answer.
1) Azure Log Analytics
2) Azure Service Map
3) Azure Application Insights
4) Azure Application Insights
upvoted 8 times

  nexnexnex 3 weeks, 3 days ago


2) Service Map is for VMs, App Insights Application Map is for apps
upvoted 2 times

  AWS56 2 weeks, 4 days ago


"1- Azure Log Analytics:" --> Where is this ? Not there in the list of answers to pick
upvoted 1 times


  bharatns 2 months ago


Last box should be Azure Application Insights, rest looks okay.
upvoted 8 times

  airairo 2 months ago


I searched again:

1) Azure Log Analytics


2) Azure Service Map
3) Azure Application Insights
4) Azure Application Insights
upvoted 20 times

  AWS56 2 weeks, 4 days ago


"1- Azure Log Analytics:" --> Where is this ? Not there in the list of answers to pick
upvoted 1 times

  Azure_Chief 1 month, 2 weeks ago


You had it right the first time, box 2 is App insights the application map. Service Map is for VMs.
upvoted 5 times

  Lexa 1 month, 2 weeks ago


Unfortunately, there is no option "Azure Log Analytics"
upvoted 1 times

  airairo 2 months ago


as its been discussed here:

https://www.examtopics.com/exams/microsoft/az-301/view/2/
upvoted 2 times

  Abbas 2 months ago


Answer for box 4 should be Azure Application Insights. See https://docs.microsoft.com/en-us/azure/azure-monitor/app/app-insights-overview.
upvoted 2 times

  ray1028 2 weeks, 6 days ago


see https://docs.microsoft.com/en-us/azure/architecture/reference-architectures/app-service-web-app/app-monitoring
upvoted 1 times

  Rayrichi 1 month, 2 weeks ago


Answer for box 4 - Azure Application Insights
upvoted 3 times

  jhoomtv 1 month, 1 week ago


Log Analytics, Insight, Insight, Insight
upvoted 4 times

  sejalo 3 weeks, 3 days ago


1) Azure Log Analytics
2) Azure Service Map
3) Azure Application Insights
4) Azure Application Insights
upvoted 1 times

  AWS56 2 weeks, 4 days ago


"1- Azure Log Analytics:" --> Where is this ? Not there in the list of answers to pick
From where is everyone coping this option. This is not part of the answer.
upvoted 1 times

  sejalo 5 days, 11 hours ago


Here it was mentioned So if there is no option for Azure Log Analytics then select Azure Monitor Logs
https://www.examtopics.com/exams/microsoft/az-301/view/2/
upvoted 1 times

  DatBroNZ 3 weeks, 1 day ago


I don't know why people keep saying Log Analytics if it isn't even an option on the question.

For me it is:

1. Azure Monitor Logs


2. Azure Service Map
3. Azure Application Insights
4. Azure Application Insights
upvoted 3 times


  jva 3 weeks, 1 day ago


2. Azure Service Map? I thought this product was only for Linux or Windows VM's and required an agent install. Not sure how it can be used for an
app service plan.
upvoted 1 times

  rizabeer 3 weeks ago


All four App insights as it satisfies all requirements. Azure Service Map is not correct as it is not for Web Apps
upvoted 1 times

  ray1028 2 weeks, 6 days ago


reference
https://docs.microsoft.com/en-us/azure/architecture/reference-architectures/app-service-web-app/app-monitoring
upvoted 1 times

  AWS56 2 weeks, 4 days ago


Guys, the comments above are jumbled.... these comments are for the below question
https://www.examtopics.com/exams/microsoft/az-301/view/2/
upvoted 1 times

  MadEgg 2 weeks, 3 days ago


1) Azure Monitor Logs
https://docs.microsoft.com/de-de/azure/azure-monitor/platform/data-platform-logs
2) Azure Application Insights
3) Azure Application Insights -> Application Map (Application Map is part of Azure Application Insights.)
https://docs.microsoft.com/de-de/azure/azure-monitor/app/app-map?tabs=net
4) Azure Application Insights
https://docs.microsoft.com/de-de/azure/azure-monitor/app/usage-overview

*2) NOT Azure Service Map because its just for VMs
* 4) NOT Azure Activity Log:
The Activity log is a platform log in Azure that provides insight into subscription-level events. This includes such information as when a resource is
modified or when a virtual machine is started.
https://docs.microsoft.com/de-de/azure/azure-monitor/platform/activity-log
upvoted 3 times

  MadEgg 2 weeks ago


Sry, I mixed up 2 and 3
upvoted 2 times

  glam 2 weeks ago


1. Azure log Analytics
2. Azure Application Insights (application map in App insights)
3. Azure Application Insights
4. Azure Application insights
upvoted 5 times

  WuWon 1 week, 3 days ago


Log Analytics, AI, AI, AI
upvoted 2 times

  mactone 1 week ago


1) Azure Log Analytics
2) Azure Service Map
3) Azure Application Insights
4) Azure Application Insights
upvoted 1 times

  JohnWick2020 4 days, 4 hours ago


For this question, my picks are:
Azure Monitor Logs
Azure Application Insights
Azure Application Insights
Azure Application Insights

Note: question is specific on web app so this negates Service Map, and Activity Logs captures user api activities in Azure and can't peer into a
resource, web app in this case.
upvoted 1 times


Question #4 Topic 1

You have an on-premises Hyper-V cluster. The cluster contains Hyper-V hosts that run Windows Server 2016 Datacenter. The hosts are licensed
under a
Microsoft Enterprise Agreement that has Software Assurance.
The Hyper-V cluster contains 30 virtual machines that run Windows Server 2012 R2. Each virtual machine runs a different workload. The
workloads have predictable consumption patterns.
You plan to replace the virtual machines with Azure virtual machines that run Windows Server 2016. The virtual machines will be sized according
to the consumption pattern of each workload.
You need to recommend a solution to minimize the compute costs of the Azure virtual machines.
Which two recommendations should you include in the solution? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Configure a spending limit in the Azure account center.

B. Create a virtual machine scale set that uses autoscaling.

C. Activate Azure Hybrid Benefit for the Azure virtual machines.

D. Purchase Azure Reserved Virtual Machine Instances for the Azure virtual machines.

E. Create a lab in Azure DevTest Labs and place the Azure virtual machines in the lab.

Correct Answer: CD
C: For customers with Software Assurance, Azure Hybrid Benefit for Windows Server allows you to use your on-premises Windows Server licenses and run Windows virtual machines on Azure at a reduced cost. You can use Azure Hybrid Benefit for Windows Server to deploy new virtual machines with Windows OS.
D: With Azure Reserved VM Instances (RIs) you reserve virtual machines in advance and save up to 80 percent.
Reference:
https://azure.microsoft.com/en-us/pricing/reserved-vm-instances/
https://docs.microsoft.com/en-us/azure/virtual-machines/windows/hybrid-use-benefit-licensing
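The two recommendations compound: Azure Hybrid Benefit removes the Windows license component of the VM meter, while a reserved instance discounts the base compute rate. A toy estimate with made-up rates (the numbers below are illustrative only, not Azure pricing):

```python
def monthly_compute_cost(base_rate_per_hour: float,
                         windows_license_per_hour: float,
                         hours: float = 730,
                         reserved_discount: float = 0.0,
                         hybrid_benefit: bool = False) -> float:
    """Estimate one VM's monthly compute cost.

    reserved_discount: fraction off the base compute rate (e.g. 0.6
    for a hypothetical 3-year reservation); hybrid_benefit: True drops
    the Windows license charge. All rates are hypothetical.
    """
    compute = base_rate_per_hour * (1 - reserved_discount) * hours
    license_cost = 0.0 if hybrid_benefit else windows_license_per_hour * hours
    return round(compute + license_cost, 2)

pay_as_you_go = monthly_compute_cost(0.10, 0.05)          # no discounts
optimized = monthly_compute_cost(0.10, 0.05,
                                 reserved_discount=0.6,
                                 hybrid_benefit=True)      # RI + AHB
print(pay_as_you_go, optimized)
```

With these placeholder rates the combined options cut the monthly bill by roughly three quarters, which is why C and D together minimize compute costs for predictable workloads.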

  speedminer 5 months ago


Seems correct, both would reduce cost if used.
upvoted 10 times

  sanketshah 1 month ago


both answers are correct.
upvoted 1 times

  Samu74 4 months ago


Answer seems correct. Question is same as AZ-301 topic 1 question 3. https://www.examtopics.com/exams/microsoft/az-301/view/
upvoted 3 times

  SHABS78 3 months, 3 weeks ago


I believe both answers are correct
upvoted 1 times

  David_986969 2 months, 3 weeks ago


In the exam 11/11/2020
upvoted 7 times

  bharatns 2 months ago


Correct
upvoted 1 times

  Rayrichi 1 month, 2 weeks ago


Seems correct
upvoted 1 times

  SAMBIT 1 month ago


B&D. On premise has 2012...ask is to have 2016 so you can’t reuse the license
upvoted 1 times

  nexnexnex 3 weeks, 3 days ago


the on-premises hosts are licensed at the hypervisor level, and those licenses are 2016


upvoted 1 times

  joshyz73 1 month ago


The question specifically states that you have Software Assurance on the licenses, which gives you upgrade rights. So, C and D are correct and
valid.
upvoted 4 times

  DatBroNZ 3 weeks, 1 day ago


Looks correct to me. Options C and D. Both will reduce costs. We can also answer by elimination:

Option A is something that doesn't exist.


Option B is used for load balancing VMs (auto increase/decrease of instances)
Option E is obviously wrong
upvoted 1 times

  ray1028 2 weeks, 6 days ago


CD is correct
upvoted 1 times

  glam 2 weeks ago


C. Activate Azure Hybrid Benefit for the Azure virtual machines.
D. Purchase Azure Reserved Virtual Machine Instances for the Azure virtual machines.
upvoted 1 times


Question #5 Topic 1

HOTSPOT -
You have an Azure subscription that contains the SQL servers on Azure shown in the following table.

The subscription contains the storage accounts shown in the following table.

You create the Azure SQL databases shown in the following table.

For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:

Box 1: Yes -
Be sure that the destination is in the same region as your database and server.

Box 2: No -

Box 3: No -
Reference:
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-auditing
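The statements in this question turn on two documented auditing constraints: the target storage account must be in the same region as the logical server, and premium storage is not supported as an audit destination. A minimal sketch encoding just those two rules (region names are placeholders):

```python
def can_store_audit_logs(server_region: str,
                         storage_region: str,
                         storage_is_premium: bool) -> bool:
    """Encode two documented Azure SQL auditing constraints:
    the storage account must be in the same region as the server,
    and premium storage is not supported as an audit target."""
    return server_region == storage_region and not storage_is_premium

# same region, standard-performance account: allowed
print(can_store_audit_logs("eastus", "eastus", False))
# cross-region target: not allowed
print(can_store_audit_logs("eastus", "westus", False))
```

Note the distinction the discussion below hinges on: "Premium" in the question's database table is a SQL Database service tier, not the storage account's performance tier, so only the storage account's own tier feeds the second check.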

  idris 2 months ago


Correct answer.
Should be in the same region and Premium storage is not supported
upvoted 9 times

  Blimpy 16 hours, 48 minutes ago


I agree ,,

guys dont confuse yourselves, use the pricing calculator yourself and you will see that only storage is sold in those 3 tiers.

The question refers to the storage that the sql server is using ... and it is not supported for auditing purposes
upvoted 1 times

  Blimpy 16 hours, 36 minutes ago


sorry I meant i disagree .. the storage for sql server may be premium but the storage for auditing is blob storage which is supported and it is
in the same region as SQLsvr2 ... Y N Y should be correct
upvoted 1 times

  arseyam 2 months ago


Premium here refers to the SQL database tier not the storage.
The storage is blob storage which is standard tier so the answer should be Yes
upvoted 11 times

  NickGR 2 months ago


It is clear that Premium refers to SQL pricing tiers and not storage
upvoted 3 times

  neverdrop 1 month, 1 week ago


Wrong.
Quote:
"Premium storage is currently not supported."
From: https://docs.microsoft.com/en-us/azure/azure-sql/database/auditing-overview
upvoted 1 times

  Niro 4 days, 8 hours ago


Read the question - Premium refers to SQL Database, NOT storage.
upvoted 1 times

  arseyam 2 months ago


It should be Yes, No, Yes
upvoted 42 times

  mmmore 2 months ago


Agreed, Yes, No, Yes.
Premium points to the SQL Database, not the storage account. Also, Blobstorage and StorageV2 are valid options for storing auditing logs. I've
tested myself.
upvoted 11 times

  Abbas 2 months ago


Answer should be Yes No Yes. Premium refers to SQL pricing tier and auditing is available for basic, standard and premium tiers.
upvoted 4 times

  airairo 2 months ago


According to Whizlabs, the answers are correct:

Yes, No, No.


upvoted 3 times

  tarq 4 weeks, 1 day ago


Mistake you are doing in that in Whizlabs, storage 2 is in central US . here storage2 is West US. so, Yes, no, Yes is correct answer beacause " You
can store the audit information in Blob storage as long as the storage account is in the same location as the Azure SQL Server"
upvoted 4 times

  Rayrichi 1 month, 3 weeks ago


can you share the link to Whizlabs answer?
upvoted 1 times

  Grudo 1 month ago


this is a whizlabs bot trying to sell you whizlabs. They are literally shilling for whizlabs on every question. Go sign up yourself if you love
throwing money away
upvoted 2 times

  Atilgen 1 month, 4 weeks ago


Auditing limitations
Premium storage is currently not supported.
Hierarchical namespace for Azure Data Lake Storage Gen2 storage account is currently not supported.
Enabling auditing on a paused Azure Synapse is not supported. To enable auditing, resume Azure Synapse.
So the last answer is No
upvoted 1 times

  milind8451 1 month ago


Ques mentioned premium tier of SQL not storage. so ans is "Yes".
upvoted 2 times


  soi 1 month, 4 weeks ago


so what is the final answer ?
upvoted 1 times

  tmurfet 1 month, 4 weeks ago


Azure SQL *servers* only come as Enterprise, Standard, Web, and Developer. There is no standard or premium tier for the server. Only the storage
uses those terms. So last answer is no.
upvoted 2 times

  milind8451 1 month ago


You're wrong, there are 3 tiers- Basic, Standard and Premium.
https://docs.microsoft.com/en-us/azure/azure-sql/database/service-tiers-dtu
upvoted 3 times

  Blimpy 16 hours, 50 minutes ago


nah this link is referring to storage tiers... guys dont confuse yourselves, use the pricing calculator yourself and you will see that only storage
is sold in those 3 tiers. The question refers to the storage that the sql server is using ... and it is not supported
upvoted 1 times

  Blimpy 16 hours, 35 minutes ago


Continued: While the storage for sql server may be premium but the storage for auditing is blob storage which is supported and it is in
the same region as SQLsvr2 ... Y N Y should be correct
upvoted 1 times

  Elecktrus 1 month, 3 weeks ago


The last answer is YES. Because:
- In the question, in the Storage section, are named only 2 kinds of storage (General Purpose v2, y blobstorage, neither are premium)
- Azure SQL have Premium Service Tier (if you use the DTU model): https://azure.microsoft.com/es-es/updates/general-availability-sql-database-
basic-standard-and-premium-service-tiers/
https://docs.microsoft.com/es-es/azure/azure-sql/database/purchasing-models#dtu-based-purchasing-model
So, the question is telling about SQL Premium Tier (more CPU & RAM), not Premium Storage (more disk speed), so in the last question the answer
is YES
upvoted 11 times

  Abhi92 1 month, 2 weeks ago


The tiers mentioned in the question is for the databases.
Some people have assumed that premium storage is required, but nowhere in question mentions premium storage. So the answer would be Yes,
No, Yes
upvoted 8 times

  neverdrop 1 month, 1 week ago


The second anwser is also "yes".
Below quote implies that audit log can go to different regions, with higher cost:
"With database-level auditing, the storage settings for the secondary database will be identical to those of the primary database, causing cross-
regional traffic. We recommend that you enable only server-level auditing, and leave the database-level auditing disabled for all databases."

From: https://docs.microsoft.com/en-us/azure/azure-sql/database/auditing-overview
upvoted 2 times

  jhoomtv 1 month, 1 week ago


Y, N, Y
upvoted 3 times

  kopper2019 1 month, 1 week ago


should be Y, N, Y
upvoted 3 times

  Bhuvy 1 month ago


You need to remember two conditions when it comes to SQL auditing
1.GenV1, GenV2 and blob storages are supported for auditing
2.Storage account location and SQL server location should be in the same region
Hence the correct answer is Yes, No, Yes (close your eyes and select the answer if it comes to exam - Bhuvy)
upvoted 6 times

  Jinder 3 weeks, 5 days ago


Based on the healthy discussion given above, and Microsoft documentation, it's Concluded that the answer is Y-N-Y. Bhuvi, the last one, along with
many others above in discussion, concluded it with an exact logical explanation. Thank you.
upvoted 2 times


  ashoknakulan 3 weeks, 4 days ago


https://docs.microsoft.com/en-us/azure/azure-sql/database/auditing-overview


Auditing limitations
Premium storage is currently not supported.
Hierarchical namespace for Azure Data Lake Storage Gen2 storage account is currently not supported.
Enabling auditing on a paused Azure Synapse is not supported. To enable auditing, resume Azure Synapse.
upvoted 2 times

  malnik 3 weeks, 2 days ago


Y N N, # Premium storage not supported.
upvoted 1 times

  Blaaa 3 weeks, 1 day ago


Yes no yes
upvoted 2 times

  ihustle 3 weeks, 1 day ago


Remarks

Audit logs are written to Append Blobs in an Azure Blob storage on your Azure subscription
Audit logs are in .xel format and can be opened by using SQL Server Management Studio (SSMS).

To configure an immutable log store for the server or database-level audit events, follow the instructions provided by Azure Storage. Make sure
you have selected Allow additional appends when you configure the immutable blob storage.

https://docs.microsoft.com/en-us/azure/azure-sql/database/auditing-overview

This implies that only blob storage is allowed for SQL audit logs.
upvoted 1 times

  jva 3 weeks, 1 day ago


Not sure why people keep suggesting that premium is referring to the storage account tier. Its a different table where the database is reference.
When it says premium in this question it is not talking about the storage account tier.

Yes no yes
upvoted 3 times

  jatza07 3 weeks ago


Why second answer is "no"?
upvoted 1 times

  Chipper 2 weeks, 1 day ago


Because they are not in the same region.
upvoted 2 times

  AWS56 2 weeks, 1 day ago


YES-NO-NO
Question)What does Premium Pricing tier mean for SQL db3 ?
Answer) SQL db3 uses Premium Blob Storage (General Purpose) which is not supported for Auditing currently and so cannot store audit
information to Storage 2. It's a tricky question, so don't confuse yourself.

Correct Answer is: YES-NO-NO


upvoted 1 times

  hydrillo 2 weeks ago


single SQL databases come in 3 pricing tiers: basic, standard and premium
you can change the pricing tier under configure in every SQL DB
Last answer is definitely YES
upvoted 2 times

  Chipper 2 weeks ago


Are you sure about this? I thought Blob Storage does support auditing and has been for awhile.
upvoted 1 times

  hydrillo 1 week, 4 days ago


It does. That's why it is YES on the last answer.
upvoted 2 times

  glam 1 week, 6 days ago


Box 1: Yes
Be sure that the destination is in the same region as your database and server.
Box 2: No
Box 3: Yes
upvoted 3 times

  VictorVE 1 week, 3 days ago


Yes, no, yes.

"Service tiers
Azure SQL Database offers three service tiers that are designed for different types of applications:


General Purpose/Standard service tier designed for common workloads. It offers budget-oriented balanced compute and storage options.
Business Critical/Premium service tier designed for OLTP applications with high transaction rate and lowest-latency I/O. It offers the highest
resilience to failures by using several isolated replicas.
Hyperscale service tier designed for very large OLTP database and the ability to autoscale storage and scale compute fluidly."

https://docs.microsoft.com/en-us/azure/azure-sql/database/sql-database-paas-overview
upvoted 2 times

  WuWon 1 week, 2 days ago


yes, no, yes
upvoted 2 times

  xaccan 1 week, 1 day ago


yes
no
yes
upvoted 2 times

  vladodias 5 days, 22 hours ago


Can someone point some documentation where it is stated that SQL audit MUST be sent to a storage account in the same region? Cross-regional
traffic is not recommended, but the question is if you CAN do it, not if you should do it...
upvoted 1 times

  sejalo 4 days, 10 hours ago


This is exam question, so need to answer according to question.
I did lab and study all discussions / links - Answer is Y-N-Y.
Premium tier refers to SQL DB in SQL server. The question has already mentioned storage2 is in the same location as SQLsvr2 and also
Blobstorage meaning standard. If you create the storage account with premium, there is no option for blobstorage , option is blockblobstorage.
upvoted 1 times


Question #6 Topic 1

A company has a hybrid ASP.NET Web API application that is based on a software as a service (SaaS) offering.
Users report general issues with the data. You advise the company to implement live monitoring and use ad hoc queries on stored JSON data. You
also advise the company to set up smart alerting to detect anomalies in the data.
You need to recommend a solution to set up smart alerting.
What should you recommend?

A. Azure Site Recovery and Azure Monitor Logs

B. Azure Data Lake Analytics and Azure Monitor Logs

C. Azure Application Insights and Azure Monitor Logs

D. Azure Security Center and Azure Data Lake Store

Correct Answer: C
Application Insights, a feature of Azure Monitor, is an extensible Application Performance Management (APM) service for developers and
DevOps professionals.
Use it to monitor your live applications. It will automatically detect performance anomalies, and includes powerful analytics tools to help you
diagnose issues and to understand what users actually do with your app.
Reference:
https://docs.microsoft.com/en-us/azure/azure-monitor/app/app-insights-overview
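Smart detection learns a baseline from your telemetry and alerts on deviations from it. Microsoft does not publish the algorithm, so the following is only a loose analogy, not the actual implementation: a z-score check of an observed failure rate against a learned baseline.

```python
from statistics import mean, stdev

def is_anomaly(baseline: list[float], observed: float,
               threshold: float = 3.0) -> bool:
    """Flag 'observed' if it deviates from the baseline by more than
    'threshold' standard deviations -- a crude stand-in for the kind of
    baseline-vs-current comparison smart detection performs."""
    mu, sigma = mean(baseline), stdev(baseline)
    if sigma == 0:
        return observed != mu
    return abs(observed - mu) / sigma > threshold

# hypothetical hourly request-failure rates
failure_rates = [0.01, 0.012, 0.009, 0.011, 0.010, 0.013]
print(is_anomaly(failure_rates, 0.05))   # large spike
print(is_anomaly(failure_rates, 0.011))  # within normal range
```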

  AnilV 2 months ago


It says "adhoc queries on stored JSON data". So the Azure service is Azure Data lake. Answer is B
https://docs.microsoft.com/en-us/azure/architecture/guide/technology-choices/data-store-overview
upvoted 4 times

  sanketshah 1 month ago


C is correct answer.
upvoted 3 times

  AnilV 2 months ago


Typo it is answer D. Azure Security Center and Azure Data Lake Store.
The reason being analytics over Azure Data Lake and anomaly detection using Azure Security Center
upvoted 2 times

  AnilV 2 months ago


Finally Answer C "Azure Application Insights and Azure Log Analytics". Unable to delete my earlier comments. Sorry for the confusion.
https://docs.microsoft.com/en-us/azure/azure-monitor/app/proactive-diagnostics
upvoted 25 times

  Atilgen 2 months ago


Should be answer C.
Azure Application Insight
upvoted 6 times

  pattasana 2 months ago


it should be C
upvoted 6 times

  NDP 2 months ago


Answer should be
C. Azure Application Insights and Azure Monitor Logs
upvoted 6 times

  arseyam 2 months ago


Answer is C
Collect custom JSON data in Azure Monitor
https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-sources-json

Application Insights smart detection


https://docs.microsoft.com/en-us/azure/azure-monitor/app/proactive-diagnostics
upvoted 14 times

  eloylb 2 months ago


It is C
upvoted 5 times


  airairo 2 months ago


According to Whizlabs the answer is C. ( Azure Application Insights & Azure Monitor Logs.

https://docs.microsoft.com/en-us/azure/azure-monitor/app/proactive-diagnostics
upvoted 4 times

  aarna 1 month, 1 week ago


The answer says B, but in the explanation it mentions Application Insights as the answer. So it must be answer C. Looks like they just need to
update their answer.
upvoted 1 times

  Ziegler 1 month, 1 week ago


C is the correct answer as Application Insights can be used for smart alerting to detect anomalies, whilst Azure Monitor Logs can be used for the
live monitoring and use of adhoc queries on stored JSON data
https://docs.microsoft.com/en-us/azure/azure-monitor/app/app-insights-overview
https://docs.microsoft.com/en-us/azure/azure-monitor/platform/data-sources-json
upvoted 5 times

  KirubakaraSen 1 month ago


Support team, Answer shows App insight and AZ monitor which is option "C". But the option selected in "B". Please correct the answer.
upvoted 3 times

  kktpkktp 1 month ago


c is correct
upvoted 3 times

  Mbr99 4 weeks, 1 day ago


Smart alerts and detect anomalies in the data. Hence, answer should be C
upvoted 3 times

  SyntaxError 4 weeks ago


also discussed here: https://www.examtopics.com/discussions/microsoft/view/27786-exam-az-301-topic-17-question-51-discussion/
upvoted 1 times

  rints 3 weeks, 5 days ago


"Users report general issues with the data" Since Data is specifically mentioned should it be Data Lake Analytics instead of Application Insights ? Is
B correct ?
upvoted 1 times

  Hamzo 3 weeks, 3 days ago


https://docs.microsoft.com/en-us/azure/azure-monitor/app/proactive-arm-config
Ans is "C"
upvoted 1 times

  nexnexnex 3 weeks, 3 days ago


answer is correct
we cannot use C since it's SaaS offering
upvoted 1 times

  Blaaa 3 weeks ago


C is answer
upvoted 1 times

  lord_banana 2 weeks, 5 days ago


"A company has a hybrid ASP.NET Web API application that is based on a software as a service (SaaS) offering." It Doesn't say you have an Azure
Web App! You would use Data Lake to ingest the data - Answer is B
upvoted 1 times

  AWS56 2 weeks, 1 day ago


Application Insights, a feature of Azure Monitor, is an extensible Application Performance Management (APM) service for developers and DevOps
professionals.
Answer: C
upvoted 1 times

  Haritosh 2 weeks, 1 day ago


Correct answer is C. They do mention about SaaS but that is for the application in question. The ASP.Net app is based on SaaS offering. While
designing the solution for logging and monitoring, question doesn't ask you to suggest SaaS offering. All other facts and answer explanation show
correct solution - answer C.
upvoted 1 times

  glam 1 week, 6 days ago


C. Azure Application Insights and Azure Monitor Logs
upvoted 2 times

  WuWon 1 week, 2 days ago


C. Azure Application Insights and Azure Monitor Logs


upvoted 1 times

  suryareddy 1 week, 1 day ago


C is correct. Ref: https://docs.microsoft.com/en-us/azure/azure-monitor/app/proactive-failure-diagnostics
upvoted 1 times

  Hi2ALL 1 week ago


I think C is the correct answer
upvoted 1 times


Question #7 Topic 1

You have an Azure subscription that is linked to an Azure Active Directory (Azure AD) tenant. The subscription contains 10 resource groups, one
for each department at your company.
Each department has a specific spending limit for its Azure resources.
You need to ensure that when a department reaches its spending limit, the compute resources of the department shut down automatically.
Which two features should you include in the solution? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Azure Logic Apps

B. Azure Monitor alerts

C. the spending limit of an Azure account

D. Cost Management budgets

E. Azure Log Analytics alerts

Correct Answer: CD
C: The spending limit in Azure prevents spending over your credit amount. All new customers who sign up for an Azure free account or
subscription types that include credits over multiple months have the spending limit turned on by default. The spending limit is equal to the
amount of credit and it can't be changed.
D: Turn on the spending limit after removing
This feature is available only when the spending limit has been removed indefinitely for subscription types that include credits over multiple
months. You can use this feature to turn on your spending limit automatically at the start of the next billing period.
1. Sign in to the Azure portal as the Account Administrator.
2. Search for Cost Management + Billing.
3. Etc.
Reference:
https://docs.microsoft.com/en-us/azure/cost-management-billing/manage/spending-limit

  19042020 2 months ago


Correct Answers
upvoted 2 times

  MaxBlanche 2 months ago


The answer is wrong. Limits works at the sub level, actually shutting down everything or nothing. I believe it's A+D, where you use budget alerts to
trigger the VMs deallocation with a logic app.
upvoted 9 times

  arseyam 2 months ago


Answer should be B and D
Spending limit on Azure account affects all the subscription not just a specific resource group and it is a feature provided for specific subscription
types like the free one.

The way to achieve the requirements is by doing the following:


1. Azure Monitor > Alerts > Manage Actions > create action groups with action type "Automation Runbook" > stop VM
2. Create budgets scoped by the resource groups > in the alert section choose the previously created action groups.
upvoted 26 times
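The two pieces described above (a Cost Management budget scoped to a resource group, plus an action group that runs the shutdown automation) meet in the budget's notification block. The following is a minimal sketch of such a budget body; the amount, names, and resource IDs are hypothetical placeholders, not values from the question:

```python
# Sketch of a Cost Management budget body scoped to a department's
# resource group. When actual spend crosses the threshold, the budget
# notifies the listed action group, which runs the shutdown automation.
# All names, amounts, and IDs below are hypothetical placeholders.
budget = {
    "properties": {
        "category": "Cost",
        "amount": 5000,            # the department's monthly spending limit
        "timeGrain": "Monthly",
        "notifications": {
            "Actual_GreaterThan_100_Percent": {
                "enabled": True,
                "operator": "GreaterThan",
                "threshold": 100,  # percent of the budgeted amount
                "contactGroups": [
                    # action group wired to a runbook/logic app that
                    # deallocates the department's VMs
                    "/subscriptions/<sub-id>/resourceGroups/<rg>"
                    "/providers/microsoft.insights/actionGroups/shutdown-vms"
                ],
            }
        },
    }
}
```

A budget by itself only notifies; the action group (or the logic app / runbook it triggers) is what actually deallocates the compute, which is why the discussion splits between A+D and B+D.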

  sanketshah 1 month ago


B and D are correct answer.
upvoted 1 times

  NickGR 2 months ago


Thats correct arseyam. You need to create a budget for the resource group scope and set an alert with an action group to shutdown the VM.
upvoted 3 times

  Abbas 2 months ago


Answer is A and D. D is already inclusive of C. And question is about "shut the resource down automatically" for which you can design a workflow
in Logic Apps. Other 2 options are for alerts.
upvoted 8 times

  arseyam 2 months ago


Still you need to create an alert with an action group first from Azure Monitor, so the correct answer is B & D.
upvoted 3 times


  uzairahm007 2 months ago


Logic Apps are trigger-based, so you would not need to create an alert. But how would a logic app check budgets and shut down VMs? If anyone knows
how, please share.
upvoted 1 times

  mmmore 2 months ago


Answers are A + D
An article that describes an example of this setup:
https://www.codit.eu/blog/control-your-azure-costs-through-budget-alerts/
upvoted 5 times

  arseyam 2 months ago


You didn't read "Configure Action Groups" section?!! You can achieve stopping the compute using any service, logic apps, functions or runbooks
but the features you need first to use are, budgets and alerts.
upvoted 3 times

  bjoernhoefer 2 months ago


The correct answer is A + D
(Budget is triggering an Logic App to shut down the machines)
According to the MS-Article: https://docs.microsoft.com/en-us/azure/cost-management-billing/manage/cost-management-budget-scenario
upvoted 16 times

  Blimpy 14 hours, 32 minutes ago


yeah confusing reading all angles of this discussion ... but my final thoughts are with you. A & D (simply because we need to present part of the
SOLUTION and if you have not used a feature that can SHUT DOWN a VM, then the solution is not complete) ... so A: Logic Apps has to be part
of the answer .. it makes 'logical' sense
upvoted 1 times

  arseyam 1 month, 3 weeks ago


You didn't see the part of "Create an Azure Monitor Action Group"? you can shutdown the resources using Logic Apps or Function or
Automation Runbook but YOU must create an action group from Azure Monitor
upvoted 1 times

  Elecktrus 2 months ago


Answer is B & D
According to the MS-Article: https://docs.microsoft.com/en-us/azure/cost-management-billing/manage/cost-management-budget-scenario
You need to have:
1) Azure Budget
2) Azure Monitor to create an alert and actions for this alert (Action Groups)
3) the action group run the App Logic Workflow
The answer can include only 2 actions, so step 1) -> D and step 2) -> B
upvoted 17 times

  Lexa 4 weeks ago


Totally agree
upvoted 1 times

  soi 1 month, 4 weeks ago


true & final answer
upvoted 3 times

  shapgi 1 month, 3 weeks ago


It's A+D only
upvoted 3 times

  gcpjay 1 month, 3 weeks ago


A and D are correct answers.
"you'll set up a scenario where you get a couple of notifications, the first notification is for when 80% of the budget has been reached and the
second notification is when 100% of the budget has been reached. The logic app will be used to shut down all VMs in the resource group. "
https://docs.microsoft.com/en-us/azure/cost-management-billing/manage/cost-management-budget-scenario
upvoted 2 times

  cbj 1 month, 3 weeks ago


Shouldn't it be A + B ? The problem says the departments already have the limit. So we don't need to create budget.
upvoted 1 times

  ArifUK 1 month, 1 week ago


the mention of 'spending limit' is for admin (not-tech) purposes only. the Azure implementation of this is via 'Budget'. So, no.
upvoted 1 times

  shashu07 1 month, 2 weeks ago


C&D
No separate action is required as azure itself disable the services, when spending limit is reached. Provided you should not disabled the spending
limit.

Reaching a spending limit


When your usage results in charges that exhaust your spending limit, the services that you deployed are disabled for the rest of that billing period.
https://docs.microsoft.com/en-us/azure/cost-management-billing/manage/spending-limit
upvoted 5 times

  Mail2KevinD 1 month, 2 weeks ago


What is the final answer?
upvoted 1 times

  Abhi92 1 month, 2 weeks ago


Final Answer is A & D .
Read the official docs they have used Azure Logic App and Cost Management Budget.
https://docs.microsoft.com/en-us/azure/cost-management-billing/manage/cost-management-budget-scenario
upvoted 3 times

  bala321987 1 month, 2 weeks ago


Its correct ! Answer is A & D
upvoted 1 times

  BrokenEdge 1 month, 2 weeks ago


"When you create or edit a budget for a subscription or resource group scope, you can configure it to call an action group. The action group can
perform various actions when your budget threshold is met."
=> Should be B & D, because that is the straight forward approach.

Azure Logic Apps could be used, but any other automation is also fine, therefore I wouldn't choose it as an answer.
upvoted 2 times

  Ziegler 1 month, 1 week ago


Correct answer is A+D based on the following
https://docs.microsoft.com/en-us/azure/cost-management-billing/manage/cost-management-budget-scenario

These actions included in this tutorial allow you to:


1 - Create an Azure Automation Runbook to stop VMs by using webhooks.
2 - Create an Azure Logic App to be triggered based on the budget threshold value and call the runbook with the right parameters.
3 - Create an Azure Monitor Action Group that will be configured to trigger the Azure Logic App when the budget threshold is met.
4 - Create the Azure budget with the wanted thresholds and wire it to the action group.

Items 2 and 4 clearly relates to A and D in the question. Hence my own choice of answer.
upvoted 4 times

  Ziegler 3 weeks, 1 day ago


I am sorry guys. Having gone through the question again, Please ignore my previous answer. It appears that budget has already been
mentioned in the question but we need to augment that with the other two missing steps, which are Alerts and logic Apps. Alert in this case will
serve as an orchestrator. So in view of that I agree with A&B as the correct answer.
upvoted 1 times

  ArifUK 1 month, 1 week ago


I agree with this. Only Action groups can't shut down 'all' compute resources in an RG. the key here is we don't know which and how many
compute are in each RG. So, a Logic app with budget and action group seems to be the way to go.
upvoted 1 times

  itbrpl 1 month, 1 week ago


I would say you need Budget (D) when you create it you also can create the action group.. and this action group can trigger a Logic App (A) that
you may have created before...

Action group is not tied to Monitor.. you can create it from other screens. Then, essentially you need a budget and a logic app to perform the
actions.
upvoted 1 times

  vjsaini 1 month, 1 week ago


C&D options are correct in my opinion, Logic app comes under cost management budget scenarios.

https://docs.microsoft.com/en-us/azure/cost-management-billing/manage/cost-management-budget-scenario
Create an Azure Automation Runbook to stop VMs by using webhooks.
Create an Azure Logic App to be triggered based on the budget threshold value and call the runbook with the right parameters.
Create an Azure Monitor Action Group that will be configured to trigger the Azure Logic App when the budget threshold is met.
Create the Azure budget with the wanted thresholds and wire it to the action group.
upvoted 2 times

  M88778 4 weeks ago


When your usage results in charges that exhaust your spending limit, the services that you deployed are disabled for the rest of that billing period.
For example, when you spend all the credit included with your Azure free account, Azure resources that you deployed are removed from
production and your Azure virtual machines are stopped and de-allocated. The data in your storage accounts are available as read-only.
upvoted 1 times

  M88778 4 weeks ago


Spending limit feature shuts down the resources, no mention of needing logic apps. The cost management will allow you to add in the limits again
for the next billing month.


I think people have over complicated the request


upvoted 2 times

  Jinder 3 weeks, 5 days ago


The given answer is correct C and D. Definitely options confuse us because all are interconnected to achieve it. I agree with vjsaini and M88778,
their explanation is best to justify the answers.
upvoted 1 times

  nexnexnex 3 weeks, 3 days ago


A. Azure Logic Apps - to consume budget Alerts
B. Azure Monitor alerts - to generate budget Alerts
C. the spending limit of an Azure account - works for the whole account, but we need per-resource group
D. Cost Management budgets - good option, but description states that every department has spending limit, so we assume it's already configured
E. Azure Log Analytics alerts - analytics is not for alerts

A&B then
upvoted 2 times

  leeuw86 3 weeks, 1 day ago


spending limits only work on subscription level but not for resource groups. So i go for B + D
upvoted 1 times

  rizabeer 3 weeks ago


A and D are correct answers, Create a budget (D) which will trigger a script (functions, logic apps etc.)via action groups which will in turn shutdown
VMs. https://docs.microsoft.com/en-us/azure/cost-management-billing/costs/tutorial-acm-create-budgets#trigger-an-action-group
upvoted 1 times

  bbartek 3 weeks ago


For me the answer is A&D, since in Azure Monitor we only need to create an action group, rather than an alert. We then specify this action group
as notification target, when creating a budget in Cost Management. See link that was already provided in the comments:
https://docs.microsoft.com/en-us/azure/cost-management-billing/manage/cost-management-budget-scenario
upvoted 1 times

  HiTTi 2 weeks, 4 days ago


What is the correct answer? A and D
upvoted 1 times

  milind8451 2 weeks, 2 days ago


Ans B and D
https://docs.microsoft.com/en-us/azure/cost-management-billing/manage/cost-management-budget-scenario
upvoted 1 times

  gp777 2 weeks, 1 day ago


A and D:
https://docs.microsoft.com/en-us/azure/cost-management-billing/manage/cost-management-budget-scenario
upvoted 2 times

  AWS56 2 weeks, 1 day ago


Budgets can be set up to trigger a notification when a specified threshold is met. You can provide multiple thresholds to be notified at and the
Logic App will demonstrate the ability for you to perform different actions based on the threshold met. In this example, you'll set up a scenario
where you get a couple of notifications, the first notification is for when 80% of the budget has been reached and the second notification is when
100% of the budget has been reached. The logic app will be used to shut down all VMs in the resource group. First, the Optional threshold will be
reached at 80%, then the second threshold will be reached where all VMs in the subscription will be shut down.

What is the Azure spending limit? This is the limit Microsoft provides you; for example, the spending limit on your free account is $100 or $200. You
cannot create resources when that limit is reached.

Read the question carefully;


Incorrect Options: BCE
Correct Answer: AD
upvoted 3 times

  AWS56 2 weeks, 1 day ago


Refer to this doc: https://docs.microsoft.com/en-us/azure/cost-management-billing/manage/cost-management-budget-scenario
upvoted 2 times

  glam 1 week, 6 days ago


A. Azure Logic Apps
B. Azure Monitor alerts
upvoted 1 times

  AWS56 1 week, 3 days ago


This is incorrect... 100% doesn't make sense
upvoted 1 times

  glam 1 week, 5 days ago


A. Azure Logic Apps


B. Azure Monitor alerts
upvoted 1 times

  xaccan 1 week, 4 days ago


nonsense
upvoted 1 times

  glam 1 week, 1 day ago


please make sense by explaining... @xaccan
upvoted 2 times

  Granwizzard 1 week, 3 days ago


I believe the provided answers are the correct ones.
- Cost management budgets -> You can create thresholds that will send a notification to an action group plus other emails, this can notify the
departments of the costs they are incurring, here you can use tags like department.
- For the spending limit, the services that you deployed are disabled for the rest of that billing period.
upvoted 1 times

  WuWon 1 week, 2 days ago


B and D
upvoted 1 times

  kamps 1 week, 1 day ago


yes - [A. Azure Logic Apps] - you need to power down machines with code from app
yes - [B. Azure Monitor alerts] - with budget set that will create trigger for runbook to execute logic app for powering down machines
not - [C. the spending limit of an Azure account] subscription limit not resource group limit
not sure - [D. Cost Management budgets] budget can be filtered to resource group level but they said budgets are set already per department so
are they agreed or set in azure
not - [E. Azure Log Analytics alerts] - not for alerts - cant trigger anything
upvoted 1 times

  aj 3 days, 12 hours ago


Its A & D, Cost Management can be scoped to resource group and trigger a Logic App to shutdown the resources.
upvoted 2 times


Question #8 Topic 1

HOTSPOT -
You have an Azure subscription that contains the resources shown in the following table.

You create an Azure SQL database named DB1 that is hosted in the East US region.
To DB1, you add a diagnostic setting named Settings1. Settings1 archives SQLInsights to storage1 and sends SQLInsights to Workspace1.
For each of the following statements, select Yes if the statement is true, Otherwise, select No.
Hot Area:

Correct Answer:

Box 1: No -
You archive logs only to Azure Storage accounts.

Box 2: Yes -

Box 3: Yes -
Sending logs to Event Hubs allows you to stream data to external systems such as third-party SIEMs and other log analytics solutions.
Note: A single diagnostic setting can define no more than one of each of the destinations. If you want to send data to more than one of a
particular destination type
(for example, two different Log Analytics workspaces), then create multiple settings. Each resource can have up to 5 diagnostic settings.
Reference:
https://docs.microsoft.com/en-us/azure/azure-monitor/platform/diagnostic-settings
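As an illustration of the note above, one diagnostic-setting body names at most one destination of each type; archiving SQLInsights to a second storage account would require a second setting. A rough sketch follows, with hypothetical placeholder resource IDs rather than values from the question:

```python
# Sketch of one diagnostic setting for DB1 that targets one destination
# of each supported type at once. A second storage account would need a
# second setting (each resource allows up to 5). IDs are placeholders.
settings1 = {
    "properties": {
        "storageAccountId": "/subscriptions/<sub-id>/.../storageAccounts/storage1",
        "workspaceId": "/subscriptions/<sub-id>/.../workspaces/Workspace1",
        "eventHubAuthorizationRuleId": "/subscriptions/<sub-id>/.../authorizationRules/send",
        "logs": [
            {
                "category": "SQLInsights",
                "enabled": True,
                # retention applies to the storage-account copy of the logs
                "retentionPolicy": {"enabled": True, "days": 90},
            }
        ],
    }
}

# One destination of each type per setting:
destinations = [k for k in settings1["properties"] if k != "logs"]
```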

  19042020 2 months ago


Even Storage2 is also a Storage account, so the answers should be Yes, Yes, Yes
upvoted 22 times


  sanketshah 1 month ago


Answer should be
Yes
Yes
Yes
upvoted 2 times

  AnilV 2 months ago


You can add a new diagnostic setting that archives SQLInsights logs to storage2, so the answer should be "YES"
upvoted 5 times

  airairo 2 months ago


the answers are: Yes, Yes, Yes.
you can have multiple settings for diagnostics and stream them onto different locations.

https://docs.microsoft.com/en-us/azure/azure-sql/database/metrics-diagnostic-telemetry-logging-streaming-export-configure?tabs=azure-portal
upvoted 1 times

  cpalacios 1 month, 3 weeks ago


Would you answer Yes, Yes, Yes on the exam if you had this question?
upvoted 5 times

  rsaz300 1 month, 3 weeks ago


Storage Analytics logging is currently available only for the Blob, Queue, and Table services. Storage Analytics logging is also available for
premium-performance BlockBlobStorage accounts. However, it isn't available for general-purpose v2 accounts with premium performance.
upvoted 1 times

  rsaz300 4 weeks ago


Azure Data Lake Storage Gen2 accounts are not currently supported as a destination for diagnostic settings even though they may be listed as a
valid option in the Azure portal.
upvoted 1 times

  LeonLeon 1 month, 3 weeks ago


Can we Archive to StorageV2 or is it only for HOT and COOL?
upvoted 1 times

  Rayrichi 1 month, 2 weeks ago


Yes, Yes, Yes
upvoted 2 times

  shashu07 1 month, 2 weeks ago


ALL YES

Platform logs and metrics can be sent to the destinations Log Analytics workspace, Event hubs, Azure storage account
Only difference Azure storage account is referred for Archiving logs and metrics.

https://docs.microsoft.com/en-us/azure/azure-monitor/platform/diagnostic-settings
upvoted 5 times

  kopper2019 1 month, 1 week ago


all YES, storage2 is v2 but is not premium, so logs can be stored
upvoted 1 times

  rints 1 month ago


Yes, I agree. Should be all YES
upvoted 1 times

  KirubakaraSen 1 month ago


First questions should be : NO.
Read the note : Azure Data Lake Storage Gen2 accounts are not currently supported as a destination for diagnostic settings even though they may
be listed as a valid option in the Azure portal.

Link : https://docs.microsoft.com/en-us/azure/azure-monitor/platform/diagnostic-settings
upvoted 3 times

  GabrieleSolieri 1 month ago


Storage v2 can also be with Azure Data Lake flag turned off, I think that the answers are Y-Y-Y
upvoted 1 times

  JohnWick2020 4 days, 3 hours ago


Agreed, all YES
upvoted 1 times

  abhijitxp 4 weeks, 1 day ago


Answers need to be corrected to Yes, Yes, Yes.
upvoted 1 times


  Blaaa 3 weeks ago


All yes
upvoted 1 times

  milind8451 2 weeks, 2 days ago


All Yes
upvoted 1 times

  glam 1 week, 6 days ago


Yes, Yes, Yes
upvoted 1 times

  FK2974 4 days, 6 hours ago


I've put this question in during my recent training and the MS trainer said it should be all Yes/Yes/Yes
upvoted 1 times


Question #9 Topic 1

HOTSPOT -
You deploy several Azure SQL Database instances.
You plan to configure the Diagnostics settings on the databases as shown in the following exhibit.

Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic.
NOTE: Each correct selection is worth one point.
Hot Area:


Correct Answer:

In the exhibit, the SQLInsights data is configured to be stored in Azure Log Analytics for 90 days. However, the question asks for the
"maximum" amount of time that the data can be stored, which is 730 days.
Design Identity and Security

  senthils 2 months ago


Correct answer is 90 and 730 days
upvoted 19 times

  airairo 2 months ago


where did you get that 90 from?
upvoted 3 times

  DaemonMahara 1 month, 2 weeks ago


Retention in the drop-down is for the storage account, so 90 days (not for Log Analytics, which is what confuses most of us).
Log Analytics workspace retention is defined separately: default 31 days and max 730 days (hence 730).
upvoted 1 times

  arseyam 2 months ago


I think he got it from here
https://www.examtopics.com/discussions/microsoft/view/3033-exam-az-301-topic-17-question-6-discussion/
upvoted 1 times

  airairo 2 months ago


it is written 90 sql insight.
upvoted 1 times

  Abbas 2 months ago


Given answers are right. The question is asking about SQLInsight data to be archived in a storage account, which is enabled and no retention
period applies. So indefinite.
upvoted 9 times

  Blimpy 9 hours, 32 minutes ago


No - SQLInsights is a log category which is not shown in the left option of the exhibit; it is only shown in the pop-up. All the unchecked items
in the exhibit are not applicable. The answer is 90 days and 730 days.
upvoted 1 times

  AJ_Galosmo 1 week, 3 days ago


Please read the pop up, it is stated that SQLInsight Data will be stored in storage account for 90 days.
upvoted 2 times

  arseyam 2 months ago


The correct answer is 90 days and indefinite in case you set the retention to 0.
upvoted 1 times

  arseyam 2 months ago


Based on the provided details here
https://www.examtopics.com/discussions/microsoft/view/3033-exam-az-301-topic-17-question-6-discussion/
upvoted 1 times

  airairo 2 months ago


According to Whizlabs, the answers are 90 & indefinite.


https://docs.microsoft.com/en-us/azure/azure-sql/database/metrics-diagnostic-telemetry-logging-streaming-export-configure?tabs=azure-portal
upvoted 1 times

  nesith 1 month, 2 weeks ago


Whizlab answers aren't always correct
upvoted 2 times

  Grudo 1 month ago


is this your first encounter with a whizlabs bot?
upvoted 4 times

  xaccan 1 week, 1 day ago


85% of answers in whizlabs are wrong (Just for your info)
upvoted 1 times

  NickGR 2 months ago


The retention period for SQL Insight is set to 0. The diagnostic retention setting over here is not applicable to the log analytics workspace, but on
the storage account. Thus the amount of time that SQL insights is stored in the Blob is indefinite. The retention for a log analytics workspace is
different. For the log and workspace you have to change the data retention within the workspace setting itself. Thus the MAX is 730. Answers are
correct.
upvoted 6 times

  GenXCoder 1 month, 4 weeks ago


Indefinite and 730 days.
For Retention (days), choose the number of retention days. A retention of zero days stores the logs indefinitely.
https://docs.microsoft.com/en-us/azure/cdn/cdn-azure-diagnostic-logs
upvoted 6 times

  Elecktrus 1 month, 3 weeks ago


GenXCoder, your link is for logs of CDN (Content Delivery), not SQL. Here, you have not SQL Insight
upvoted 2 times

  Elecktrus 1 month, 3 weeks ago


Correct answer is 90 and 730 days. Because:
- SqlInsights is only a kind of SQL data, like Error, Timeouts, blocks, Deadlocks, etc.
They are not shown in the image, but in the upper right side there is a little yellow notification, and this notification says: archive to storage
account is enabled, and the SqlInsights log is enabled and has a retention of 90 days. So, the first answer is 90 days.
(By the way, if the retention for SqlInsights were 0 days, then it would be indefinite. Retention only applies to the storage account.)

In the image, we see that the box to send to Log Analytics is checked. Log Analytics is a different product, and it is not affected by the retention policy
set on this screen.
Log Analytics has its own retention time. The maximum retention in Log Analytics is 730 days (2 years) and the minimum is 30 days
https://docs.microsoft.com/en-us/azure/azure-monitor/platform/manage-cost-storage#change-the-data-retention-period

So, answers are:


- 90 Days (it's said in the little notification in the upper right side)
-730 Days in Log Analytics, that is the maximum supported for Log Analytics, with a cost of $0.12 per GB / per month
upvoted 8 times
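The two retention rules this thread keeps circling back to can be stated in a tiny sketch (the helper below is illustrative, not an Azure API): a diagnostic-setting retention of 0 days keeps the storage-account copy forever, while a Log Analytics workspace caps retention at 730 days.

```python
# Illustrative helper, not an Azure API: summarizes the two retention
# rules discussed above.
LOG_ANALYTICS_MAX_DAYS = 730  # workspace retention is capped at 2 years


def storage_retention(days: int) -> str:
    """Diagnostic-setting retention applies only to the storage-account
    copy of the logs; 0 means the logs are kept indefinitely."""
    return "indefinite" if days == 0 else f"{days} days"
```

Whether the first answer is 90 days or indefinite then turns only on whether the exhibit's 90-day retention applies to the archive the question asks about.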

  bingomutant 1 month, 2 weeks ago


the notification refers to LOGS and not DATA
upvoted 1 times

  17vinay 1 month, 3 weeks ago


Indefinite and 730.
upvoted 3 times

  Rayrichi 1 month, 2 weeks ago


Indefinite and 730
Reference:
https://docs.microsoft.com/en-us/azure/azure-monitor/platform/manage-cost-
storage#:~:text=Data%20retention%20can%20be%20configured,pricing%20for%20longer%20data%20retention.
upvoted 2 times

  anandpsg101 1 month, 2 weeks ago


Please refer the top right box which has the details
answer 90 , 730
upvoted 4 times

  nesith 1 month, 2 weeks ago


Answer 90, 730. What you don't see in this screen capture is that SQL Insights has been turned on and what retention it has been set to, which is
why the top right-hand note actually gives you the answer. If you are not familiar with the diagnostic settings screen you would assume this is a full
capture. By the way, none of the other settings are ticked in this screen capture, so they are not turned on; hence their retentions don't matter
at all
upvoted 2 times

  bingomutant 1 month, 1 week ago



Raw data points (that is, items that you can query in Analytics and inspect in Search) are kept for up to 730 days. You can select a retention
duration of 30, 60, 90, 120, 180, 270, 365, 550 or 730 days. If you need to keep data longer than 730 days, you can use Continuous Export to copy it
to a storage account during data ingestion - MD Docs.
upvoted 1 times

  milind8451 1 month ago


Indefinite and 730 because logs will be stored in blob storage which has indefinite retention and max retention for Log Analytics is 730.
upvoted 1 times

  sammas 3 weeks, 3 days ago


is it 30 and 30 ?.. since subscription is Azure Pass, which has validity of 30 days.
upvoted 1 times

  DatBroNZ 3 weeks, 1 day ago


I just created a SQL Database on Azure and added a Diagnostic Setting for testing.

The 'Retention (days)' configuration only appears if I select the 'Archive to a storage account' option. Then, the following information appears on
the bottom:

'Retention only applies to storage account. Retention policy ranges from 1 to 365 days. If you do not want to apply any retention policy and retain
data forever, set retention (days) to 0.'

The first answer is clearly 'indefinite' then.

For the Log Analytics Workspace, it doesn't give any option for setting the store period. I'd answer 90 days since they added that box on the exhibit
saying "log is enabled and has a retention of 90 days'.
upvoted 2 times

  DatBroNZ 3 weeks, 1 day ago


Answer two could be 730 days:

"Data retention at the workspace level can be configured from 30 to 730 days"

https://docs.microsoft.com/en-us/azure/azure-monitor/platform/manage-cost-storage
upvoted 1 times

  Hamzo 3 weeks, 1 day ago


SQLinsight data and logs are same thing - Value is given for blob that is 90 and LogAnalytics value could be 730 max - So in my opinion answer is
90,730 (however blob could be indefinite but the value is already given in the screenshot). If there is no screenshot given then the answer will be indefinite
and 730. my two cents.
upvoted 1 times

  Blaaa 3 weeks ago


Indefinite and 730
upvoted 2 times

  milind8451 2 weeks, 2 days ago


Right answers.
upvoted 2 times

  glam 1 week, 5 days ago


- 90 Days (it's said in the little notification in the upper right side)
- 730 Days in Log Analytics, that is the maximum supported for Log Analytics
upvoted 1 times

  VictorVE 1 week, 3 days ago


The Log Analytics retention settings allow you to configure a minimum of 31 days (if not using a free tier) up to 730 days.

https://docs.microsoft.com/en-us/archive/blogs/canberrapfe/change-oms-log-analytics-retention-period-in-the-azure-portal
upvoted 1 times

  xaccan 1 week, 1 day ago


The final answer is 90 and 730 without doubt
upvoted 1 times

  kaikailiang 5 days, 15 hours ago


90 and 730 days
upvoted 1 times

Topic 2 - Question Set 2

Question #1 Topic 2

HOTSPOT -
Your company has the divisions shown in the following table.

You plan to deploy a custom application to each subscription. The application will contain the following:
✑ A resource group
✑ An Azure web app
✑ Custom role assignments
✑ An Azure Cosmos DB account
You need to use Azure Blueprints to deploy the application to each subscription.
What is the minimum number of objects required to deploy the application? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:

Box 1: 2 -
One management group for East, and one for West.


When creating a blueprint definition, you'll define where the blueprint is saved. Blueprints can be saved to a management group or subscription that you have
Contributor access to. If the location is a management group, the blueprint is available to assign to any child subscription of that management group.

Box 2: 1 -
One definition, as you plan to deploy a custom application to each subscription.
With Azure Blueprints, the relationship between the blueprint definition (what should be deployed) and the blueprint assignment (what was deployed) is preserved.

Box 3: 4 -
One assignment for each subscription.
Reference:
https://docs.microsoft.com/en-us/azure/governance/blueprints/overview
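
The definition-at-management-group / assignment-per-subscription split described above can be sketched with the Azure CLI's blueprint extension. This is an illustrative outline only: the management group name, blueprint name, and subscription ID are placeholders, and a real definition would also need its artifacts added before publishing.

```shell
# Sketch: requires the 'blueprint' CLI extension and real IDs.
# 1. Create the definition once, stored at the management group.
az blueprint create \
  --management-group EastMG \
  --name CustomApp \
  --description "RG + web app + role assignments + Cosmos DB"

# ... add artifacts, then publish a version ...
az blueprint publish \
  --management-group EastMG \
  --blueprint-name CustomApp \
  --version 1.0

# 2. Assign the published version to each child subscription.
az blueprint assignment create \
  --subscription 00000000-0000-0000-0000-000000000000 \
  --name CustomApp-East1 \
  --location eastus \
  --blueprint-version "/providers/Microsoft.Management/managementGroups/EastMG/providers/Microsoft.Blueprint/blueprints/CustomApp/versions/1.0"
```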

  Atilgen 2 months ago


Why not assign the Bleuprint to the management groups?
upvoted 2 times

  mmmore 2 months ago


Unlike Azure Policy, you cannot assign a Blueprint to a management group. You can only assign it to a subscription, as it's purpose is to
configure a subscription. The Blueprint definition, however, can be stored on the management group level.
upvoted 2 times

  MaxBlanche 2 months ago


It should be:
2 Management group (1 root for each tenant)
2 Definitions (1 for each Management Group)
4 Assignments (1 for each sub with the application)
upvoted 44 times

  sanketshah 1 month ago


2,2, 4 are correct answer.
upvoted 2 times

  Abbas 2 months ago


When creating a blueprint definition, you'll define where the blueprint is saved. Blueprints can be saved to a management group or subscription
that you have Contributor access to. If the location is a management group, the blueprint is available to assign to any child subscription of that
management group. Hence, blueprint should be 2.
upvoted 2 times

  salehz 1 month, 4 weeks ago


A management group can't contain more than one tenant, and the Azure blueprint is located in the management group but associated with its child subscriptions.
upvoted 2 times

  tmurfet 1 month, 3 weeks ago


Technically you are only using one blueprint definition, but in practical terms you are saving a copy to the other tenant, so the answer to the
blueprint definition box = 2.
upvoted 2 times

  she84516 1 month, 3 weeks ago


2 - management groups (single root management group per tenant)
2 - blueprints (different root management groups dicate that blueprints can't be anyhow shared)
2 - assignments (blueprint can be assigned to management group or subscription, we can apply directly to root management group, and it would
automatically apply to related subscriptions, https://docs.microsoft.com/en-us/azure/governance/blueprints/overview#blueprint-assignment)
upvoted 5 times

  olivier_s 1 month, 2 weeks ago


2.2.4 looks correct : "Assigning a blueprint definition to a management group means the assignment object exists at the management group.
The deployment of artifacts still targets a subscription. To perform a management group assignment, the Create Or Update REST API must be
used and the request body must include a value for properties.scope to define the target subscription."
https://docs.microsoft.com/en-us/azure/governance/blueprints/overview#blueprint-assignment
upvoted 1 times

  anandpsg101 1 month, 2 weeks ago


2, 2, 4 is correct
upvoted 2 times

  nesith 1 month, 2 weeks ago


No, the correct answer is as provided (2, 1, 4). You can assign a single blueprint to multiple management groups; versions of the blueprint get saved at the
management group level. There is no need to define multiple definitions unless you have differing requirements for each management group.
upvoted 1 times

  PrashantGupta1616 1 month, 1 week ago


answer is correct

dropdown 1: 2
One management group for East, and one for West.

dropdown 2: 1
When creating a blueprint definition, you'll define where the blueprint is saved. Blueprints can be saved to a management group or subscription
that you have Contributor access to. If the location is a management group, the blueprint is available to assign to any child subscription of that
management group.
One definition as you plan to deploy a custom application to each subscription.

dropdown 3: 4
With Azure Blueprints, the relationship between the blueprint definition (what should be deployed) and the blueprint assignment (what was
deployed) is preserved.
One assignment for each subscription.
upvoted 3 times

  jhoomtv 1 month, 1 week ago


By your explanation, we should have 2 definitions, one for East and one for West. We will create one and save it twice.
upvoted 1 times

  DragonBlake 1 month, 1 week ago


Answer is correct. Blueprint definition is like a source code. Blueprint definition can be parameterized.
upvoted 3 times

  Bhuvy 1 month ago


Ans is 2,2,2
Management group : Each tenant has one management group, for east 1, for west 1
Definition : Saving the blueprint is nothing but definition, since we have two management group under two different tenant, we need to save it to
two times. (Which is nothing but definition)
Assignment : You can assign the blueprint to multiple subscriptions under a management group, since we have defined the blueprint under
management group, you can select both the subscription under the management group. Since two management group, hence two assignments.)
upvoted 2 times

  bbartek 3 weeks ago


Assigning to a management group, you still need to specify some subscription from this management group as target (see
https://docs.microsoft.com/en-us/azure/governance/blueprints/overview), so 4 assignments are needed.
upvoted 1 times

  milind8451 1 month ago


2,2,4 is correct. You need 2 management groups for 2 Tenants, You need to create 2 definitions as you have 2 management groups so need 1 for
each. 4 Assignments because you have 4 subscription and you need to assign it to all 4 subscriptions.
upvoted 1 times

  shtech 1 month ago


The given answer is correct.
upvoted 1 times

  Blaaa 2 weeks, 6 days ago


2 2 4 is the correct answer
upvoted 2 times

  zengzhen 2 weeks, 5 days ago


Answer is correct.
You need 2 management groups, one for root and another for the subs.
Only 1 definition is needed, as the definition is located at the management group.
4 assignments, one for each sub.
upvoted 2 times

  glam 1 week, 5 days ago


2 Management group (1 root for each tenant)
2 Definitions (1 for each Management Group)
4 Assignments (1 for each sub with the application)
upvoted 2 times

  SamSmith 1 week, 4 days ago


2 2 4 is correct, as you need to create 2 blueprint definitions - one for each directory
upvoted 2 times


Question #2 Topic 2

You have an Azure Active Directory (Azure AD) tenant.


You plan to deploy Azure Cosmos DB databases that will use the SQL API.
You need to recommend a solution to provide specific Azure AD user accounts with read access to the Cosmos DB databases.
What should you include in the recommendation?

A. shared access signatures (SAS) and conditional access policies

B. certificates and Azure Key Vault

C. a resource token and an Access control (IAM) role assignment

D. master keys and Azure Information Protection policies

Correct Answer: C
The Access control (IAM) pane in the Azure portal is used to configure role-based access control on Azure Cosmos resources. The roles are
applied to users, groups, service principals, and managed identities in Active Directory. You can use built-in roles or custom roles for individuals
and groups. The following screenshot shows Active Directory integration (RBAC) using access control (IAM) in the Azure portal:

Reference:
https://docs.microsoft.com/en-us/azure/cosmos-db/role-based-access-control
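
The IAM half of answer C can be sketched with the Azure CLI: the built-in "Cosmos DB Account Reader Role" grants read-only access on the account, and resource tokens are then issued per user from the data plane. The user, subscription, resource group, and account names below are placeholders.

```shell
# Sketch: grant a specific Azure AD user read access on the Cosmos DB
# account via Access control (IAM). All IDs are placeholders.
az role assignment create \
  --assignee user@contoso.com \
  --role "Cosmos DB Account Reader Role" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.DocumentDB/databaseAccounts/<account>"
```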

  pattasana 2 months ago


Given answer is correct
upvoted 12 times

  sanketshah 1 month ago


given answer is correct.
upvoted 1 times

  anandpsg101 1 month, 1 week ago


Correct answer
upvoted 2 times

  peacegrace 1 month, 1 week ago


Answer should be correct based on this :
https://docs.microsoft.com/en-us/azure/cosmos-db/secure-access-to-
data#:~:text=Open%20the%20Azure%20portal%2C%20and,user%2C%20group%2C%20or%20application.
upvoted 1 times

  Blaaa 2 weeks, 6 days ago


C is correct
upvoted 1 times

  glam 1 week, 5 days ago


C. a resource token and an Access control (IAM) role assignment
upvoted 1 times


Question #3 Topic 2

HOTSPOT -
You need to design a resource governance solution for an Azure subscription. The solution must meet the following requirements:
✑ Ensure that all ExpressRoute resources are created in a resource group named RG1.
✑ Delegate the creation of the ExpressRoute resources to an Azure Active Directory (Azure AD) group named Networking.
✑ Use the principle of least privilege.
What should you include in the solution? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:

Box 1: An Azure policy assignment at the subscription level that has an exclusion
Box 2: A custom RBAC role assignment at the level of RG1
Azure role-based access control (Azure RBAC) is the authorization system you use to manage access to Azure resources. To grant access, you
assign roles to users, groups, service principals, or managed identities at a particular scope.
Reference:
https://docs.microsoft.com/en-us/azure/governance/policy/tutorials/create-and-manage
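
The deny-with-exclusion pattern from Box 1 can be sketched as follows. This is an illustrative outline: the policy and assignment names are made up, and the subscription ID is a placeholder. The definition denies ExpressRoute circuits, and the assignment excludes RG1 via --not-scopes so the Networking group's custom role (Box 2) can create them there.

```shell
# Sketch: deny ExpressRoute circuits everywhere except RG1.
az policy definition create \
  --name deny-expressroute \
  --rules '{
    "if": { "field": "type", "equals": "Microsoft.Network/expressRouteCircuits" },
    "then": { "effect": "deny" }
  }'

az policy assignment create \
  --name deny-expressroute-sub \
  --policy deny-expressroute \
  --scope "/subscriptions/<sub-id>" \
  --not-scopes "/subscriptions/<sub-id>/resourceGroups/RG1"
```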

  arseyam 2 months ago


Correct answer
upvoted 11 times

  milind8451 1 month ago



Why do you need an exclusion? Simply create a policy that allows creating ExpressRoute in a particular RG, and assign it to the subscription. What exclusion
is needed here?

upvoted 1 times

  bbartek 3 weeks ago


Policies have allow by default, deny explicitly model. Hence, if you want to accomplish this scenario, you need to explicitly deny creating
ExpressRoute for an entire subscription, with the exclusion for RG1
upvoted 3 times

  openidshanks1 1 month ago


create policy to deny creation of express route and exclude rg1 from the policy, assign this to the subscription
upvoted 7 times

  zeeshankaizer 3 weeks ago


Shouldn't it be RBAC role assignment at subscription level?
upvoted 1 times

  meonyahoo 2 weeks, 2 days ago


Not required as only RG1 is allowed for creating resource
upvoted 1 times

  MikeHugeNerd 2 weeks, 2 days ago


No. At the resource group level ensures the principle of least privilege.
upvoted 3 times

  Blaaa 2 weeks, 6 days ago


Correct answers
upvoted 1 times

  glam 1 week, 5 days ago


Box 1: An Azure policy assignment at the subscription level that has an exclusion
Box 2: A custom RBAC role assignment at the level of RG1
upvoted 1 times


Question #4 Topic 2

You have an Azure Active Directory (Azure AD) tenant and Windows 10 devices.
You configure a conditional access policy as shown in the exhibit. (Click the Exhibit tab.)

What is the result of the policy?

A. All users will always be prompted for multi-factor authentication (MFA).

B. Users will be prompted for multi-factor authentication (MFA) only when they sign in from devices that are NOT joined to Azure AD.

C. All users will be able to sign in without using multi-factor authentication (MFA).

D. Users will be prompted for multi-factor authentication (MFA) only when they sign in from devices that are joined to Azure AD.

Correct Answer: B
Either the device should be joined to Azure AD or MFA must be used.

  ruckii 2 months ago


The policy is not enabled. It's in off state as I see
upvoted 19 times

  quyennv 2 months ago


yes :)) the result is C
upvoted 6 times

  andyR 2 months ago


correct - answer is C
upvoted 7 times

  sanketshah 1 month ago


Correct answer C
upvoted 2 times


  TEMPKAKAM 1 month ago


Correct Answer:C
Explanation:
In this scenario, the enable policy is set to Off in the image. So, this policy will have no impact.
https://docs.microsoft.com/en-us/azure/active-directory/conditional-access/overview
upvoted 2 times

  Arminyip 2 months ago


should be D
upvoted 1 times

  guybank 2 months ago


Not really, only one condition needs to apply; if they have a managed device it won't prompt.
upvoted 1 times

  eloylb 2 months ago


The right answer is C because the policy is Off
upvoted 6 times

  pattasana 2 months ago


B is correct
upvoted 1 times

  nesith 1 month, 2 weeks ago


B is not correct, policy is in an off state, correct answer is C
upvoted 1 times

  Balti 2 months ago


B. The question is about the effect of the policy, the fact that it's disabled isn't relevant. If the question was something like "what experience would
user A have with this configuration" then whether or not it was enabled would be relevant.
upvoted 3 times

  bingomutant 1 month, 2 weeks ago


so enabled would be same answer ?
upvoted 1 times

  nesith 1 month, 2 weeks ago


if the policy was enabled, the answer would be B, since one of MFA or domain join must be true
upvoted 1 times

  uzairahm007 2 months ago


If the policy is not enabled it would not have any effect so it makes it very relevant
upvoted 1 times

  bingomutant 1 month, 2 weeks ago


agreed -
upvoted 1 times

  Rayrichi 1 month, 3 weeks ago


C is correct. can be also tested easily in Azure.
upvoted 2 times

  Ziegler 1 month, 1 week ago


Correct answer is B based on study of the following
https://docs.microsoft.com/en-us/azure/active-directory/conditional-access/concept-conditional-access-report-only
https://docs.microsoft.com/en-us/azure/active-directory/conditional-access/concept-conditional-access-cloud-apps
https://docs.microsoft.com/en-us/azure/active-directory/conditional-access/concept-conditional-access-grant
upvoted 2 times

  Ideasky 1 month ago


you should check the Conditional Access tab; the answer should be C, since the Enable policy is off. If the enable status is On or Report-only, the policy
takes effect; otherwise the policy is disabled.
upvoted 2 times

  azy 1 month, 1 week ago


the answer is correct: https://github.com/MicrosoftDocs/azure-docs/issues/57225
upvoted 1 times

  M88778 4 weeks ago


the real question here is; was it human error to leave the policy as off?
upvoted 1 times

  M88778 4 weeks ago


It asked about the result of the conditional access. I can understand how people infer that, with the policy as a whole being off, nothing will happen. I
think it is a poorly written question. I would go with B because I don't think they are asking whether the policy is on or off; I think they are asking about
the behaviour of the conditional access settings that are ticked.

upvoted 4 times

  rizabeer 2 weeks, 6 days ago


I don't agree @M88778, this is exactly what they are doing; they are testing your experience. As soon as you see that the policy is not enabled, no matter
what's configured, there is no policy effect. From here: https://docs.microsoft.com/en-us/azure/active-directory/authentication/tutorial-enable-azure-mfa
"Conditional Access policies can be set to Report-only if you want to see how the configuration would impact users, or Off if you don't want to use the
policy right now. As a test group of users was targeted for this tutorial, let's enable the policy and then test Azure AD Multi-Factor Authentication.

Set the Enable policy toggle to On.


To apply the Conditional Access policy, select Create."
I hope this helps, so the answer should be C
upvoted 1 times

  Blaaa 2 weeks, 6 days ago


C is correct
upvoted 1 times

  AWS56 2 weeks ago


Agree, B is the answer
upvoted 1 times

  glam 1 week, 1 day ago


AWS56 1 day, 17 hours ago
This is incorrect... 100% doesn't make sense

Do check the "Enable Policy" status..


upvoted 1 times

  glam 1 week, 5 days ago


C. All users will be able to sign in without using multi-factor authentication (MFA).
[As policy is Off]
upvoted 2 times

  stylocool 4 days, 10 hours ago


Answer is correct.
Report-only mode is a new Conditional Access policy state that allows administrators to evaluate the impact of Conditional Access policies before
enabling them in their environment.
https://docs.microsoft.com/en-us/azure/active-directory/conditional-access/concept-conditional-access-report-only
upvoted 1 times

  aj 3 days, 11 hours ago


B is correct. Answering C is over-analysing it; it's just a badly worded question, they aren't trying to catch you out. The question asks what is the
effect of the policy, not 'what would happen if the portal UI was in the following state'. Typical, I'm afraid, of the terrible quality of questions in
these sorts of exams.
upvoted 1 times

  sejalo 8 hours, 54 minutes ago


the question says "What is the result of the policy?" so the answer should be C
upvoted 1 times

  FK2974 4 days, 6 hours ago


Yes C is correct!!
upvoted 1 times


Question #5 Topic 2

You are designing an Azure resource deployment that will use Azure Resource Manager templates. The deployment will use Azure Key Vault to
store secrets.
You need to recommend a solution to meet the following requirements:
✑ Prevent the IT staff that will perform the deployment from retrieving the secrets directly from Key Vault.
✑ Use the principle of least privilege.
Which two actions should you recommend? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Create a Key Vault access policy that allows all get key permissions, get secret permissions, and get certificate permissions.

B. From Access policies in Key Vault, enable access to the Azure Resource Manager for template deployment.

C. Create a Key Vault access policy that allows all list key permissions, list secret permissions, and list certificate permissions.

D. Assign the IT staff a custom role that includes the Microsoft.KeyVault/Vaults/Deploy/Action permission.

E. Assign the Key Vault Contributor role to the IT staff.

Correct Answer: BD
B: To access a key vault during template deployment, set enabledForTemplateDeployment on the key vault to true.
D: The user who deploys the template must have the Microsoft.KeyVault/vaults/deploy/action permission for the scope of the resource group
and key vault.
Incorrect Answers:
E: To grant access to a user to manage key vaults, you assign a predefined Key Vault Contributor role to the user at a specific scope.
If a user has Contributor permissions to a key vault management plane, the user can grant themselves access to the data plane by setting a Key
Vault access policy. You should tightly control who has Contributor role access to your key vaults. Ensure that only authorized persons can
access and manage your key vaults, keys, secrets, and certificates.
Reference:
https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/key-vault-parameter
https://docs.microsoft.com/en-us/azure/key-vault/general/overview-security
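
Both recommended actions (B and D) can be sketched with the Azure CLI. This is an outline only: the vault name, role name, and subscription ID are placeholders, and a real custom role would be scoped to fit your governance model.

```shell
# Sketch (B): allow ARM template deployments to reference vault secrets.
az keyvault update \
  --name MyVault \
  --enabled-for-template-deployment true

# Sketch (D): custom role carrying only the deploy/action permission,
# so IT staff can reference secrets during deployment but cannot
# read them directly (no get/list data-plane access).
az role definition create --role-definition '{
  "Name": "Key Vault Template Deployer",
  "Description": "Can reference Key Vault secrets during deployments only",
  "Actions": [ "Microsoft.KeyVault/vaults/deploy/action" ],
  "AssignableScopes": [ "/subscriptions/<sub-id>" ]
}'
```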

  certmonster 3 months, 1 week ago


The answers are correct.
upvoted 16 times

  sanketshah 1 month ago


given answer is correct.
upvoted 1 times

  Virendrak 3 months, 1 week ago


The answers are correct:
1. On access policy page of azure key vault check the option "Azure Resource Manager for template deployment"

Enable Access to:


Azure Virtual Machines for deployment
Azure Resource Manager for template deployment
Azure Disk Encryption for volume encryption
2. Add a custom role for IT staff
upvoted 4 times

  JustDiscussing 1 month, 3 weeks ago


in exam this week
upvoted 4 times

  alphamode 3 weeks, 3 days ago


The answer is correct and also mentioned clearly at this Azure documentation link:
https://docs.microsoft.com/en-us/azure/azure-resource-manager/templates/key-vault-parameter?tabs=azure-cli
upvoted 1 times

  Blaaa 2 weeks, 6 days ago


Correct answers
upvoted 1 times

  navaro 2 weeks, 3 days ago


Correct answers are B & E. IT staff need to be able to DEPLOY but not have access to the secrets. That's exactly what the Key Vault Contributor role does.


upvoted 1 times

  milind8451 2 weeks, 2 days ago


Answers are correct
upvoted 1 times

  glam 1 week, 5 days ago


B&D

B. From Access policies in Key Vault, enable access to the Azure Resource Manager for template deployment.
D. Assign the IT staff a custom role that includes the Microsoft.KeyVault/Vaults/Deploy/Action permission.
upvoted 1 times


Question #6 Topic 2

You have an Azure subscription that contains resources in three Azure regions.
You need to implement Azure Key Vault to meet the following requirements:
✑ In the event of a regional outage, all keys must be readable.
✑ All the resources in the subscription must be able to access Key Vault.
✑ The number of Key Vault resources to be deployed and managed must be minimized.
How many instances of Key Vault should you implement?

A. 1

B. 2

C. 3

D. 6

Correct Answer: A
The contents of your key vault are replicated within the region and to a secondary region at least 150 miles away but within the same
geography. This maintains high durability of your keys and secrets. See the Azure paired regions document for details on specific region pairs.
Example: Secrets that must be shared by your application in both Europe West and Europe North. Minimize these as much as you can. Put these
in a key vault in either of the two regions. Use the same URI from both regions. Microsoft will fail over the Key Vault service internally.
Reference:
https://docs.microsoft.com/en-us/azure/key-vault/general/disaster-recovery-guidance
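
The "use the same URI from both regions" point rests on the vault's DNS name being globally unique: any resource in the subscription addresses the vault by the same endpoint, and Microsoft fails that endpoint over internally during a regional outage. A hedged sketch (vault and secret names are placeholders):

```shell
# Sketch: the vault URI https://myvault.vault.azure.net is global,
# so the same read works from resources in any region.
az keyvault secret show \
  --vault-name myvault \
  --name app-connection-string \
  --query value
```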

  VikasA 4 months, 3 weeks ago


I think we will need 3 key vaults in 3 regions. During a regional outage, Azure automatically re-routes to the secondary region within the same geography. As
the requested regions are 3, we can't fulfil this with 1 vault.
upvoted 19 times

  Rickii 4 months, 3 weeks ago


The contents of your key vault are replicated within the region and to a secondary region at least 150 miles away but within the same
geography to maintain high durability of your keys and secrets. So even if you have 1 it gets replicated. So the answer 1 seems correct.
upvoted 6 times

  NickGR 1 month, 2 weeks ago


The answer is correct. The key in the question is the statement that "In the event of a regional outage, all keys must be readable."

From the DR guidance, during a failover your key vault is in read-only mode and is accessible. Thus, it is not needed to create extra key vaults in
every region.

https://docs.microsoft.com/en-us/azure/key-vault/general/disaster-recovery-guidance
upvoted 7 times

  nexnexnex 3 weeks, 3 days ago


Key point is 'All the resources in the subscription must be able to access Key Vault'

So although some resources may use a Key Vault in a different region, it's a bad practice and will not work for VMs
upvoted 1 times

  Marang73 4 months ago


"You have an Azure subscription that contains resources in three Azure regions" ... so 3 Key Vaults. Each of the regions has failover.
upvoted 1 times

  levianthan 4 months, 2 weeks ago


I would insist on 3 vaults. Most services require the vault in same region.
upvoted 3 times

  ingoo 4 months, 2 weeks ago


Each region needs to have its own key vault with its secrets. I can't deploy on region A using a keyvault from region B, therefore 3
upvoted 7 times

  Remco 4 months, 1 week ago


Difficult one. Access to a key vault is global, but recommended is to create a key vault in the same region as your application.
So, must we choose for technical or best practice ...
upvoted 3 times

  Raj40 4 months ago


C is correct answer. Refer https://docs.microsoft.com/en-us/azure/virtual-machines/windows/disk-encryption-key-vault

upvoted 10 times

  BoxMan 2 months, 2 weeks ago


This is for Disk Encryption so probably best not applied to this Q due to its wording.
upvoted 1 times

  bbartek 2 weeks, 6 days ago


Also, I just realized, the question says "keys", not "secrets" or "certificates", which can suggest, that it might be actually used for encrypting
purposes.
upvoted 1 times

  bbartek 3 weeks ago


Yeah, but isn't a VM with Disk Encryption enabled still a resource? And the question says "All resources"
upvoted 1 times

  Samu74 4 months ago


Based on your link I agree. Necessary to fulfill the requirement "All the resources in the subscription must be able to access Key Vault"
upvoted 2 times

  Samu74 4 months ago


Specifically the phrase "Your key vault and VMs must be in the same subscription. Also, to ensure that encryption secrets don't cross regional
boundaries, Azure Disk Encryption requires the Key Vault and the VMs to be co-located in the same region"
upvoted 2 times

  ShubhenduDas 3 months, 3 weeks ago


To make sure the co-location policy, there should be 3 keyvaults for three region
upvoted 1 times

  pentum7 3 months, 2 weeks ago


C3

"to ensure that encryption secrets don't cross regional boundaries, Azure Disk Encryption requires the Key Vault and the VMs to be co-located in
the same region. Create and use a Key Vault that is in the same subscription and region as the VMs to be encrypted."
upvoted 1 times

  pentum7 2 months, 3 weeks ago


sorry, 1 is correct

"If individual components within the key vault service fail, alternate components within the region step in to serve your request to make sure
that there is no degradation of functionality. You don't need to take any action to start this process, it happens automatically and will be
transparent to you.

In the rare event that an entire Azure region is unavailable, the requests that you make of Azure Key Vault in that region are automatically
routed (failed over) to a secondary region. When the primary region is available again, requests are routed back (failed back) to the primary
region. Again, you don't need to take any action because this happens automatically."
upvoted 2 times

  BoxMan 2 months, 2 weeks ago


I agree but I'm still not 100%. 1 - as you address Key Vault with URIs which are global i.e.

Global: <vault-name>.vault.azure.net:443

Key Vault names must be Globally unique:

https://docs.microsoft.com/en-us/azure/key-vault/general/about-keys-secrets-certificates#objects-identifiers-and-versioning

"Vault names and Managed HSM pool names are selected by the user and are globally unique."
upvoted 2 times

  passtest100 3 months, 1 week ago


1 should be enough.
There is a paradox in the logic of having 3 key vaults: if the key vault in one region fails, the resources in that region will not access the other key
vaults in the other two regions, according to the logic that the key vault and VM should be in the same region. But then the requirement that all the
resources can still access the key vault is not fulfilled.
upvoted 5 times

  Virendrak 3 months, 1 week ago


The question is not talking about VMs or disk encryption. It says to minimize the resources, and all resources should be able to read. In this
scenario, 1 would be enough.
upvoted 5 times

  RVR 1 month, 4 weeks ago


"all resource should be able to read" is this met by having just one key vault in a region and there is a regional outage?
upvoted 1 times

  AmarReddy 3 months ago


1
In the rare event that an entire Azure region is unavailable, the requests that you make of Azure Key Vault in that region are automatically routed

(failed over) to a secondary region. When the primary region is available again, requests are routed back (failed back) to the primary region. Again,
you don't need to take any action because this happens automatically.
upvoted 9 times

  arseyam 2 months, 2 weeks ago


A) is correct. One key vault is enough.
upvoted 5 times

  arseyam 2 months ago


An example of a centralized key vault deployment
https://devblogs.microsoft.com/premier-developer/centralized-vm-certificate-deployment-across-multiple-regions-with-arm-templates/
upvoted 1 times

  cookie66 2 months, 1 week ago


Must have 3 vaults, KV is a region specific service it can't serve request cross region.
upvoted 3 times

  NickGR 2 months ago


Correct, Key Vault is a Region specific service, so 3 key vaults are required.
upvoted 1 times

  beedle 2 months, 1 week ago


it should be 1

https://github.com/uglide/azure-content/blob/master/articles/key-vault/key-vault-disaster-recovery-guidance.md
upvoted 2 times

  Abbas 2 months ago


Answer is 3. The only time you need a separate Key Vault instance is when the resources/entities accessing it are in another location (data
centre/region).
The SLA Microsoft provide is unsurprisingly good: https://docs.microsoft.com/en-gb/azure/key-vault/key-vault-disaster-recovery-guidance. Hence
only one key vault per region should do.
upvoted 2 times

  tmurfet 1 month, 3 weeks ago


quoting from: https://docs.microsoft.com/en-us/azure/key-vault/general/disaster-recovery-guidance ....
"In the rare event that an entire Azure region is unavailable, the requests that you make of Azure Key Vault **in that region** are automatically
routed (failed over) to a secondary region except in the case of the Brazil South region" Noting the three words between asterisks, we see that one
vault will not be enough if three regions are in play and one region fails.
upvoted 1 times

  gcpjay 1 month, 3 weeks ago


Agree. 3 instances for each region.
https://github.com/uglide/azure-content/blob/master/articles/key-vault/key-vault-disaster-recovery-guidance.md
upvoted 2 times

  nesith 1 month, 2 weeks ago


That's a possible solution but not the correct one for this question; it states you must minimize the resources to be deployed and managed. There is
no technical requirement for 3 KVs, as KV endpoints are global, i.e. a resource in region X can access a KV in region Y.
upvoted 1 times

  LeonLeon 1 month, 3 weeks ago


“ resources to be deployed and managed must be minimized”
Just one with replication. The question is not what’s the best to do!
upvoted 2 times

  DarkChain 1 month, 3 weeks ago


If individual components within the key vault service fail, alternate components within the region step in to serve your request to make sure that
there is no degradation of functionality. You don't need to take any action to start this process, it happens automatically and will be transparent to
you.

In the rare event that an entire Azure region is unavailable, the requests that you make of Azure Key Vault in that region are automatically routed
(failed over) to a secondary region except in the case of the Brazil South region. When the primary region is available again, requests are routed
back (failed back) to the primary region. Again, you don't need to take any action because this happens automatically. _ From Microsoft Article.
Answer is 1.
upvoted 1 times

  shashu07 1 month, 2 weeks ago


Correct Answer: A
As per Article "Azure Key Vault availability and redundancy"
https://docs.microsoft.com/en-us/azure/key-vault/general/disaster-recovery-guidance

The contents of your key vault are replicated within the region and to a secondary region at least 150 miles away
upvoted 4 times

  kktpkktp 1 month ago


1 is correct.


upvoted 1 times

  Jinder 3 weeks, 5 days ago


Answer must be 3, because key-vault can not be accessible across the regions. Three regions so 3 key-vaults. If we have 1 key vault in any of the 3
regions, then how resources belonging to the other 2 regions, not having a local key vault, will access the single provisioned key vault. Having a
backup in the secondary region is a separate thing to support business continuity, which will come into picture when the local key-vault in any
region fails.
upvoted 4 times

  rizabeer 2 weeks, 6 days ago


Again to trump us, you need to deploy Keyvaults for each region, it does not make sense to deploy keyvault in one region for all three to use. Now
the part where it mentions about "read only" so any time any of the three regions go down, the secondary region(backup of keyvault) will be
available in Read only mode. The answer should be 3.
upvoted 2 times

  Blaaa 2 weeks, 6 days ago


C is correct answer
upvoted 2 times

  milind8451 2 weeks, 2 days ago


A) 1 is right ans as the key vault will be replicated to another region as per high availability from Azure. and the failed over region key vault will be
read only.

https://docs.microsoft.com/en-us/azure/key-vault/general/disaster-recovery-guidance
upvoted 1 times

  milind8451 2 weeks, 2 days ago


A) 1 is right ans as the key vault will be replicated to another region as per high availability from Azure. and the failed over region key vault will be
read only. All 3 regions can read this failed over Key vault which answers the ques.

https://docs.microsoft.com/en-us/azure/key-vault/general/disaster-recovery-guidance
upvoted 1 times

  AWS56 2 weeks ago


ANSWER: A-1 (Given answer is correct)

Replicates Key Vault secrets between Azure Key Vaults in the same subscription.
Azure does not yet offer a way to replicate Key Vault secrets between Key Vaults. Although Azure guarantees a certain level of availability for Key
Vault resources, and they offer replication of Key Vaults between paired regions, these protections are insufficient for the following reasons:

Azure will only recover Key Vault resources as part of the recovery of an entire region. For most customers, individual Key Vault resources cannot be
recovered if they are deleted or corrupted.
Key Vaults recovered by Azure as part of a disaster recovery effort are not writable. Key Vault actions are limited to read operations.
Azure's Key Vault replication cannot recover individual secrets
Data protection (backup and restore operations) are not possible with Azure's Key Vault replication
upvoted 1 times

  glam 1 week, 5 days ago


C. 3

Refer https://docs.microsoft.com/en-us/azure/virtual-machines/windows/disk-encryption-key-vault
Warning

Your key vault and VMs must be in the same subscription. Also, to ensure that encryption secrets don't cross regional boundaries, Azure Disk
Encryption requires the Key Vault and the VMs to be co-located in the same region. Create and use a Key Vault that is in the same subscription and
region as the VMs to be encrypted.
upvoted 2 times

  FK2974 4 days, 6 hours ago


A Correct!!
upvoted 1 times
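Side note on the co-location argument quoted above: if Azure Disk Encryption requires the Key Vault and the VMs it serves to be in the same region, the minimum vault count is simply the number of distinct VM regions. A minimal sketch of that reasoning (function name and region list are illustrative, not from the question):

```python
def vaults_needed(vm_regions):
    """Minimum Key Vault count if each VM's vault must share its region.

    Assumes the Azure Disk Encryption constraint quoted in the comments:
    the vault and the VMs it serves must be co-located in the same region.
    """
    return len(set(vm_regions))

# Three regions in play -> three vaults, regardless of how many VMs.
print(vaults_needed(["eastus", "westeurope", "southeastasia", "eastus"]))  # → 3
```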


Question #7 Topic 2

You have an Azure Active Directory (Azure AD) tenant.


You plan to provide users with access to shared files by using Azure Storage. The users will be provided with different levels of access to various
Azure file shares based on their user account or their group membership.
You need to recommend which additional Azure services must be used to support the planned deployment.
What should you include in the recommendation?

A. an Azure AD enterprise application

B. Azure Information Protection

C. an Azure AD Domain Services (Azure AD DS) instance

D. an Azure Front Door instance

Correct Answer: C
Azure Files supports identity-based authentication over Server Message Block (SMB) through two types of Domain Services: on-premises Active
Directory Domain Services (AD DS) and Azure Active Directory Domain Services (Azure AD DS).
Reference:
https://docs.microsoft.com/en-us/azure/storage/files/storage-files-identity-auth-active-directory-domain-service-enable

  nlr 4 months ago


Given answer is correct
upvoted 16 times

  sanketshah 1 month ago


given answer is correct.
upvoted 1 times

  tmurfet 3 months, 2 weeks ago


The question does not state that the environment is hybrid and connected to any on-premises infrastructure -- so answer A could be correct if it's
an Azure AD only environment. You would need an enterprise SMB client, an Azure VDI Windows client for example.
upvoted 2 times

  certmonster 3 months, 2 weeks ago


I agree on this one.
upvoted 1 times

  tmurfet 3 months, 2 weeks ago


Ignore my previous post -- I made an incorrect assumption regarding Azure AD support for Azure files. It's not supported.
upvoted 1 times

  pentum7 3 months, 2 weeks ago


answer is correct
upvoted 2 times

  pentum7 3 months, 2 weeks ago


AD DS https://docs.microsoft.com/en-us/azure/active-directory-domain-services/overview "You can also use existing groups and user accounts to
secure access to resources."
upvoted 2 times

  JustDiscussing 1 month, 3 weeks ago


in exam this week
upvoted 2 times

  Blaaa 2 weeks, 6 days ago


Correct answers
upvoted 1 times

  Jinder 2 weeks, 4 days ago


Can someone please explain where you get a clue to use Azure AD DS. Question statement never talked about hybrid/on-prem users and groups.
Thanks
upvoted 1 times

  fred00r 1 week, 4 days ago


I think the key here is:
"Azure Files supports identity-based authentication over Server Message Block (SMB) through two types of Domain Services: on-premises Active

Directory Domain Services (AD DS) and Azure Active Directory Domain Services (Azure AD DS)"
- https://docs.microsoft.com/en-us/azure/storage/files/storage-files-identity-auth-active-directory-domain-service-enable?tabs=azure-portal
So AZ DS is set.
upvoted 2 times

  Jinder 14 hours, 8 minutes ago


Thanks. That makes sense.
upvoted 1 times

  glam 1 week, 5 days ago


C. an Azure AD Domain Services (Azure AD DS) instance
upvoted 1 times


Question #8 Topic 2

DRAG DROP -
Your company has users who work remotely from laptops.
You plan to move some of the applications accessed by the remote users to Azure virtual machines. The users will access the applications in
Azure by using a point-to-site VPN connection. You will use certificates generated from an on-premises-based Certification authority (CA).
You need to recommend which certificates are required for the deployment.
What should you include in the recommendation? To answer, drag the appropriate certificates to the correct targets. Each certificate may be used
once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:

Correct Answer:

  MaxBlanche 2 months ago


The last answer is wrong, the VPN Gateway should have the Root certificate with the public key installed (https://docs.microsoft.com/en-
us/azure/vpn-gateway/vpn-gateway-howto-point-to-site-rm-ps#upload).
upvoted 42 times

  Rayrichi 1 month, 2 weeks ago


agree. Last answer is Root certificate with the public key
upvoted 2 times

  Nian 1 month, 4 weeks ago


I have these:
Box1: A user cert. that has the public key only
Box2: A user cert. that has the private key
Box3: A root CA cert. that has the public key only
upvoted 6 times

  tmurfet 1 month, 3 weeks ago


Agree, last answer is as per MaxBlanche.
upvoted 2 times

  JustDiscussing 1 month, 3 weeks ago


in exam this week


upvoted 1 times

  ZW9 1 month, 2 weeks ago


The same question by 301:
https://www.examtopics.com/discussions/microsoft/view/9289-exam-az-301-topic-6-question-6-discussion/
The answer is different than here.
upvoted 5 times

  Ideasky 2 weeks, 4 days ago


according to the official guide, we need install whole root cert which contains public and private key in client's laptop, we also need to export the
root's public key, then add in the azure vpn gateway configuration, so the questions answer should be
1. Root private key
2. User private key
3. Root public key
https://docs.microsoft.com/en-sg/azure/virtual-wan/certificates-point-to-site
upvoted 3 times

  MadEgg 2 weeks, 2 days ago


I Agree
to 1. On the laptop we need the privat key and we only export the public key.
upvoted 1 times

  milind8451 2 weeks, 2 days ago


Trusted root CA cert on laptop - A Root CA cert that has public key
User's personal store - A user Cert that has private Key
Azure VPN Gateway - A Root CA cert that has public key
upvoted 3 times

  gp777 2 weeks, 1 day ago


Root public
User Private
Root Public
upvoted 6 times

  glam 1 week, 5 days ago


Trusted root CA cert on laptop - A Root CA cert that has public key
User's personal store - A user Cert that has private Key
Azure VPN Gateway - A Root CA cert that has public key
upvoted 3 times


Question #9 Topic 2

HOTSPOT -
You are building an application that will run in a virtual machine (VM). The application will use Azure Managed Identity.
The application uses Azure Key Vault, Azure SQL Database, and Azure Cosmos DB.
You need to ensure the application can use secure credentials to access these services.
Which authentication method should you recommend? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:

Note: Managed identities for Azure resources is the new name for the service formerly known as Managed Service Identity (MSI).
Reference:
https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/overview

  NDP 2 months ago


It should be
Azure Managed Identity
Azure Managed Identity
HAMC
upvoted 3 times

  guybank 2 months ago



Last one should be Azure Managed Identity as well, the question says Authentication not Authorization, authorization for cosmos is via hmac
upvoted 13 times

  sogey 2 months ago


AMI for Cosmos
https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/tutorial-windows-vm-access-cosmos-db
https://docs.microsoft.com/en-us/azure/cosmos-db/database-security
upvoted 1 times

  andyR 2 months ago


AMI x 3
upvoted 8 times

  johndadas 1 month, 3 weeks ago


Key Vault -> RBAC
Azure SQL -> RBAC
Cosmos DB -> HMAC
upvoted 6 times

  ching1118 1 month, 2 weeks ago


This is correct. Refer the 301 question answer. "The question about authorization methods, not what identity to use, is already mentioned in the
question: The application will use Managed Service Identity (MSI)
Key Vault -> RBAC
Azure SQL -> RBAC
Cosmos DB -> HMAC (Azure Cosmos DB uses hash-based message authentication code (HMAC) for authorization)
https://docs.microsoft.com/en-us/azure/cosmos-db/database-security#how-does-azure-cosmos-db-secure-my-database"
upvoted 2 times

  ching1118 1 month, 2 weeks ago


This is incorrect. Sorry for the mistake. It should be AMI for all three
upvoted 1 times

  Rayrichi 1 month, 2 weeks ago


AMI * 3
Reference for Cosmos DB:
https://docs.microsoft.com/en-us/azure/cosmos-db/managed-identity-based-authentication
upvoted 2 times

  Reza666 1 month, 1 week ago


This is a tricky question. It asks the authentication methods which is AMI for all three boxes but there is a similar question in az-301 which asked
about authorisation methods which in that case RBAC, RBAC, HMAC are the correct answers.
upvoted 6 times

  Ziegler 1 month ago


AMI
AMI
AMI
based on https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/overview#what-azure-services-support-
the-feature
upvoted 2 times

  Blaaa 2 weeks, 6 days ago


Ami all 3
upvoted 2 times

  glam 1 week, 5 days ago


AMI
AMI
AMI
upvoted 1 times

  EgorAivazov 1 week, 4 days ago


I don't get it. In the question's body, it's written: "Which AUTHENTICATION method should you recommend?". However, in the image of the answer
area, it's clearly mentioned that the AUTHORIZATION method should be selected...

In case if you need to select the AUTHENTICATION types, then the answer should be
1. AMI
2. AMI
3. AMI

In case if you need to select the AUTHORIZATION type, then the answer should be
1. RBAC
2. RBAC
3. HMAC

So, which one is correct? Your comments are welcome!


upvoted 2 times
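For readers wondering what "HMAC" means concretely in the Cosmos DB answer: master-key authorization signs each REST request with HMAC-SHA256 over the verb, resource type, resource link, and date. A stdlib-only sketch of building that token (the key and resource values are made up; the string-to-sign layout follows the documented Cosmos DB REST format, but verify against the current docs before relying on it):

```python
import base64
import hashlib
import hmac
from urllib.parse import quote

def cosmos_auth_token(verb, resource_type, resource_link, date_rfc1123, master_key_b64):
    """Build a Cosmos DB master-key authorization token (HMAC-SHA256)."""
    key = base64.b64decode(master_key_b64)
    # String-to-sign: lowercase verb/type, resource link as-is, lowercase date,
    # then an empty line -- each part newline-terminated.
    string_to_sign = "\n".join(
        [verb.lower(), resource_type.lower(), resource_link, date_rfc1123.lower(), ""]
    ) + "\n"
    sig = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return quote(f"type=master&ver=1.0&sig={base64.b64encode(sig).decode()}", safe="")

# Hypothetical key/resource, just to show the shape of the token.
token = cosmos_auth_token(
    "GET", "dbs", "dbs/ToDoList", "Tue, 01 Nov 1994 08:12:31 GMT",
    base64.b64encode(b"not-a-real-key").decode(),
)
print(token[:16])  # URL-encoded "type=master&..." prefix
```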


  WuWon 1 week, 2 days ago


AMI, AMI, AMI
upvoted 1 times

  FK2974 4 days, 6 hours ago


AMI is correct for all three!!
upvoted 1 times


Question #10 Topic 2

You have an Azure subscription that contains a custom application named Application1. Application1 was developed by an external company
named Fabrikam,
Ltd. Developers at Fabrikam were assigned role-based access control (RBAC) permissions to the Application1 components. All users are licensed
for the
Microsoft 365 E5 plan.
You need to recommend a solution to verify whether the Fabrikam developers still require permissions to Application1. The solution must meet
the following requirements:
✑ To the manager of the developers, send a monthly email message that lists the access permissions to Application1.
✑ If the manager does not verify an access permission, automatically revoke that permission.
✑ Minimize development effort.
What should you recommend?

A. Create an Azure Automation runbook that runs the Get-AzureADUserAppRoleAssignment cmdlet.

B. Create an Azure Automation runbook that runs the Get-AzureRoleAssignment cmdlet.

C. In Azure Active Directory (Azure AD), create an access review of Application1.

D. In Azure Active Directory (AD) Privileged Identity Management, create a custom role assignment for the Application1 resources.

Correct Answer: C

  Tombarc 5 months, 1 week ago


The correct answer is C. The answer D leads you to believe it's correct as the access view and assignment are created using Privileged Access
Management (PIM), but in the last part, it mentions role assignment, which doesn't make any sense.
upvoted 11 times

  Alexevansigg 3 months, 4 weeks ago


Correct. Heres the Documentation on Access reviews:
https://docs.microsoft.com/en-us/azure/active-directory/governance/manage-user-access-with-access-reviews
upvoted 6 times

  pentum7 3 months, 2 weeks ago


Why are access reviews important?
"Azure AD enables you to collaborate with users from inside your organization and with external users. Users can join groups, invite guests,
connect to cloud apps, and work remotely from their work or personal devices. The convenience of using self-service has led to a need for better
access management capabilities."
upvoted 2 times

  sanketshah 1 month ago


given answer is correct.
upvoted 1 times

  bbartek 3 weeks ago


At first I was thinking it's a tricky one, because of the specified license, but according to MS Identity Governance is supported in this scenario:

Enterprise Mobility + Security E5/A5, Microsoft 365 E5/A5, Microsoft 365 E5/A5 Security, and Azure Active Directory Premium Plan 2 provide the
rights for a user to benefit from Azure Active Directory Identity Governance.

So answer C is correct.
upvoted 1 times

  Blaaa 2 weeks, 6 days ago


Correct answers
upvoted 1 times

  milind8451 2 weeks, 2 days ago


Right ans
upvoted 1 times

  glam 1 week, 5 days ago


C. In Azure Active Directory (Azure AD), create an access review of Application1.
upvoted 1 times

  FK2974 4 days, 6 hours ago


Yes C is correct!!
upvoted 1 times
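For the curious, an access review like the one in answer C can also be created through Microsoft Graph. A rough sketch of the request body only — the property names follow the Graph accessReviewScheduleDefinition resource as best I recall, so treat them as an assumption and check the current Graph reference; the IDs are placeholders:

```python
def access_review_definition(app_group_id, reviewer_manager_id):
    """Build a monthly, auto-apply access review body (Graph-style sketch)."""
    return {
        "displayName": "Application1 developer access",
        "scope": {"query": f"/groups/{app_group_id}/members", "queryType": "MicrosoftGraph"},
        "reviewers": [{"query": f"/users/{reviewer_manager_id}", "queryType": "MicrosoftGraph"}],
        "settings": {
            "autoApplyDecisionsEnabled": True,   # revoke automatically...
            "defaultDecision": "Deny",           # ...when the manager does not respond
            "recurrence": {
                "pattern": {"type": "absoluteMonthly", "interval": 1},
                "range": {"type": "noEnd", "startDate": "2021-03-01"},
            },
        },
    }

body = access_review_definition("<group-id>", "<manager-id>")
print(body["settings"]["defaultDecision"])  # Deny
```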


Question #11 Topic 2

DRAG DROP -
A company named Contoso, Ltd. has an Azure Active Directory (Azure AD) tenant that uses the Basic license.
You plan to deploy two applications to Azure. The applications have the requirements shown in the following table.

Which authentication strategy should you recommend for each application? To answer, drag the appropriate authentication strategies to the
correct applications.
Each authentication strategy may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view
content.
NOTE: Each correct selection is worth one point.
Select and Place:

Correct Answer:

Box 1: Azure AD V2.0 endpoint -


Microsoft identity platform is an evolution of the Azure Active Directory (Azure AD) developer platform. It allows developers to build
applications that sign in all
Microsoft identities and get tokens to call Microsoft APIs, such as Microsoft Graph, or APIs that developers have built. The Microsoft identity
platform consists of:
OAuth 2.0 and OpenID Connect standard-compliant authentication service that enables developers to authenticate any Microsoft identity,
including:
Work or school accounts (provisioned through Azure AD)
Personal Microsoft accounts (such as Skype, Xbox, and Outlook.com)
Social or local accounts (via Azure AD B2C)

Box 2: Azure AD B2C tenant -


Azure Active Directory B2C provides business-to-customer identity as a service. Your customers use their preferred social, enterprise, or local
account identities to get single sign-on access to your applications and APIs.
Azure Active Directory B2C (Azure AD B2C) integrates directly with Azure Multi-Factor Authentication so that you can add a second layer of
security to sign-up and sign-in experiences in your applications.
Reference:
https://docs.microsoft.com/en-us/azure/active-directory-b2c/active-directory-b2c-reference-mfa
https://docs.microsoft.com/en-us/azure/active-directory/develop/v2-overview

  mmmore 2 months ago



I believe the answers are the other way around:


Box 1: B2C
Box 2: V2 endpoint
upvoted 27 times

  sanketshah 1 month ago


B2C
V2 Endpoint
upvoted 2 times

  arseyam 2 months ago


Customer Azure AD B2C
Reporting Azure AD v2.0 endpoint
upvoted 3 times

  andyR 2 months ago


I believe customer: B2C tenant, reporting: V2.0 End point
upvoted 2 times

  Elecktrus 2 months ago


Why? Give an explanation about your answers. B2C is the only one that supports local accounts (that is, Contoso accounts), and manage the
user from Azure AD
upvoted 2 times

  she84516 1 month, 3 weeks ago


Box 1: B2C - personal accounts are obviously supported, also enable B2C for MFA seems possible, AAD v1 or v2 requires premium P1 license
to enable MFA, we have only Basic (https://docs.microsoft.com/en-us/azure/active-directory-b2c/multi-factor-authentication?pivots=b2c-
user-flow)
Box 2: AAD v2.0 - v1.0 does not support personal accounts, v2 - does. At the same time, integrating internal Contoso accounts to B2C would
be more difficult. B2C local accounts can't be named as Contoso accounts, as there would be limited use of these accounts when managing
Azure, or other usually internal services
upvoted 17 times

  suksworld 1 month, 2 weeks ago


I believe the answers should be :
Customer - Azure AD v2.0 endpoint
Reporting - Azure AD v2.0 endpoint
https://www.youtube.com/watch?v=aBMUxC4evhU
https://nicolgit.github.io/AzureAD-Endopoint-V1-vs-V2-comparison/
With a Basic Azure AD license, you can do MFA anyway: https://azure.microsoft.com/en-ca/pricing/details/active-directory/
upvoted 1 times

  tmfahim 1 month, 1 week ago


I think given answer is correct. Azure v2.0 endpoint with Basic license support MFA for all users, but not conditional access polcy. That's why, you
need B2C account to login using personal accounts.
https://docs.microsoft.com/en-ca/azure/active-directory/authentication/concept-mfa-licensing
upvoted 1 times

  fudu101 1 month ago


The v1.0 endpoint allows only work and school accounts to sign in to your application (Azure AD)
The Microsoft identity platform endpoint aka v2 allows work and school accounts from Azure AD and personal Microsoft accounts (MSA), such as
hotmail.com, outlook.com, and msn.com, to sign in. Ref: https://docs.microsoft.com/en-us/azure/active-directory/azuread-dev/azure-ad-endpoint-
comparison#who-can-sign-in
upvoted 2 times

  nexnexnex 3 weeks, 2 days ago


1: B2C or V2
- not V1, we need personal MS
- no requirements for 'local' or 'social' accounts, all users must use MFA, so V2 should be enough
2: V2
- not V1, we need personal MS
- not B2C - complexity with managing AD accounts
upvoted 1 times

  rizabeer 2 weeks, 6 days ago


I dont understand this, can someone please elaborate or point me to good KB so i can gain some understanding?
upvoted 1 times

  Blaaa 2 weeks, 6 days ago


B2C
V2.0
upvoted 1 times

  milind8451 2 weeks, 2 days ago


Customer : B2C
Reporting : V2 endpoint

V1 doesn't supports personal accounts.


upvoted 2 times

  glam 1 week, 5 days ago


Customer Azure AD B2C
Reporting Azure AD v2.0 endpoint
upvoted 1 times

  aj 3 days, 11 hours ago


its correct as stated; Azure AD V2 Endpoint & AD B2C. You cant manage the identities in AAD without B2C in the second box. AD V2 Endpoint is
needed for support of personal accounts (non-AAD).
upvoted 1 times
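As a concrete illustration of why the v2.0 (Microsoft identity platform) endpoint fits the reporting app: a single authorize URL under the /common tenant accepts both work/school and personal Microsoft accounts, which v1.0 does not. A sketch of building that URL (the client ID and redirect URI are placeholders):

```python
from urllib.parse import urlencode

def v2_authorize_url(client_id, redirect_uri, scopes):
    """Authorization-code request against the converged v2.0 endpoint.

    The /common tenant lets both Azure AD (work/school) accounts and
    personal Microsoft accounts sign in.
    """
    base = "https://login.microsoftonline.com/common/oauth2/v2.0/authorize"
    params = {
        "client_id": client_id,
        "response_type": "code",
        "redirect_uri": redirect_uri,
        "scope": " ".join(scopes),
    }
    return f"{base}?{urlencode(params)}"

url = v2_authorize_url("11111111-2222-3333-4444-555555555555",
                       "https://localhost/callback", ["openid", "User.Read"])
print(url.split("?")[0])  # the v2.0 authorize endpoint
```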


Question #12 Topic 2

HOTSPOT -
You manage a network that includes an on-premises Active Directory domain and an Azure Active Directory (Azure AD).
Employees are required to use different accounts when using on-premises or cloud resources. You must recommend a solution that lets
employees sign in to all company resources by using a single account. The solution must implement an identity provider.
You need to provide guidance on the different identity providers.
How should you describe each identity provider? To answer, select the appropriate description from each list in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:

Box1: User management occurs on-premises. Azure AD authenticates employees by using on-premises passwords.
Azure AD Domain Services for hybrid organizations
Organizations with a hybrid IT infrastructure consume a mix of cloud resources and on-premises resources. Such organizations synchronize
identity information from their on-premises directory to their Azure AD tenant. As hybrid organizations look to migrate more of their on-
premises applications to the cloud, especially legacy directory-aware applications, Azure AD Domain Services can be useful to them.
Example: Litware Corporation has deployed Azure AD Connect, to synchronize identity information from their on-premises directory to their
Azure AD tenant. The identity information that is synchronized includes user accounts, their credential hashes for authentication (password
hash sync) and group memberships.


User accounts, group memberships, and credentials from Litware's on-premises directory are synchronized to Azure AD via Azure AD Connect.
These user accounts, group memberships, and credentials are automatically available within the managed domain.
Box 2: User management occurs on-premises. The on-premises domain controller authenticates employee credentials.
You can federate your on-premises environment with Azure AD and use this federation for authentication and authorization. This sign-in method
ensures that all user authentication occurs on-premises.

Reference:
https://docs.microsoft.com/en-us/azure/active-directory-domain-services/active-directory-ds-overview
https://docs.microsoft.com/en-us/azure/active-directory/hybrid/whatis-fed

  mmmore 2 months ago


Correct
upvoted 13 times

  sanketshah 1 month ago


given answer is correct.
upvoted 1 times

  milind8451 2 weeks, 2 days ago


Right ans.

Sync identity denotes Pass thru authentication or Pass hash sync. Federated identity denotes ADFS and in ADFS, authentication happens at on-prem DCs.
upvoted 1 times

  sudipta0311 2 weeks ago


for Sync, Pass through authentication as well the authentication will be done in on premises AD...hence the authentication is done via on premises
domain controller? bit confused about the answer here
upvoted 1 times


  glam 1 week, 5 days ago


Box1: User management occurs on-premises. Azure AD authenticates employees by using on-premises passwords.
Box 2: User management occurs on-premises. The on-premises domain controller authenticates employee credentials.
upvoted 1 times

Question #13 Topic 2

HOTSPOT -
You configure the Diagnostics settings for an Azure SQL database as shown in the following exhibit.

Use the drop-down menus to select the answer choice that completes each statement based on the information presented in the graphic.


NOTE: Each correct selection is worth one point.


Hot Area:

Correct Answer:

  mmmore 2 months ago


Correct
upvoted 1 times

  Suseel 1 month, 2 weeks ago


Instead of Azure SQL Analytics Shouldn't it be Azure Log Analytics with SQL Analytics Solution Deployed
upvoted 1 times

  Rayrichi 1 month, 2 weeks ago


correct. See the discussion in : https://www.examtopics.com/exams/microsoft/az-301/view/10/
upvoted 3 times

  sanketshah 1 month ago


given answer is correct.
upvoted 1 times

  buanilk 2 weeks, 6 days ago


I think first step is to clear the checkbox "send to log analytics" and then to choose "stream to event hub".
At once we can only send the diagnostic logs to only one destination. We cannot select 2 destination at once.
upvoted 1 times

  MadEgg 2 weeks, 2 days ago


On the one hand you are right when looking at:
https://docs.microsoft.com/de-de/azure/azure-sql/database/metrics-diagnostic-telemetry-logging-streaming-export-configure?tabs=azure-portal
It states that you can send them to one destination.
BUT:
When you try it in the portal you can choose both at the same time. Maybe it creates two diagnostics settings or it just works with two destinations. I can't deploy it so if someone could test this that would help - but I'm sure the given answer here is correct.
upvoted 1 times

  milind8451 2 weeks, 2 days ago

Right ans.
upvoted 1 times

  glam 1 week, 5 days ago

box-1 stream to event hub
box-2 azure sql analytics
upvoted 1 times

Question #14 Topic 2

You plan to deploy an application named App1 that will run on five Azure virtual machines. Additional virtual machines will be deployed later to run App1.
You need to recommend a solution to meet the following requirements for the virtual machines that will run App1:
✑ Ensure that the virtual machines can authenticate to Azure Active Directory (Azure AD) to gain access to an Azure key vault, Azure Logic Apps instances, and an Azure SQL database.
✑ Avoid assigning new roles and permissions for Azure services when you deploy additional virtual machines.
✑ Avoid storing secrets and certificates on the virtual machines.
✑ Minimize administrative effort for managing identities.
Which type of identity should you include in the recommendation?

A. a service principal that is con gured to use a certi cate

B. a system-assigned managed identity

C. a service principal that is con gured to use a client secret

D. a user-assigned managed identity

Correct Answer: D
Managed identities for Azure resources is a feature of Azure Active Directory.
User-assigned managed identity can be shared. The same user-assigned managed identity can be associated with more than one Azure
resource.
Incorrect Answers:
B: System-assigned managed identity cannot be shared. It can only be associated with a single Azure resource.
Reference:
https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/overview

  mmmore 2 months ago


Correct
upvoted 6 times

  anandpsg101 1 month, 1 week ago


correct answer
upvoted 1 times

  Blaaa 2 weeks, 5 days ago


Correct
upvoted 1 times

  milind8451 2 weeks, 2 days ago


Right ans and right description too.
upvoted 1 times

  glam 1 week, 5 days ago


D. a user-assigned managed identity
upvoted 1 times

  FK2974 4 days, 6 hours ago


D is correct that make sense!!
upvoted 1 times
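On a VM, either flavor of managed identity is exercised the same way: ask the local Instance Metadata Service (IMDS) for a token, with no stored secret involved. A sketch of building that request for a user-assigned identity — the client_id is a placeholder, and while the endpoint and api-version come from the managed-identities docs, confirm them against the current version:

```python
from urllib.parse import urlencode

IMDS = "http://169.254.169.254/metadata/identity/oauth2/token"

def imds_token_request(resource, client_id=None):
    """Return (url, headers) for an IMDS managed-identity token request.

    Passing client_id selects a specific user-assigned identity; omit it
    for the system-assigned identity. Callers must send 'Metadata: true'.
    """
    params = {"api-version": "2018-02-01", "resource": resource}
    if client_id:
        params["client_id"] = client_id
    return f"{IMDS}?{urlencode(params)}", {"Metadata": "true"}

url, headers = imds_token_request("https://vault.azure.net",
                                  client_id="00000000-0000-0000-0000-000000000000")
print(headers["Metadata"])  # true
```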


Question #15 Topic 2

You are designing a large Azure environment that will contain many subscriptions.
You plan to use Azure Policy as part of a governance solution.
To which three scopes can you assign Azure Policy definitions? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.

A. management groups

B. subscriptions

C. Azure Active Directory (Azure AD) tenants

D. resource groups

E. Azure Active Directory (Azure AD) administrative units

F. compute resources

Correct Answer: ABD


Azure Policy evaluates resources in Azure by comparing the properties of those resources to business rules. Once your business rules have
been formed, the policy definition or initiative is assigned to any scope of resources that Azure supports, such as management groups,
subscriptions, resource groups, or individual resources.
Reference:
https://docs.microsoft.com/en-us/azure/governance/policy/overview
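As an illustrative sketch (not part of the exam material), the if/then shape of a policy rule and its "deny" effect can be modeled as follows; the allowed locations and the toy evaluator are assumptions for demonstration, not Azure's actual policy engine:

```python
# Simplified sketch of how an Azure Policy "deny" rule evaluates a resource.
# The rule dict mirrors the JSON policyRule shape; the evaluator is a toy.
ALLOWED_LOCATIONS = ["eastus", "westeurope"]

policy_rule = {
    "if": {"not": {"field": "location", "in": ALLOWED_LOCATIONS}},
    "then": {"effect": "deny"},
}

def evaluate(resource: dict, rule: dict) -> str:
    """Return the rule's effect ('deny') if the condition matches, else 'allow'."""
    cond = rule["if"]["not"]
    outside_allowed = resource.get(cond["field"]) not in cond["in"]
    return rule["then"]["effect"] if outside_allowed else "allow"
```

When such a definition is assigned at a management group, the same check applies to every resource in every child subscription of that group.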

  speedminer 5 months ago


from the link provided:
"Overview
Azure Policy evaluates resources in Azure by comparing the properties of those resources to business rules. These business rules, described in
JSON format, are known as policy definitions. To simplify management, several business rules can be grouped together to form a policy initiative
(sometimes called a policySet).

Once your business rules have been formed, the policy definition or initiative is assigned to any scope of resources that Azure supports, such as

**** management groups, subscriptions, resource groups, or individual resources****.

The assignment applies to all resources within the scope of that assignment. Subscopes can be excluded, if necessary."
upvoted 3 times

  hw121693 4 months, 1 week ago


why compute resource is not counted in individual resources?
upvoted 5 times

  tmurfet 3 months, 3 weeks ago


I think that it rarely makes sense to apply policy to an individual resource. See recommendations: https://docs.microsoft.com/en-
us/azure/governance/policy/overview#recommendations-for-managing-policies
upvoted 2 times

  David_986969 4 months, 1 week ago


Correct answer
upvoted 4 times

  MRBEAST 3 months ago


correct
upvoted 2 times

  mmmore 2 months, 1 week ago


correct
upvoted 2 times

  KumarPV 1 month, 3 weeks ago


Since the ask is for 3 response selections, it is AB&D. As such individual resources too shall apply.
upvoted 1 times

  Blaaa 2 weeks, 5 days ago


Correct answers
upvoted 1 times

  milind8451 2 weeks, 2 days ago


A, B, D are correct as these 3 are the top 3 selections for Azure policies but if we could choose 4 then individual resources can also be selected.
upvoted 1 times

  glam 1 week, 5 days ago


A. management groups
B. subscriptions
D. resource groups
upvoted 1 times


Question #16 Topic 2

You are designing a microservices architecture that will be hosted in an Azure Kubernetes Service (AKS) cluster. Apps that will consume the
microservices will be hosted on Azure virtual machines. The virtual machines and the AKS cluster will reside on the same virtual network.
You need to design a solution to expose the microservices to the consumer apps. The solution must meet the following requirements:
✑ Ingress access to the microservices must be restricted to a single private IP address and protected by using mutual TLS authentication.
✑ The number of incoming microservice calls must be rate-limited.
✑ Costs must be minimized.
What should you include in the solution?

A. Azure App Gateway with Azure Web Application Firewall (WAF)

B. Azure API Management Premium tier with virtual network connection

C. Azure API Management Standard tier with a service endpoint

D. Azure Front Door with Azure Web Application Firewall (WAF)

Correct Answer: B
One option is to deploy APIM (API Management) inside the cluster VNet.
The AKS cluster and the applications that consume the microservices might reside within the same VNet, hence there is no reason to expose
the cluster publicly as all API traffic will remain within the VNet. For these scenarios, you can deploy API Management into the cluster VNet. API
Management Premium tier supports
VNet deployment.
Reference:
https://docs.microsoft.com/en-us/azure/api-management/api-management-kubernetes

  bharatns 2 months ago


Is this should not be A? - Azure App Gateway with Azure Web Application Firewall (WAF)
upvoted 1 times

  arseyam 2 months ago


Azure Application Gateway doesn't have rate limiting capabilities as of now.
upvoted 3 times

  ManoAni 2 months ago


Appgateway doesn't support mutual authentication
upvoted 1 times

  xAlx 2 months ago


Answer is correct:
https://docs.microsoft.com/en-us/azure/api-management/api-management-kubernetes#option-2-install-an-ingress-controller
taking into account TLS support and traffic control
upvoted 8 times

  arseyam 2 months ago


Deploying Azure API Management in a virtual network is only available in the Premium and Developer tiers of API Management.
https://docs.microsoft.com/en-us/azure/api-management/api-management-using-with-vnet
upvoted 5 times

  Ziegler 3 weeks, 3 days ago


B is the correct answer as supported by
https://docs.microsoft.com/en-us/azure/api-management/api-management-kubernetes#option-3-deploy-apim-inside-the-cluster-vnet
upvoted 1 times

  Blaaa 2 weeks, 5 days ago


Correct answers
upvoted 1 times

  milind8451 2 weeks, 1 day ago


Right ans. VNET Integration is supported by only Developer and Premium tier of APIM.
upvoted 1 times

  glam 1 week, 4 days ago


B. Azure API Management Premium tier with virtual network connection
upvoted 1 times


Question #17 Topic 2

HOTSPOT -
A company plans to implement an HTTP-based API to support a web app. The web app allows customers to check the status of their orders.
The API must meet the following requirements:
✑ Implement Azure Functions.
✑ Provide public read-only operations.
✑ Do not allow write operations.
You need to recommend configuration options.
What should you recommend? To answer, configure the appropriate options in the dialog box in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:

Allowed authentication methods: GET only

Authorization level: Anonymous


The option is Allow Anonymous requests. This option turns on authentication and authorization in App Service, but defers authorization
decisions to your application code. For authenticated requests, App Service also passes along authentication information in the HTTP headers.
This option provides more flexibility in handling anonymous requests.
Reference:
https://docs.microsoft.com/en-us/azure/app-service/overview-authentication-authorization
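To illustrate the combination chosen above (not the actual Azure Functions runtime), here is a minimal stdlib sketch of a read-only, anonymous HTTP endpoint that accepts only GET; the order data is a placeholder:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class OrderStatusHandler(BaseHTTPRequestHandler):
    """Read-only, anonymous handler: only GET is implemented, so the base
    class answers every write verb (POST, PUT, DELETE, ...) with 501."""

    def do_GET(self):
        # No authentication check: requests are served anonymously.
        body = json.dumps({"order": "12345", "status": "shipped"}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # keep request logging quiet

def start_server(port=0):
    """Start the server on a background thread; port 0 picks a free port."""
    server = HTTPServer(("127.0.0.1", port), OrderStatusHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

Because no write verb is implemented, the "do not allow write operations" requirement is enforced structurally rather than by authorization logic.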

  bharatns 2 months ago


It seems correct
upvoted 8 times

  airairo 1 month, 3 weeks ago


It came in exam last month.
upvoted 6 times

  Blaaa 2 weeks, 5 days ago


Correct

upvoted 1 times

  milind8451 2 weeks, 1 day ago


Seems right.
upvoted 1 times

  glam 1 week, 4 days ago


Allowed authentication methods: GET only
Authorization level: Anonymous
upvoted 1 times

Question #18 Topic 2

A company named Contoso Ltd., has a single-domain Active Directory forest named contoso.com.
Contoso is preparing to migrate all workloads to Azure. Contoso wants users to use single sign-on (SSO) when they access cloud-based services
that integrate with Azure Active Directory (Azure AD).
You need to identify any objects in Active Directory that will fail to synchronize to Azure AD due to formatting issues. The solution must minimize
costs.
What should you include in the solution?

A. Azure AD Connect Health

B. Microsoft Office 365 IdFix

C. Azure Advisor

D. Password Export Server version 3.1 (PES v3.1) in Active Directory Migration Tool (ADMT)

Correct Answer: B

  xisadi3674 2 months ago


why the answer is not A
upvoted 1 times

  sogey 2 months ago


"IdFix is used to perform discovery and remediation of identity objects and their attributes in an on-premises Active Directory environment in
preparation for migration to Azure Active Directory. "
https://github.com/Microsoft/idfix
upvoted 15 times

  shashu07 1 month, 2 weeks ago


IdFix identifies errors such as duplicates and formatting problems in your Active Directory Domain Services (AD DS) domain before you synchronize
to Office 365.
upvoted 3 times

  anandpsg101 1 month, 1 week ago


correct answer
upvoted 3 times

  Blaaa 2 weeks, 5 days ago


Correct
upvoted 1 times

  milind8451 2 weeks, 1 day ago


Correct.
upvoted 1 times

  glam 1 week, 4 days ago


B. Microsoft Office 365 IdFix
upvoted 1 times


Question #19 Topic 2

DRAG DROP -
A company has an existing web application that runs on virtual machines (VMs) in Azure.
You need to ensure that the application is protected from SQL injection attempts and uses a layer-7 load balancer. The solution must minimize
disruption to the code for the existing web application.
What should you recommend? To answer, drag the appropriate values to the correct items. Each value may be used once, more than once, or not
at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:

Correct Answer:

Box 1: Azure Application Gateway


Azure Application Gateway provides an application delivery controller (ADC) as a service. It offers various layer 7 load-balancing capabilities for
your applications.
Box 2: Web Application Firewall (WAF)
Application Gateway web application firewall (WAF) protects web applications from common vulnerabilities and exploits.
This is done through rules that are defined based on the OWASP core rule sets 3.0 or 2.2.9.
There are rules that detect SQL injection attacks.
Reference:
https://docs.microsoft.com/en-us/azure/application-gateway/application-gateway-faq https://docs.microsoft.com/en-us/azure/application-
gateway/waf-overview

  uzairahm007 1 month, 4 weeks ago


discussion under
https://www.examtopics.com/discussions/microsoft/view/11672-exam-az-301-topic-2-question-25-discussion/
upvoted 1 times

  Elecktrus 1 month, 3 weeks ago


Answer is correct. Because:
1-Azure Application Gateway, Azure Load Balancer y Azure Traffic Manager are all of them Load Balancers. But they are different: Application
Gateway works in OSI layer 7 , Load Balancer works in OSI Layer 4 and Traffic Manager works with DNS. So, the only that verify the requirement
(load-balance at level 7) is Azure Application Gateway

2) Features are Web Application Firewall, SSL-Offloading and url-content routing. Obviusly, the only to protect against SQLInjections is Firewall (ssl-
offloading is to manage https request) so the answer is Web Applicaton Firewall.

By the way, there is another one Load Balancer, named Azure Front Door, that works in OSI level 7, too.
https://docs.microsoft.com/en-us/azure/architecture/guide/technology-choices/load-balancing-overview
https://docs.microsoft.com/es-es/azure/web-application-firewall/ag/ag-overview
upvoted 15 times

  andyR 1 month, 3 weeks ago


Good to also remember which LB are regional V inter - regional
upvoted 1 times

  airairo 1 month, 3 weeks ago


came in exam last month.
upvoted 6 times

  Blaaa 2 weeks, 5 days ago


Correct
upvoted 1 times

  milind8451 2 weeks, 1 day ago


Correct. Application GW will work as Layer 7 LB and WAF which is part of it, will protect from SQL Injection attack.
upvoted 1 times

  glam 1 week, 4 days ago


Box 1: Azure Application Gateway
Box 2: Web Application Firewall (WAF)
upvoted 1 times


Question #20 Topic 2

You have an Azure subscription. The subscription has a blob container that contains multiple blobs.
Ten users in the finance department of your company plan to access the blobs during the month of April.
You need to recommend a solution to enable access to the blobs during the month of April only.
Which security solution should you include in the recommendation?

A. access keys

B. conditional access policies

C. certificates

D. shared access signatures (SAS)

Correct Answer: D
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview
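The real SAS format and signing rules are defined by Azure Storage; this stdlib sketch only illustrates the underlying idea of a signed, time-boxed token whose validation fails automatically after the expiry (here, the end of April). All names and the token layout are simplifications:

```python
import base64
import hashlib
import hmac
from datetime import datetime

def make_token(account_key_b64: str, resource: str, expiry_utc: datetime) -> str:
    """Sign a resource path together with an expiry, SAS-style: whoever
    holds the token can use it until expiry, with no per-user accounts."""
    string_to_sign = f"{resource}\n{expiry_utc.isoformat()}"
    key = base64.b64decode(account_key_b64)
    sig = hmac.new(key, string_to_sign.encode(), hashlib.sha256).digest()
    return f"se={expiry_utc.isoformat()}&sig={base64.b64encode(sig).decode()}"

def is_valid(token: str, account_key_b64: str, resource: str, now_utc: datetime) -> bool:
    """Recompute the signature and reject expired or tampered tokens."""
    fields = dict(part.split("=", 1) for part in token.split("&"))
    expiry = datetime.fromisoformat(fields["se"])
    if now_utc > expiry:
        return False  # e.g. any request after 30 April is rejected
    expected = make_token(account_key_b64, resource, expiry)
    return hmac.compare_digest(token, expected)
```

This is why SAS fits the question: the ten users get a URL that simply stops working on 1 May, with no cleanup step required.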

  Prabakar 2 months ago


This can also the archived with conditional access policies ,Correct me if I am wrong
upvoted 1 times

  Spaceman1 2 months ago


No, it should definitely be SAS. You can limit the data access via start time or expiry time.
upvoted 5 times

  quyennv 2 months ago


your wrong: 'plan to access the blobs during the month of April' SAS allow you to define the duration time
upvoted 8 times

  PrashantGupta1616 1 month, 1 week ago


Exactly, by SAS access duration can be set so after that access will revoke automatically.
upvoted 1 times

  KumarPV 1 month, 3 weeks ago


SAS is the best option
upvoted 5 times

  zic04 1 month, 2 weeks ago


D correct
upvoted 2 times

  Blaaa 2 weeks, 5 days ago


Correct
upvoted 1 times

  milind8451 2 weeks, 1 day ago


Only SAS has capability to grant limited/timely or granular access of a storage account, so the answer is correct.
upvoted 2 times

  glam 1 week, 4 days ago


D. shared access signatures (SAS)
upvoted 1 times

  JohnWick2020 3 days, 2 hours ago


Answer is D. Shared Access Signatures.
This allows for limited-time fine grained access control to resources. So you can generate URL, specify duration (for month of April) and
disseminate URL to 10 team members. On May 1, the SAS token is automatically invalidated, denying team members continued access.
upvoted 1 times


Question #21 Topic 2

HOTSPOT -
You plan to deploy an Azure web app named App1 that will use Azure Active Directory (Azure AD) authentication.
App1 will be accessed from the internet by the users at your company. All the users have computers that run Windows 10 and are joined to Azure
AD.
You need to recommend a solution to ensure that the users can connect to App1 without being prompted for authentication and can access App1
only from company-owned computers.
What should you recommend for each requirement? To answer, select the appropriate options in the answer area.
Hot Area:

Correct Answer:

Box 1: An Azure AD app registration


Azure Active Directory (Azure AD) provides cloud-based directory and identity management services. You can use Azure AD to manage the users
of your application and authenticate access to your applications by using Azure AD.
You register your application with an Azure AD tenant.
Box 2: A conditional access policy
Conditional Access policies at their simplest are if-then statements: if a user wants to access a resource, then they must complete an action.
By using Conditional Access policies, you can apply the right access controls when needed to keep your organization secure and stay out of
your user's way when not needed.


Reference:
https://codingcanvas.com/using-azure-active-directory-authentication-in-your-web-application/ https://docs.microsoft.com/en-us/azure/active-
directory/conditional-access/overview

  mmmore 2 months ago


Seems correct.
upvoted 6 times

  milind8451 2 weeks, 1 day ago


Right ans.
upvoted 1 times

  glam 1 week, 4 days ago


Box 1: An Azure AD app registration
Box 2: A conditional access policy
upvoted 1 times


Question #22 Topic 2

HOTSPOT -
You plan to create an Azure environment that will contain a root management group and 10 child management groups. Each child management
group will contain five Azure subscriptions. You plan to have between 10 and 30 resource groups in each subscription.
You need to design an Azure governance solution. The solution must meet the following requirements:
✑ Use Azure Blueprints to control governance across all the subscriptions and resource groups.
✑ Ensure that Blueprints-based configurations are consistent across all the subscriptions and resource groups.
✑ Minimize the number of blueprint definitions and assignments.
What should you include in the solution? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:

Box 1: The root management group


When creating a blueprint definition, you'll define where the blueprint is saved. Blueprints can be saved to a management group or subscription
that you have
Contributor access to. If the location is a management group, the blueprint is available to assign to any child subscription of that management
group.
Box 2: The root management group
Each directory is given a single top-level management group called the "Root" management group. This root management group is built into the
hierarchy to have all management groups and subscriptions fold up to it. This root management group allows for global policies and Azure role
assignments to be applied at the directory level.
Each Published Version of a blueprint can be assigned to an existing management group or subscription.
Reference:
https://docs.microsoft.com/en-us/azure/governance/blueprints/overview https://docs.microsoft.com/en-us/azure/governance/management-
groups/overview

  mmmore 2 months ago


Seems correct. Although I always though a Blueprint assignment was done on the subscription-level. Reading the docs, it seems to can assign to a
management group using Ann Azure REST API.
upvoted 4 times

  tmurfet 1 month, 3 weeks ago


"Level at which to *create* the blueprint assignments." -- misleading question easily overlooked, as I did at first.
upvoted 2 times

  airairo 1 month, 3 weeks ago


came in exam last month.
upvoted 3 times

  olivier_s 1 month, 2 weeks ago


Actually precising the subscriptionid in the property.scope looks required for management group level assignments:
properties.scope : The target subscription scope of the blueprint assignment (format: '/subscriptions/{subscriptionId}'). For management group
level assignments, the property is required.
ref: https://docs.microsoft.com/en-us/rest/api/blueprints/assignments/createorupdate
upvoted 1 times

  Blaaa 2 weeks, 5 days ago


Correct
upvoted 1 times

  milind8451 2 weeks, 1 day ago


Right Ans. Blueprints can be defined in a management group or subscription. If it is defined in management group, the blueprint is available to
assign to any child subscription of that management group. Since Root management group contains all subsciptions so blueprint shuld be defined
and assigned to Root Mgmt group.
upvoted 1 times

  glam 1 week, 4 days ago


Box 1: The root management group
Box 2: The root management group
upvoted 1 times

  EgorAivazov 1 week, 3 days ago


Disagree with answer #2.

Answer 1: The root management group (correct)


Answer 2: The subscriptions

Explanation: When creating a blueprint definition, you'll define where the blueprint is saved. Blueprints can be saved to a management group or
subscription that you have Contributor access to. If the location is a management group, the blueprint is available to assign to any child
subscription of that management group.

Since question #2 clearly mentions the scope of assignments, it should be on the subscription level.
upvoted 8 times

  tiny ame 1 week, 2 days ago


#1 Management Group #2 Subscription
No blueprints are assigned to management groups. Blueprints are assigned to the subscription.
upvoted 1 times

  MichaelCWWong 1 week ago


Assigning a blueprint definition to a management group means the assignment object exists at the management group. The deployment of
artifacts still targets a subscription.

https://docs.microsoft.com/en-us/azure/governance/blueprints/overview#blueprint-assignment
upvoted 1 times

  EgorAivazov 2 days, 23 hours ago


Correct answer should be:
1. Root management group
2. The subscriptions

Reference: https://docs.microsoft.com/en-us/azure/governance/blueprints/create-blueprint-portal

Assign a blueprint
After a blueprint has been published, it can be assigned to a subscription. Assign the blueprint that you created to one of the subscriptions under
your management group hierarchy. If the blueprint is saved to a subscription, it can only be assigned to that subscription.
upvoted 1 times


Question #23 Topic 2

You have an Azure subscription.


You need to recommend a solution to provide developers with the ability to provision Azure virtual machines. The solution must meet the
following requirements:
✑ Only allow the creation of the virtual machines in specific regions.
✑ Only allow the creation of specific sizes of virtual machines.
What should you include in the recommendation?

A. Azure Resource Manager templates

B. Azure Policy

C. conditional access policies

D. role-based access control (RBAC)

Correct Answer: B

  idris 2 months ago


Correct
https://docs.microsoft.com/en-us/azure/cloud-adoption-framework/manage/azure-server-management/common-policies#restrict-vm-size
upvoted 9 times

  xAlx 2 months ago


Correct
upvoted 4 times

  kopper2019 1 month, 1 week ago


correct answer
upvoted 1 times

  Blaaa 2 weeks, 5 days ago


Correct
upvoted 1 times

  milind8451 2 weeks, 1 day ago


Right ans.
upvoted 1 times

  glam 1 week, 4 days ago


B. Azure Policy
upvoted 1 times


Question #24 Topic 2

Your company has the offices shown in the following table.

The network contains an Active Directory domain named contoso.com that is synced to Azure Active Directory (Azure AD).
All users connect to Exchange Online.
You need to recommend a solution to ensure that all the users use Azure Multi-Factor Authentication (MFA) to connect to Exchange Online from
one of the offices.
What should you include in the recommendation?

A. a virtual network and two Microsoft Cloud App Security policies

B. a named location and two Microsoft Cloud App Security policies

C. a conditional access policy and two virtual networks

D. a conditional access policy and two named locations

Correct Answer: D
Conditional Access policies are at their most basic an if-then statement combining signals, to make decisions, and enforce organization
policies. One of those signals that can be incorporated into the decision-making process is network location.
Reference:
https://docs.microsoft.com/en-us/azure/active-directory/conditional-access/location-condition#named-locations
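As a sketch of the underlying idea (the office names and IP ranges below are hypothetical placeholders, not the values from the question's table), matching a sign-in IP against named-location ranges looks like this:

```python
import ipaddress

# Hypothetical office egress ranges that would be registered as
# named locations in Azure AD (two offices -> two named locations).
NAMED_LOCATIONS = {
    "Office1": ipaddress.ip_network("131.107.0.0/16"),
    "Office2": ipaddress.ip_network("151.250.0.0/16"),
}

def requires_mfa(sign_in_ip: str) -> bool:
    """Conditional Access sketch: require MFA only when the sign-in
    originates from one of the named office locations."""
    addr = ipaddress.ip_address(sign_in_ip)
    return any(addr in net for net in NAMED_LOCATIONS.values())
```

The real evaluation is done by Azure AD's Conditional Access engine, which uses the named locations as a signal in the policy's conditions.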

  quyennv 2 months ago


Correct anwser
upvoted 11 times

  airairo 1 month, 3 weeks ago


why not "C. a conditional access policy and two virtual networks"??
upvoted 1 times

  bingomutant 1 month, 3 weeks ago


the office machines are not located on virtual networks - they are on physical NW address
upvoted 3 times

  gisbern 1 month, 3 weeks ago


Users are connecting to EXO from on-premises trusted (named) locations. Their machines are not stored in Azure virtual network to be targeted
by CA policies.
upvoted 1 times

  kopper2019 1 month, 1 week ago


correct answer
upvoted 1 times

  sanketshah 1 month ago


Correct answer
upvoted 1 times

  milind8451 2 weeks, 1 day ago


Right ans. Named location should be defined in Azure AD and then use Conditional access policy.
upvoted 1 times

  alphamode 2 weeks ago


Correct answer
upvoted 1 times

  glam 1 week, 4 days ago


D. a conditional access policy and two named locations
upvoted 1 times


Question #25 Topic 2

HOTSPOT -
Your organization has developed and deployed several Azure App Service Web and API applications. The applications use Azure Key Vault to store
several authentication, storage account, and data encryption keys. Several departments have the following requests to support the applications:

You need to recommend the appropriate Azure service for each department request.
What should you recommend? To answer, configure the appropriate options in the dialog box in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:


  mmmore 2 months ago


Correct
upvoted 13 times

  Rayrichi 1 month, 2 weeks ago


Correct. Discussion under: https://www.examtopics.com/exams/microsoft/az-301/view/6/
upvoted 3 times

  sanketshah 1 month ago


correct answer.
upvoted 1 times

  Lexa 3 weeks, 6 days ago


QA section confused a little, but it's correct: "Provide just-in-time privileged access to Azure AD and Azure resources"
https://docs.microsoft.com/en-us/azure/active-directory/privileged-identity-management/pim-configure#what-does-it-do
upvoted 1 times

  Blaaa 2 weeks, 5 days ago


Correct
upvoted 1 times

  milind8451 2 weeks, 1 day ago


Right ans.
upvoted 1 times

  alphamode 2 weeks ago


100% correct answer
upvoted 1 times

  glam 1 week, 4 days ago


Azure AD Privileged Identity Management
Azure Managed Service identities
Azure AD Privileged Identity Management
upvoted 1 times


Question #26 Topic 2

Your network contains an on-premises Active Directory forest.


You discover that when users change jobs within your company, the memberships of the user groups are not being updated. As a result, the users
can access resources that are no longer relevant to their job.
You plan to integrate Active Directory and Azure Active Directory (Azure AD) by using Azure AD Connect.
You need to recommend a solution to ensure that group owners are emailed monthly about the group memberships they manage.
What should you include in the recommendation?

A. Azure AD Identity Protection

B. Azure AD access reviews

C. Tenant Restrictions

D. conditional access policies

Correct Answer: B
Reference:
https://docs.microsoft.com/en-us/azure/active-directory/governance/access-reviews-overview

  gcpjay 1 month, 3 weeks ago


Answer is correct.
upvoted 9 times

  JustDiscussing 1 month, 3 weeks ago


in exam this week
upvoted 4 times

  milind8451 2 weeks, 1 day ago


Right ans.
upvoted 1 times

  alphamode 2 weeks ago


Correct answer
upvoted 1 times

  glam 1 week, 4 days ago


B. Azure AD access reviews
upvoted 1 times


Question #27 Topic 2

HOTSPOT -
You have five .NET Core applications that run on 10 Azure virtual machines in the same subscription.
You need to recommend a solution to ensure that the applications can authenticate by using the same Azure Active Directory (Azure AD) identity.
The solution must meet the following requirements:
✑ Ensure that the applications can authenticate only when running on the 10 virtual machines.
✑ Minimize administrative effort.
What should you include in the recommendation? To answer, select the appropriate options in the answer area.
Hot Area:

Correct Answer:

Box 1: Create a system-assigned Managed Identities for Azure resource


The managed identities for Azure resources feature in Azure Active Directory (Azure AD) provides Azure services with an automatically
managed identity in Azure AD. You can use the identity to authenticate to any service that supports Azure AD authentication, including Key
Vault, without any credentials in your code.
A system-assigned managed identity is enabled directly on an Azure service instance. When the identity is enabled, Azure creates an identity for
the instance in the Azure AD tenant that's trusted by the subscription of the instance. After the identity is created, the credentials are
provisioned onto the instance.
Box 2: An Azure Instance Metadata Service Identity
See steps 3 and 5 below.

How a system-assigned managed identity works with an Azure VM


1. Azure Resource Manager receives a request to enable the system-assigned managed identity on a VM.
2. Azure Resource Manager creates a service principal in Azure AD for the identity of the VM. The service principal is created in the Azure AD
tenant that's trusted by the subscription.
3. Azure Resource Manager configures the identity on the VM by updating the Azure Instance Metadata Service identity endpoint with the
service principal client
ID and certificate.
4. After the VM has an identity, use the service principal information to grant the VM access to Azure resources. To call Azure Resource
Manager, use role-based access control (RBAC) in Azure AD to assign the appropriate role to the VM service principal. To call Key Vault, grant
your code access to the specific secret or key in Key Vault.
5. Your code that's running on the VM can request a token from the Azure Instance Metadata service endpoint, accessible only from within the
VM
Reference:
https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/overview
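The token request in step 5 can be sketched with the standard library. The endpoint, API version, and Metadata header below follow the documented IMDS conventions; the optional client_id (a placeholder GUID here) selects a user-assigned identity, and no network call is actually made in this sketch:

```python
from urllib.parse import urlencode

# Azure Instance Metadata Service (IMDS) token endpoint -- a fixed,
# link-local address reachable only from inside the VM itself.
IMDS_TOKEN_ENDPOINT = "http://169.254.169.254/metadata/identity/oauth2/token"

def build_imds_token_request(resource, client_id=None, api_version="2018-02-01"):
    """Return the URL and headers for an IMDS managed-identity token request.

    For a user-assigned identity, client_id selects which identity to use;
    omit it to use the VM's system-assigned identity.
    """
    params = {"api-version": api_version, "resource": resource}
    if client_id:
        params["client_id"] = client_id  # user-assigned identity
    url = f"{IMDS_TOKEN_ENDPOINT}?{urlencode(params)}"
    # IMDS requires this header and rejects requests that lack it.
    headers = {"Metadata": "true"}
    return url, headers

# Example: request a token for Azure Key Vault (GUID is a placeholder).
url, headers = build_imds_token_request(
    "https://vault.azure.net",
    client_id="00000000-0000-0000-0000-000000000000",
)
```

Because the endpoint is only reachable from inside the VM, this is also what enforces the "authenticate only when running on the 10 virtual machines" requirement.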

  andyR 2 months ago


should be user assigned MI
upvoted 25 times

  mmmore 2 months ago


I disagree, it should be System Assigned Managed Identity. The question states that the authentication may only take place when the app runs
on the 10 VMs. In other words, authentication may not happen if the app runs on another VM.

A user-assigigned identity could, in theory, be assigned to VMs, other than the 10 authorised VM. In my opinion, this question is specifically
searching for a solution to make sure that the authentication cannot take place on any other machine. With a system identity, you fulfil this
requirement, not with a user identity.
upvoted 14 times

  andyR 2 months ago


Minimize administrative effort - a system-assigned for each VM x 10 or a single user-assigned that can be shared?
upvoted 2 times

  idris 2 months ago


I agree, as the same identity needs to be shared across resources
https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/overview#managed-identity-types
upvoted 4 times

  MaxBlanche 2 months ago


I agree
upvoted 3 times

  Haykhay 1 month, 4 weeks ago


I think the main point considered is that only 10 VMs should be able allowed to have access, With User MI more VMs can have access while System
MI will only allow the specifc VM's created to have access
upvoted 1 times

  bbartek 3 weeks ago


But 10 separate SMIs, can't be considered as *same* identity.
upvoted 2 times

  tmurfet 1 month, 3 weeks ago


I don't think that the given answer is correct. As mentioned it would mean creating 10 system identities. But also there is the wording: "To
*authenticate*, request a token by using..."
OAuth2 is used for authorization, not authentication. To authenticate you need an ID token from an Azure AD v2.0 endpoint. See:
https://docs.microsoft.com/en-us/azure/active-directory/develop/active-directory-v2-protocols
upvoted 2 times

  tmurfet 1 month, 3 weeks ago


Ignore my previous. Since it is a managed identity, it is already authenticated via IMDS when it requests an access token so given answers are
correct. See steps 5 and 6: https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/how-managed-
identities-work-vm
upvoted 4 times

  SandroAndrade 1 month, 2 weeks ago


User-assigned managed identity > Can be shared
The same user-assigned managed identity can be associated with more than one Azure resource.
https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/overview
upvoted 1 times

  arseyam 1 month, 2 weeks ago


The question says "using the same Azure Active Directory (Azure AD) identity." therefore the answer should be User-Assigned Managed Identity

upvoted 6 times

  bingomutant 1 month, 1 week ago


agreed - the wording is driving at a shared identity - so its user not system
upvoted 4 times

  milind8451 2 weeks, 1 day ago


The first one should be a user-assigned MI because it reduces administrative effort: the same identity can be used for all 10 VMs, whereas a
system-assigned MI will need 10 identities, one for each VM.
upvoted 4 times

  alphamode 2 weeks ago


The first part should be a user-assigned managed identity. @mmmore, in my humble opinion, it is a discretionary decision whether or not to
assign that user-assigned managed identity to non-authorized VMs (beyond the 10 authorized VMs). Choosing a user-assigned managed identity
at least meets the solution requirement, which is not the case with system-assigned managed identities: those are unique to each system and
may not even fulfill the basic criterion of using a common identity across all 10 VMs.
upvoted 1 times

  glam 1 week, 4 days ago


Box 1: Create a user-assigned Managed Identities for Azure resource
Box 2: An Azure Instance Metadata Service Identity
upvoted 2 times


Question #28 Topic 2

You have an Azure Active Directory (Azure AD) tenant named contoso.com that contains two administrative user accounts named Admin1 and
Admin2.
You create two Azure virtual machines named VM1 and VM2.
You need to ensure that Admin1 and Admin2 are notified when more than five events are added to the security log of VM1 or VM2 during a period
of 120 seconds.
The solution must minimize administrative tasks.
What should you create?

A. two action groups and two alert rules

B. one action group and one alert rule

C. five action groups and one alert rule

D. two action groups and one alert rule

Correct Answer: B
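The condition the alert rule evaluates - more than five events within a 120-second window - can be sketched in Python. This is only an illustration of the threshold semantics; the function name, timestamps, and sliding-window approach are made up for the example and are not tied to any Azure API.

```python
def should_alert(event_times: list[float], window: float = 120.0, threshold: int = 5) -> bool:
    """Return True if more than `threshold` events fall inside any sliding
    window of `window` seconds - the condition the single alert rule
    evaluates against the security log of each targeted VM."""
    times = sorted(event_times)
    start = 0
    for end in range(len(times)):
        # Shrink the window from the left until it spans <= `window` seconds.
        while times[end] - times[start] > window:
            start += 1
        if end - start + 1 > threshold:
            return True
    return False

# Six events within 120 s -> alert; five events -> no alert.
print(should_alert([0, 10, 20, 30, 40, 50]))   # True
print(should_alert([0, 30, 60, 90, 119]))      # False
```

In Azure terms, one alert rule scoped to both VMs evaluates this condition, and one action group holding both admins' email addresses handles the notification.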

  mmmore 2 months ago


Seems correct.
upvoted 11 times

  uzairahm007 1 month, 4 weeks ago


would we not need an alert rule for each VM?
upvoted 2 times

  Elecktrus 1 month, 4 weeks ago


No, in the alert rule you can define a filter that applies to both VMs, for example a regexp on the machine name
upvoted 4 times

  kopper2019 1 month, 1 week ago


correct answer
upvoted 1 times

  MadEgg 2 weeks, 1 day ago


I think we need two alert rules, one for each VM.
Of course we can have one alert rule which triggers if there are more than five accumulated security events from VM1 and VM2 combined. If
doing so, it would trigger if there are, for example, two events from VM1 and three from VM2 - I don't think that this is the requested solution.

We need two alert rules to trigger if there are five security events at VM1 or five security events at VM2 - in my opinion, that's the solution
requested in the question.
upvoted 1 times

  milind8451 2 weeks, 1 day ago


Seems right: you can choose the 2 VMs in the scope of 1 alert rule, then set the security-log condition, and create an action group to notify
both admins.
upvoted 1 times

  alphamode 2 weeks ago


Option B is correct. An alert rule can have multiple VMs as target. Also mentioned here at Azure documentation:
https://docs.microsoft.com/en-us/azure/azure-monitor/platform/alerts-overview

The following are key attributes of an alert rule:

Target Resource - Defines the scope and signals available for alerting. A target can be any Azure resource. Example targets:

Virtual machines.
Storage accounts.
Log Analytics workspace.
Application Insights.
For certain resources (like virtual machines), you can specify multiple resources as the target of the alert rule.

Also, as per another Azure documentation - Various alerts may use the same action group or different action groups depending on the user's
requirements.

In summary, you just need 1 alert rule and 1 action group, and thus option B seems to be correct.
https://docs.microsoft.com/en-us/azure/azure-monitor/platform/action-groups
upvoted 1 times



 glam 1 week, 4 days ago


B. one action group and one alert rule
upvoted 1 times

Question #29 Topic 2

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that
might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Active Directory (Azure AD) tenant named contoso.com. The tenant contains a group named Group1. Group1 contains all the
administrative user accounts.
You discover several login attempts to the Azure portal from countries where administrative users do NOT work.
You need to ensure that all login attempts to the Azure portal from those countries require Azure Multi-Factor Authentication (MFA).
Solution: Create an Access Review for Group1.
Does this solution meet the goal?

A. Yes

B. No

Correct Answer: B
Instead implement Azure AD Privileged Identity Management.
Note: Azure Active Directory (Azure AD) Privileged Identity Management (PIM) is a service that enables you to manage, control, and monitor
access to important resources in your organization.
Reference:
https://docs.microsoft.com/en-us/azure/active-directory/privileged-identity-management/pim-configure

  KumarPV 1 month, 3 weeks ago


Agreed
upvoted 6 times

  shashu07 1 month, 2 weeks ago


Enable Conditional Access Policy
upvoted 4 times

  kopper2019 1 month, 1 week ago


Enable Conditional Access Policy as well
upvoted 1 times

  airairo 2 weeks, 4 days ago


came in the exam last month.
upvoted 1 times

  glam 1 week, 4 days ago


B. No ..........
upvoted 1 times


Question #30 Topic 2

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that
might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Active Directory (Azure AD) tenant named contoso.com. The tenant contains a group named Group1. Group1 contains all the
administrative user accounts.
You discover several login attempts to the Azure portal from countries where administrative users do NOT work.
You need to ensure that all login attempts to the Azure portal from those countries require Azure Multi-Factor Authentication (MFA).
Solution: Implement Azure AD Identity Protection for Group1.
Does this solution meet the goal?

A. Yes

B. No

Correct Answer: B
Implement Azure AD Privileged Identity Management for everyone.
Note: Azure Active Directory (Azure AD) Privileged Identity Management (PIM) is a service that enables you to manage, control, and monitor
access to important resources in your organization.
Reference:
https://docs.microsoft.com/en-us/azure/active-directory/privileged-identity-management/pim-configure

  idris 2 months ago


I would have said Yes.
PIM is for access rights elevation, whereas Identity Protection is closely hooked to Conditional access for forcing MFA
https://docs.microsoft.com/en-us/azure/active-directory/identity-protection/overview-identity-protection#risk-detection-and-remediation
upvoted 7 times

  mmmore 2 months ago


Agreed
upvoted 1 times

  devianter81 2 months ago


correct. identity protection is not required. it is achieved using conditional access policy with named locations with type 'countries"
upvoted 3 times

  olivier_s 1 month, 2 weeks ago


would say yes : unusual login pattern detection with MFA policy remediation
https://docs.microsoft.com/en-us/azure/active-directory/identity-protection/concept-identity-protection-policies
upvoted 2 times

  jhoomtv 1 month, 1 week ago


but it's only for new users?
upvoted 1 times

  arseyam 1 month, 2 weeks ago


Identity Protection and in particular the sign-in risk policy will mainly target this kind of scenarios so I would say yes.
upvoted 3 times

  bingomutant 1 month, 1 week ago


no - read the question - no point imposing MFA on Group 1 - it does not say the login attempts are from Group 1 members - those attempts are
from certain countries - not Group 1 -
upvoted 3 times

  ihustle 1 month, 1 week ago


Answer is Yes.
upvoted 1 times

  Othermike 1 month, 1 week ago


The answer is no, because you can't make any rules in Identity Protection that require MFA for the Azure portal, and you can't add the location
either. I think we should use a Conditional Access policy to solve this problem. I am 100% sure that the answer is no.
upvoted 3 times

  esatu 1 week, 6 days ago


The MFA Registration Policy in Identity Protection can be used to require MFA, and risky sign-ins allow entering trusted IPs/locations. You can
check it in the portal. I think the answer is yes.


upvoted 1 times

  megaal 3 weeks, 1 day ago


answer is NO, you need Conditional Access for that
upvoted 2 times

  glam 1 week, 1 day ago


B. No ......
upvoted 1 times

Question #31 Topic 2

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that
might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Active Directory (Azure AD) tenant named contoso.com. The tenant contains a group named Group1. Group1 contains all the
administrative user accounts.
You discover several login attempts to the Azure portal from countries where administrative users do NOT work.
You need to ensure that all login attempts to the Azure portal from those countries require Azure Multi-Factor Authentication (MFA).
Solution: You implement an access package.
Does this meet the goal?

A. Yes

B. No

Correct Answer: B
Instead implement Azure AD Privileged Identity Management.
Note: Azure Active Directory (Azure AD) Privileged Identity Management (PIM) is a service that enables you to manage, control, and monitor
access to important resources in your organization.
Reference:
https://docs.microsoft.com/en-us/azure/active-directory/privileged-identity-management/pim-configure
Design Data Storage

  KumarPV 1 month, 3 weeks ago


Agreed
upvoted 4 times

  glam 1 week, 4 days ago


B. No ............................................
upvoted 1 times

Topic 3 - Question Set 3


Question #1 Topic 3

You have 100 servers that run Windows Server 2012 R2 and host Microsoft SQL Server 2014 instances. The instances host databases that have
the following characteristics:
✑ The largest database is currently 3 TB. None of the databases will ever exceed 4 TB.
✑ Stored procedures are implemented by using CLR.
You plan to move all the data from SQL Server to Azure.
You need to recommend an Azure service to host the databases. The solution must meet the following requirements:
✑ Whenever possible, minimize management overhead for the migrated databases.
✑ Minimize the number of database changes required to facilitate the migration.
✑ Ensure that users can authenticate by using their Active Directory credentials.
What should you include in the recommendation?

A. Azure SQL Database elastic pools

B. Azure SQL Database Managed Instance

C. Azure SQL Database single databases

D. SQL Server 2016 on Azure virtual machines

Correct Answer: B
Reference:
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance

  Tombarc 5 months, 1 week ago


B is the correct answer. Azure SQL DB does not support CLR stored procedure:

ref: https://docs.microsoft.com/en-gb/azure/azure-sql/database/transact-sql-tsql-differences-sql-server#transact-sql-syntax-not-supported-in-
azure-sql-database
ref: https://docs.microsoft.com/en-us/azure/azure-sql/managed-instance/transact-sql-tsql-differences-sql-server#clr
upvoted 11 times

  Schandra 3 months ago


B is correct
upvoted 3 times

  pentum7 2 months, 4 weeks ago


Correct:

SQL Managed Instance allows existing SQL Server customers to lift and shift their on-premises applications to the cloud with minimal application
and database changes. At the same time, SQL Managed Instance preserves all PaaS capabilities (automatic patching and version updates,
automated backups, high availability) that drastically reduce management overhead and TCO.
upvoted 9 times

  gcpjay 1 month, 3 weeks ago


Given answer is correct. Azure SQL DB does not support CLR stored procedure only Azure SQL Managed Instance does.
https://docs.microsoft.com/en-us/azure/azure-sql/database/features-comparison.
upvoted 2 times

  JustDiscussing 1 month, 3 weeks ago


in exam this week
upvoted 1 times

  milind8451 2 weeks ago


CLR is support by SQL Managed instance only. Ans is correct.
upvoted 1 times

  glam 1 week, 3 days ago


B. Azure SQL Database Managed Instance
upvoted 1 times


Question #2 Topic 3

You are designing an order processing system in Azure that will contain the Azure resources shown in the following table.

The order processing system will have the following transaction flow:
✑ A customer will place an order by using App1.
✑ When the order is received, App1 will generate a message to check for product availability at vendor 1 and vendor 2.
✑ An integration component will process the message, and then trigger either Function1 or Function2 depending on the type of order.
✑ Once a vendor confirms the product availability, a status message for App1 will be generated by Function1 or Function2.
✑ All the steps of the transaction will be logged to storage1.
Which type of resource should you recommend for the integration component?

A. an Azure Data Factory pipeline

B. an Azure Service Bus queue

C. an Azure Event Grid domain

D. an Azure Event Hubs capture

Correct Answer: A
A data factory can have one or more pipelines. A pipeline is a logical grouping of activities that together perform a task.
The activities in a pipeline define actions to perform on your data.
Data Factory has three groupings of activities: data movement activities, data transformation activities, and control activities.
Azure Functions is now integrated with Azure Data Factory, allowing you to run an Azure function as a step in your data factory pipelines.
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/concepts-pipelines-activities
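Whichever service is chosen for the integration component, the routing step it performs can be sketched in Python. The message shape, the `standard`/`express` order types, and the handler functions below are all hypothetical stand-ins for Function1 and Function2, used only to illustrate "trigger either Function1 or Function2 depending on the type of order".

```python
import json

def function1(order: dict) -> str:
    # Placeholder for the Azure Function that checks availability at vendor 1.
    return f"vendor1 checked availability for order {order['id']}"

def function2(order: dict) -> str:
    # Placeholder for the Azure Function that checks availability at vendor 2.
    return f"vendor2 checked availability for order {order['id']}"

# Order type -> handler, mirroring "trigger either Function1 or Function2
# depending on the type of order". The type names are invented here.
ROUTES = {"standard": function1, "express": function2}

def process_message(body: str) -> str:
    """Integration step: parse the message App1 generated and dispatch it."""
    order = json.loads(body)
    handler = ROUTES.get(order["type"])
    if handler is None:
        raise ValueError(f"unknown order type: {order['type']}")
    return handler(order)

print(process_message('{"id": 42, "type": "express"}'))
```

With a Service Bus topic this dispatch would typically be done by subscription filters rather than application code; with Data Factory it would be an If Condition activity feeding two Azure Function activities.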

  MaxBlanche 2 months ago


B would be a more appropriate answer
upvoted 21 times

  mmmore 2 months ago


Agreed. I believe this question wants us to recognise that messages need to be able to be sent between different application components. A
service bus will do this. The actual orchestration via Data Factory doesn't make a lot of sense. Something like Durable (Azure) Functions would
be more suited for that.
upvoted 3 times

  andyR 2 months ago


agreed
upvoted 3 times

  Abbas 2 months ago


Service Bus is a brokered messaging system. It stores messages in a "broker" (for example, a queue) until the consuming party is ready to receive
them, so there could be a delay in getting the response, which defeats the purpose of this live user-interaction scenario where polling could
delay the response. It will wait for a function to pick up the message, both functions would need to pick up the same message to check
availability with the two different vendors, and once a message is consumed it is no longer available for others to consume. Hence, Event Grid
is the right solution.
upvoted 3 times

  GenXCoder 1 month, 4 weeks ago


There is no delay with Service Bus unless the receiving service is backed up. The question is about messages, not events -> both event hub and
event grid are out.
upvoted 4 times

  uzairahm007 1 month, 4 weeks ago


look at the definition of Event Grid how would it help in this scenario?
Event Grid is an eventing backplane that enables event-driven, reactive programming. It uses a publish-subscribe model. Publishers emit events,
but have no expectation about which events are handled. Subscribers decide which events they want to handle.
upvoted 1 times

  uzairahm007 1 month, 4 weeks ago


Look at the question carefully; it states:

An integration component will **process** the message, and then trigger either Function1 or Function2 depending on the type of order.
The keyword is "process the message" - neither Service Bus nor Event Grid provides that functionality.
It is not mentioned where the message would be stored, but Azure Data Factory can integrate with various platforms, pick up messages,
and based on that fire either Function1 or Function2 depending on the order type.
upvoted 19 times

  arseyam 1 month, 1 week ago


Microsoft Azure Service Bus is a fully managed enterprise message broker with message queues and public-subscribe topics. Service Bus is used
to decouple applications and services from each other, providing the following benefits:
Load-balancing work across competing workers
Safely routing and transferring data and control across service and application boundaries
Coordinating transactional work that requires a high-degree of reliability
In the question
When the order is received, App1 will generate a message to check for product availability at vendor 1 and vendor 2.

With Topics the Publisher sends a message to a topic and one or more subscribers receive a copy of the message, depending on filter rules set
on these subscriptions.

https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-queues-topics-subscriptions#topics-and-subscriptions
upvoted 1 times

  bob2007 1 month, 2 weeks ago


DF is more to copy data https://docs.microsoft.com/en-us/azure/data-factory/control-flow-azure-function-activity SB is for messaging
https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-service-bus
upvoted 3 times

  shashu07 1 month, 2 weeks ago


SERVICE BUS
Since vendors are involved, it's a distributed platform, ideal for Service Bus.
The integration component is Service Bus, as it can receive the message and trigger the functions as well, according to the post below:
Use the Service Bus trigger to respond to messages from a Service Bus queue or topic. Starting with extension version 3.1.0, you can trigger on a
session-enabled queue or topic
Azure Service Bus trigger for Azure Functions
https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-service-bus-trigger?tabs=csharp
upvoted 3 times

  ihustle 1 month, 1 week ago


Service Bus Queue is the right answer
upvoted 2 times

  sanketshah 1 month ago


given answer is correct.
upvoted 1 times

  kirsie 1 month ago


Service bus Queue: https://docs.microsoft.com/pl-pl/azure/event-grid/compare-messaging-services
upvoted 1 times

  SyntaxError 4 weeks, 1 day ago


This also discussed here:
https://www.examtopics.com/discussions/microsoft/view/30305-exam-az-304-topic-3-question-2-discussion/
upvoted 1 times

  bipin24x7 3 weeks, 5 days ago


It is "B - Service Bus". Microsoft Documentation says "Transfer business data, such as sales or purchase orders, journals, or inventory movements."
https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-messaging-overview
upvoted 1 times

  Ziegler 3 weeks, 3 days ago


B is the correct answer

https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-messaging-overview
upvoted 1 times

  nexnexnex 3 weeks, 1 day ago


it should be another Function (or App Service) + Service Bus with 1 incoming queue and optionally pub/sub for F1 and F2

pretty weird wording in question and answers


upvoted 1 times

  nohaph 3 weeks ago


Correct Answer is B --> Az Service Bus Queue because it is a secure message broker.
Reference --> https://docs.microsoft.com/en-us/azure/architecture/reference-architectures/enterprise-integration/queues-events
upvoted 1 times

  stylocool 2 weeks, 1 day ago


B is correct because service bus can trigger functions


https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-service-bus-trigger?tabs=csharp
upvoted 1 times

  milind8451 2 weeks ago


ADF is ETL tool but Service Bus is used for Integration in application. I would choose Service bus as answer.
upvoted 1 times

  Haritosh 1 week, 5 days ago


ASBQ is the correct answer. Data Factory is for entirely different purpose.
upvoted 1 times

  alphamode 1 week, 4 days ago


Option B - Service Bus is closest match
upvoted 1 times

  FK2974 1 week, 4 days ago


A is correct; I had two other exam dumps and both have A: Actual Tests and Exams Boost!
upvoted 1 times

  glam 1 week, 3 days ago


B. an Azure Service Bus queue
upvoted 1 times

  MichaelCWWong 1 week ago


The provided answer (ADF pipeline) could be correct per below example.
https://www.sqlservercentral.com/blogs/communication-in-azure-using-data-factory-to-send-messages-to-azure-service-bus
upvoted 1 times

  JohnWick2020 2 days, 22 hours ago


Answer is A - Azure Data Factory pipeline (via Azure Function Activity)

Translated flow with services:


App1 > {message} > Service Bus > Data Factory > {triggers} > Func1/2 (vendor) ;

Clues:
Transaction flow #2: App1 generates a message ... we can imply (implicitly) this goes into a queue which could be service bus (fifo or topics -
pub/sub model) or storage queues (standard).
Transaction flow #3: ***the crux of question***; we can infer the integration component "processes the message and triggers func1/2". This clearly
rules out service bus queues (holds messages, does not process), event grid (triggers, does not process) and event hubs (stream capture, does not
process). So we are left with Azure Data Factory where we can leverage "Azure Function Activity which allows you to run Azure functions" via a
linked service connection >>>> https://docs.microsoft.com/en-us/azure/data-factory/control-flow-azure-function-activity
upvoted 1 times


Question #3 Topic 3

HOTSPOT -
You have an existing implementation of Microsoft SQL Server Integration Services (SSIS) packages stored in an SSISDB catalog on your on-
premises network.
The on-premises network does not have hybrid connectivity to Azure by using Site-to-Site VPN or ExpressRoute.
You want to migrate the packages to Azure Data Factory.
You need to recommend a solution that facilitates the migration while minimizing changes to the existing packages. The solution must minimize
costs.
What should you recommend? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:

Box 1: Azure SQL database -


You can't create the SSISDB Catalog database on Azure SQL Database at this time independently of creating the Azure-SSIS Integration Runtime
in Azure Data
Factory. The Azure-SSIS IR is the runtime environment that runs SSIS packages on Azure.
Box 2: Azure-SQL Server Integration Service Integration Runtime and self-hosted integration runtime
The Integration Runtime (IR) is the compute infrastructure used by Azure Data Factory to provide data integration capabilities across different
network environments. Azure-SSIS Integration Runtime (IR) in Azure Data Factory (ADF) supports running SSIS packages.
Self-hosted integration runtime can be used for data movement in this scenario.
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/create-azure-integration-runtime https://docs.microsoft.com/en-us/sql/integration-
services/lift-shift/ssis-azure-connect-to-catalog-database

  uzairahm007 1 month, 4 weeks ago


Why Self hosted IR?
This article describes how to run SQL Server Integration Services (SSIS) packages on an Azure-SSIS Integration Runtime (Azure-SSIS IR) in Azure
Data Factory with a self-hosted integration runtime (self-hosted IR) configured as a proxy.

With this feature, you can access data on-premises without having to join your Azure-SSIS IR to a virtual network. The feature is useful when your

corporate network has a configuration too complex or a policy too restrictive for you to inject your Azure-SSIS IR into it.
https://docs.microsoft.com/en-us/azure/data-factory/self-hosted-integration-runtime-proxy-ssis
As the Azure cloud does not have connectivity to the on-premises network, you would need to implement a self-hosted IR as well.
upvoted 5 times

  jhoomtv 1 month, 1 week ago


because the data is coming from on-prem, you need self-hosted
upvoted 1 times

  Sasi27 3 weeks, 3 days ago


so what's the answer then?
upvoted 1 times

  heany 3 weeks, 1 day ago


The second one should be the Azure-SSIS IR only, as self-hosted runs on the on-premises network. But it is also mentioned that 'The on-premises
network does not have hybrid connectivity to Azure by using Site-to-Site VPN or ExpressRoute'.
upvoted 1 times

  rizabeer 2 weeks, 5 days ago


In my opinion the answer is correct, as per your reference @uzairahm007, "This article describes how to run SQL Server Integration Services (SSIS)
packages on an Azure-SSIS Integration Runtime (Azure-SSIS IR) in Azure Data Factory with a self-hosted integration runtime (self-hosted IR)
configured as a proxy.

With this feature, you can access data on-premises without having to join your Azure-SSIS IR to a virtual network. The feature is useful when your
corporate network has a configuration too complex or a policy too restrictive for you to inject your Azure-SSIS IR into it." So both self hosted and
Azure SSIS IR are needed for this feature to work.
upvoted 3 times

  glam 1 week, 3 days ago


Box 1: Azure SQL database -
Box 2: Azure-SQL Server Integration Service Integration Runtime and self-hosted integration runtime
upvoted 3 times


Question #4 Topic 3

You have 70 TB of files on your on-premises file server.


You need to recommend a solution for importing data to Azure. The solution must minimize cost.
What Azure service should you recommend?

A. Azure StorSimple

B. Azure Batch

C. Azure Data Box

D. Azure Stack

Correct Answer: C
Microsoft has engineered a solution that helps customers get their data to the Azure public cloud in a cost-effective, secure, and efficient
manner. The solution is called Data Box and is in general availability status. It is a rugged device that gives organizations 100 TB of capacity
onto which to copy their data and then send it to be transferred to Azure.
Incorrect Answers:
A: StorSimple would not be able to handle 70 TB of data.
Reference:
https://www.vembu.com/blog/what-is-microsoft-azure-data-box-disk-edge-heavy-gateway-overview/

  speedminer 5 months ago


https://docs.microsoft.com/en-us/azure/databox/data-box-overview
80TB max for Databox
upvoted 6 times

  Hanger_Man 2 months, 1 week ago


correct
upvoted 2 times

  sumedh01 2 months, 1 week ago


Storage capacity: the 100 TB device has 80 TB usable capacity after RAID 5 protection
upvoted 3 times

  Blaaa 2 weeks, 5 days ago


Correct
upvoted 1 times

  milind8451 2 weeks ago


Right ans.
upvoted 1 times

  glam 1 week, 3 days ago


C. Azure Data Box
upvoted 1 times


Question #5 Topic 3

You have an Azure subscription that contains 100 virtual machines.


You plan to design a data protection strategy to encrypt the virtual disks.
You need to recommend a solution to encrypt the disks by using Azure Disk Encryption. The solution must provide the ability to encrypt operating
system disks and data disks.
What should you include in the recommendation?

A. a certificate

B. a key

C. a passphrase

D. a secret

Correct Answer: B
For enhanced virtual machine (VM) security and compliance, virtual disks in Azure can be encrypted. Disks are encrypted by using
cryptographic keys that are secured in an Azure Key Vault. You control these cryptographic keys and can audit their use.
Reference:
https://docs.microsoft.com/en-us/azure/virtual-machines/windows/encrypt-disks

  speedminer 5 months ago


Azure Disk Encryption requires an Azure Key Vault to control and manage disk encryption keys and secrets. Your key vault and VMs must reside in
the same Azure region and subscription.
upvoted 12 times

  David_986969 4 months, 1 week ago


Why not a secret that configures the key as the secret?
upvoted 1 times

  biboker766 2 weeks, 5 days ago


with a data disk, a secret may not work
upvoted 1 times

  KakashiHatake 3 months, 4 weeks ago


Eventually, you are configuring a key only. B is correct.
upvoted 11 times

  saditya1 2 months, 2 weeks ago


B is correct
upvoted 9 times

  glam 1 week, 3 days ago


B. a key
upvoted 1 times


Question #6 Topic 3

You have data files in Azure Blob storage.


You plan to transform the files and move them to Azure Data Lake Storage.
You need to transform the data by using mapping data flow.
Which Azure service should you use?

A. Azure Data Box Gateway

B. Azure Storage Sync

C. Azure Data Factory

D. Azure Databricks

Correct Answer: C
You can use Copy Activity in Azure Data Factory to copy data from and to Azure Data Lake Storage Gen2, and use Data Flow to transform data
in Azure Data
Lake Storage Gen2.
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-data-lake-storage

  razvi 5 months ago


C is correct.
https://docs.microsoft.com/en-us/azure/data-factory/concepts-data-flow-overview#:~:text=Mapping%20data%20flows%20are%20visually%20designed%20data%20transformations,Factory%20pipelines%20that%20use%20scaled-out%20Apache%20Spark%20clusters.
upvoted 9 times

  speedminer 5 months ago


For Copy activity, with this connector you can:

Copy data from/to Azure Data Lake Storage Gen2 by using account key, service principal, or managed identities for Azure resources
authentications.
Copy files as-is or parse or generate files with supported file formats and compression codecs.
Preserve file metadata during copy.
Preserve ACLs when copying from Azure Data Lake Storage Gen1/Gen2.
upvoted 1 times

  certmonster 3 months, 2 weeks ago


Mapping data flows are visually designed data transformations in Azure Data Factory. Data flows allow data engineers to develop data
transformation logic without writing code. Data flow activities can be operationalized using existing Azure Data Factory scheduling, control, flow,
and monitoring capabilities.
upvoted 5 times

  kopper2019 1 month, 1 week ago


Data flow = Azure Data Factory
upvoted 3 times

  glam 1 week, 3 days ago


C. Azure Data Factory
upvoted 1 times


Question #7 Topic 3

You have an Azure virtual machine named VM1 that runs Windows Server 2019 and contains 500 GB of data files.
You are designing a solution that will use Azure Data Factory to transform the data files, and then load the files to Azure Data Lake Storage.
What should you deploy on VM1 to support the design?

A. the Azure Pipelines agent

B. the Azure File Sync agent

C. the On-premises data gateway

D. the self-hosted integration runtime in Azure

Correct Answer: D
The integration runtime (IR) is the compute infrastructure that Azure Data Factory uses to provide data-integration capabilities across different
network environments. For details about IR, see Integration runtime overview.
A self-hosted integration runtime can run copy activities between a cloud data store and a data store in a private network. It also can dispatch
transform activities against compute resources in an on-premises network or an Azure virtual network. The installation of a self-hosted
integration runtime needs an on-premises machine or a virtual machine inside a private network.
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/create-self-hosted-integration-runtime

  speedminer 5 months ago


The integration runtime (IR) is the compute infrastructure that Azure Data Factory uses to provide data-integration capabilities across different
network environments. For details about IR, see Integration runtime overview.

A self-hosted integration runtime can run copy activities between a cloud data store and a data store in a private network. It also can dispatch
transform activities against compute resources in an on-premises network or an Azure virtual network. The installation of a self-hosted integration
runtime needs an on-premises machine or a virtual machine inside a private network.

This article describes how you can create and configure a self-hosted IR.
upvoted 6 times

  stricker 4 months, 3 weeks ago


Isn't the question "What should you deploy on VM1 to support the design?", while the answer talks about deploying the self-hosted integration
runtime in Azure?
upvoted 1 times

  levianthan 4 months, 2 weeks ago


The name is a little misleading, the software has to be installed on your local machine.
upvoted 2 times

  pentum7 3 months, 1 week ago


The installation of a self-hosted integration runtime needs an on-premises machine OR a virtual machine inside a private network.
upvoted 2 times

  RandomUser 2 months, 2 weeks ago


Yeah, the SHIR makes sense because it would provide the compute necessary to run the actual transform and copy job. Also, you'd
somehow need to connect to the Windows Server. There is no mention of any file shares etc.

On top of that:
Azure Pipelines - wrong, they are a DevOps concept.
File Sync Agent - wrong, that would make the Windows Server a cache for Azure Files.
On-Premise data gateway - that's the same as our SHIR but for Power BI and Azure SSAS!
upvoted 1 times

  s8y 2 months, 4 weeks ago


So I need to transfer data from an Azure VM to the data lake; all the data is already in Azure. There is no need for any on-prem component like the
self-hosted integration runtime (D) or the on-premises data gateway (C). Am I reading it wrong?
upvoted 3 times

  mbn 1 month, 1 week ago


The on-premises data gateway is related to Power BI; the self-hosted integration runtime is for Azure Data Factory. See:
https://docs.microsoft.com/en-us/azure/data-factory/create-self-hosted-integration-runtime and
https://docs.microsoft.com/en-us/power-bi/connect-data/service-gateway-onprem
upvoted 3 times

  Phund 2 months, 2 weeks ago


I don't understand either. The VM and Data Factory are already in Azure, so why do we need a self-hosted IR or a gateway?


upvoted 1 times

  arseyam 2 months, 1 week ago


Read the question properly "You are designing a solution that will use Azure Data Factory to transform the data files, and then load the files
to Azure Data Lake Storage."
upvoted 3 times

  ForYEO 2 months, 2 weeks ago


answer is correct:
https://docs.microsoft.com/en-us/azure/data-factory/connector-file-system

"
If your data store is located inside an on-premises network, an Azure virtual network, or Amazon Virtual Private Cloud, you need to configure a self-hosted integration runtime to connect to it.
"
upvoted 7 times

  nohaph 3 weeks ago


Answer should be A --> Az pipeline because that will support the move of data and transformation from Az Data factory to Az Data lake storage
While the self-hosted IR is for --> The integration runtime to be used to connect to the data store. You can use the Azure integration runtime or a self-hosted integration runtime if your data store is in a private network. If this property isn't specified, the default Azure integration runtime is used.
reference -->
https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-data-lake-storage
upvoted 1 times

  milind8451 2 weeks ago


Ans is correct, I have tested in lab. The integration runtime (IR) is the setup that Azure Data Factory uses to provide data-integration capabilities
across different data sources. You need to install Integration Runtime on Azure VM and connect it to data factory by the given key. Now ADF will
recognize the data source and you can create pipelines.
upvoted 1 times

  glam 1 week, 3 days ago


D. the self-hosted integration runtime in Azure
upvoted 1 times


Question #8 Topic 3

HOTSPOT -
Your company is designing a multi-tenant application that will use elastic pools and Azure SQL databases. The application will be used by 30
customers.
You need to design a storage solution for the application. The solution must meet the following requirements:
✑ Operational costs must be minimized.
✑ All customers must have their own database.
✑ The customer databases will be in one of the following three Azure regions: East US, North Europe, or South Africa North.
What is the minimum number of elastic pools and Azure SQL Database servers required? To answer, select the appropriate options in the answer
area.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:

Box 1: 3 -
The server, its pools & databases must be in the same Azure region under the same subscription.

Box 2: 3 -
A server can have up to 5000 databases associated to it.
Reference:
https://vincentlauzon.com/2016/12/18/azure-sql-elastic-pool-overview/
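The counting logic can be sanity-checked in a few lines. An elastic pool lives on one server, and a server (with its pools and databases) lives in one region, so the minimum of each equals the number of distinct regions that actually hold databases. A minimal sketch; the even customer-to-region split is an illustrative assumption, not stated in the question:

```python
# A server and its elastic pools are bound to a single region, so the
# minimum number of servers (and of pools) is the count of distinct
# regions that hold at least one customer database.
def minimum_servers_and_pools(customer_regions):
    regions_in_use = set(customer_regions)
    return len(regions_in_use), len(regions_in_use)

# Illustrative: 30 customers spread evenly across the three regions.
regions = ["East US", "North Europe", "South Africa North"]
customers = [regions[i % len(regions)] for i in range(30)]

servers, pools = minimum_servers_and_pools(customers)
print(servers, pools)  # 3 3
```

If every customer happened to land in a single region, the same function would return (1, 1), which is the reading some commenters below argue for.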

  xAlx 2 months ago


Correct, 3 and 3
servers and pools must be in the same regions
upvoted 14 times

  i_am_gopinath 2 weeks, 4 days ago


It says the customer databases will be in one of the following 3 regions; shouldn't it be 1 and 1?
upvoted 2 times

  uzairahm007 1 month, 4 weeks ago


This is one of those weird MS questions. You don't know right now which region each customer's DB will be in, and you need to minimize cost, so why
would you buy three servers and three pools? What if all the customer DBs are in a single pool? :D
upvoted 2 times

  tmurfet 1 month, 3 weeks ago


They state that customers are in all three regions, so having one single pool would mean that customers in two regions would have inferior
performance.
upvoted 2 times

  Bullionaire 2 weeks, 2 days ago


I disagree, what it actually says is " The customer databases will be in one of the following three Azure regions: East US, North Europe, or
South Africa North."

So the answer is: 1 and 1


upvoted 2 times

  sejalo 2 weeks, 4 days ago


Where is it stated all three regions? The question says "The customer databases will be in one of the following 3 regions, isn't should be 1
and 1?"
upvoted 1 times

  nesith 1 month, 2 weeks ago


Answer should be 1 and 3. Key statements: 1) COST must be MINIMISED, 2) customer databases will be in ONE of the following regions. The key
for an elastic pool is to have databases in the same resource group; this is enforced by the DB server's resource group. So you can have a DB server
in any one of the regions with 3 DBs in that server and have them pooled in a single elastic pool. As per the given question, this is the correct answer.
upvoted 2 times

  kopper2019 1 month, 1 week ago


so if that the case why not 1 and 1?
upvoted 2 times

  jhoomtv 1 month, 1 week ago


The customer databases will be in one of the following 3 regions: R1, R2, R3
NOTE: "one" of the following, not in all 3 - Answer is 1, 1
upvoted 13 times


  umby 2 weeks, 3 days ago


Each of the 30 customers should have their own database....so answer should be 3 and 30
upvoted 1 times

  Voyagerjn 2 weeks, 2 days ago


Good article.
https://docs.microsoft.com/en-us/azure/azure-sql/database/saas-tenancy-app-design-patterns
upvoted 1 times

  MadEgg 2 weeks, 1 day ago


The question is just not perfectly clear:
"The customer databases will be in one of the following 3 regions, ..."
- one can understand we can choose where they should be installed
-> the answer would be 1, 1
- one can understand that there will be databases in the three regions (at least one DB)
-> the answer would be 3, 3

I think that they meant the second one so the answer is right.
upvoted 2 times

  milind8451 1 week, 5 days ago


3, 3 is the right ans because you need to have 3 azure SQL servers to hold databases in 3 regions, you can not add databases from another region
into a sql server. Thus you need 3 elastic pools as well coz elastic pool will work on a single server Dbs you can spread it to many servers.
upvoted 3 times

  milind8451 5 days, 11 hours ago


Correction in my last line: *you can NOT spread it to many servers.
upvoted 1 times


  glam 1 week, 3 days ago


Box 1: 3 -
The server, its pools & databases must be in the same Azure region under the same subscription.
Box 2: 3 -
upvoted 1 times


Question #9 Topic 3

You are reviewing an Azure architecture as shown in the Architecture exhibit. (Click the Architecture tab.)

The estimated monthly costs for the architecture are shown in the Costs exhibit. (Click the Costs tab.)

The log files are generated by user activity on Apache web servers. The log files are in a consistent format. Approximately 1 GB of logs is
generated per day.
Microsoft Power BI is used to display weekly reports of the user activity.
You need to recommend a solution to minimize costs while maintaining the functionality of the architecture.
What should you recommend?

A. Replace Azure Synapse Analytics and Azure Analysis Services with SQL Server on an Azure virtual machine.

B. Replace Azure Synapse Analytics with Azure SQL Database Hyperscale.

C. Replace Azure Data Factory with CRON jobs that use AzCopy.

D. Replace Azure Databricks with Azure Machine Learning.

Correct Answer: C
AzCopy is a command-line utility that you can use to copy blobs or files to or from a storage account.
Cron is one of the most useful utilities that you can find in any Unix-like operating system. It is used to schedule commands at a specific time.
These scheduled commands or tasks are known as "cron jobs".
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-configure
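The decisive phrase is that the log files are in a consistent format, so no per-row transformation is needed before loading, and a scheduled copy is enough. As an aside on what "consistent format" means for Apache access logs, here is a minimal sketch that parses one common-log-format line; the sample line and field names are illustrative:

```python
import re

# Apache common log format: host ident authuser [date] "request" status size
CLF = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) (?P<size>\d+|-)'
)

def parse_line(line):
    # Returns a dict of named fields, or None if the line doesn't match.
    match = CLF.match(line)
    return match.groupdict() if match else None

sample = '10.0.0.5 - - [05/Feb/2021:10:15:32 +0000] "GET /index.html HTTP/1.1" 200 1043'
fields = parse_line(sample)
print(fields["status"], fields["host"])  # 200 10.0.0.5
```

Because every line follows this one pattern, a plain scheduled copy preserves everything the downstream reports need, with no ETL step in between.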

  DIMONANDREY 2 months ago


Not sure how ADF could cost almost 5000 per month if it only gets 1 GB per day; that seems almost impossible. Even a hosted VM with a self-hosted IR would be
cheaper. I would maybe rather replace Synapse and AAS, but yes, if ADF somehow costs 5000 per month, then replacing ADF is better.
upvoted 2 times

  uzairahm007 1 month, 4 weeks ago


I agree, another one of those weird questions. Check the budget example here: it talks about 1 TB per day for an entire month and the costs are
still cheap.
https://docs.microsoft.com/en-us/azure/data-factory/plan-manage-costs
upvoted 1 times

  gcpjay 1 month, 3 weeks ago


The question states that "The log files are in a consistent format." So, no ETL is required. The files can be copied directly using the AzCopy
command.
upvoted 19 times

  milind8451 1 week, 5 days ago


Right answer. If log files are consistent then AzCopy is enough to copy data and it will save cost of ADF.
upvoted 1 times


  glam 1 week, 3 days ago


C. Replace Azure Data Factory with CRON jobs that use AzCopy.
upvoted 2 times

  FK2974 4 days, 5 hours ago


I've asked an MS trainer about this question, and the correct answer is C!!
upvoted 1 times

Question #10 Topic 3

You deploy Azure App Service Web Apps that connect to on-premises Microsoft SQL Server instances by using Azure ExpressRoute. You plan to
migrate the
SQL Server instances to Azure.
Migration of the SQL Server instances to Azure must:
✑ Support automatic patching and version updates to SQL Server.
✑ Provide automatic backup services.
✑ Allow for high-availability of the instances.
✑ Provide a native VNET with private IP addressing.
✑ Encrypt all data in transit.
✑ Be in a single-tenant environment with dedicated underlying infrastructure (compute, storage).
You need to migrate the SQL Server instances to Azure.
Which Azure service should you use?

A. SQL Server in a Docker container running on Azure Container Instances (ACI)

B. SQL Server in Docker containers running on Azure Kubernetes Service (AKS)

C. SQL Server Infrastructure-as-a-Service (IaaS) virtual machine (VM)

D. Azure SQL Database Managed Instance

E. Azure SQL Database with elastic pools

Correct Answer: D
Azure SQL Database Managed Instance configured for Hybrid workloads. Use this topology if your Azure SQL Database Managed Instance is
connected to your on-premises network. This approach provides the most simplified network routing and yields maximum data throughput
during the migration.
Reference:
https://docs.microsoft.com/en-us/azure/dms/resource-network-topologies

  H 1 month, 3 weeks ago


Correct answer
upvoted 7 times

  kopper2019 1 month, 1 week ago


correct
upvoted 1 times

  milind8451 1 week, 5 days ago


Right ans.
upvoted 1 times

  glam 1 week, 3 days ago


D. Azure SQL Database Managed Instance
upvoted 1 times


Question #11 Topic 3

You plan to store data in Azure Blob storage for many years. The stored data will be accessed rarely.
You need to ensure that the data in Blob storage is always available for immediate access. The solution must minimize storage costs.
Which storage tier should you use?

A. Cool

B. Archive

C. Hot

Correct Answer: A
Data in the cool access tier can tolerate slightly lower availability, but still requires high durability, retrieval latency, and throughput
characteristics similar to hot data. For cool data, a slightly lower availability service-level agreement (SLA) and higher access costs compared
to hot data are acceptable trade-offs for lower storage costs.
Incorrect Answers:
B: Archive storage stores data offline and offers the lowest storage costs but also the highest data rehydration and access costs.
Archive - Optimized for storing data that is rarely accessed and stored for at least 180 days with flexible latency requirements (on the order of
hours).
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-storage-tiers

  nasascientist 1 month, 4 weeks ago


Cool tier is correct answer.. hot tier will be costlier and accessing data from archive tier takes time.
upvoted 10 times

  glam 1 week, 3 days ago


A. Cool
upvoted 1 times


Question #12 Topic 3

DRAG DROP -
You are designing a virtual machine that will run Microsoft SQL Server and will contain two data disks. The first data disk will store log files, and
the second data disk will store data. Both disks are P40 managed disks.
You need to recommend a caching policy for each disk. The policy must provide the best overall performance for the virtual machine while
preserving integrity of the SQL data and logs.
Which caching policy should you recommend for each disk? To answer, drag the appropriate policies to the correct disks. Each policy may be used
once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
Select and Place:

Correct Answer:

Reference:
https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sql/virtual-machines-windows-sql-performance
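The recommendation reduces to a small decision table keyed on the disk's dominant I/O pattern, per the guidance linked above. A sketch of that table; the role keys ("log", "data", "tempdb") are illustrative names, while None/ReadOnly are the actual host-cache settings:

```python
# Host-cache setting by disk role, following the SQL-on-Azure-VM
# performance guidance: write-heavy log disks get no cache, read-heavy
# data disks benefit from a read cache.
CACHE_POLICY = {
    "log": "None",       # sequential, write-heavy; a read cache adds nothing
    "data": "ReadOnly",  # read-heavy; caching reads is a significant win
    "tempdb": "ReadOnly",
}

def recommend_caching(disk_role):
    # Fall back to the safe setting for unknown roles.
    return CACHE_POLICY.get(disk_role, "None")

print(recommend_caching("log"), recommend_caching("data"))  # None ReadOnly
```

This matches the answer above: None for the log disk, ReadOnly for the data disk.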

  M4gnet1k 2 months ago


The answers provided are correct, here is the explanation and supporting websites:
Log: None—Log files have primarily write-heavy operations. Therefore, they do not benefit from the ReadOnly cache.
https://docs.microsoft.com/en-us/azure/virtual-machines/windows/premium-storage-performance#disk-caching

Data: readonly—If you have separate storage pools for the log and data files, enable read caching only on the storage pool for the data files.
https://docs.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/performance-guidelines-best-practices
upvoted 16 times

  xAlx 2 months ago


Correct
https://docs.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/performance-guidelines-best-practices#disks-guidance

If you are using separate disks for data and log files, enable read caching on the data disks hosting your data files and TempDB data files. This can
result in a significant performance benefit. Do not enable caching on the disk holding the log file as this causes a minor decrease in performance.
upvoted 2 times

  JustDiscussing 1 month, 4 weeks ago


Following are the recommended disk cache settings for data disks (Table 9):

- None: configure host-cache as None for write-only and write-heavy disks.
- ReadOnly: configure host-cache as ReadOnly for read-only and read-write disks.
- ReadWrite: configure host-cache as ReadWrite only if your application properly handles writing cached data to persistent disks when needed.

https://docs.microsoft.com/en-us/azure/virtual-machines/premium-storage-performance#disk-caching
upvoted 1 times

  kopper2019 1 month, 1 week ago


correct
- Use premium SSDs for the best price/performance advantages. Configure Read only cache for data files and no cache for the log file.

upvoted 1 times

  glam 1 week, 3 days ago


Log: None—
Data: readonly—
upvoted 2 times


Question #13 Topic 3

You are designing a SQL database solution. The solution will include 20 databases that will be 20 GB each and have varying usage patterns.
You need to recommend a database platform to host the databases. The solution must meet the following requirements:
✑ The compute resources allocated to the databases must scale dynamically.
✑ The solution must meet an SLA of 99.99% uptime.
✑ The solution must have reserved capacity.
✑ Compute charges must be minimized.
What should you include in the recommendation?

A. 20 databases on a Microsoft SQL server that runs on an Azure virtual machine in an availability set

B. 20 instances of Azure SQL Database serverless

C. 20 databases on a Microsoft SQL server that runs on an Azure virtual machine

D. an elastic pool that contains 20 Azure SQL databases

Correct Answer: D
Azure SQL Database elastic pools are a simple, cost-effective solution for managing and scaling multiple databases that have varying and
unpredictable usage demands. The databases in an elastic pool are on a single server and share a set number of resources at a set price.
Elastic pools in Azure SQL Database enable
SaaS developers to optimize the price performance for a group of databases within a prescribed budget while delivering performance elasticity
for each database.
Guaranteed 99.995 percent uptime for SQL Database
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/database/elastic-pool-overview
https://azure.microsoft.com/en-us/pricing/details/sql-database/elastic/

  xAlx 2 months ago


Correct
upvoted 9 times

  KirubakaraSen 1 month ago


99.95%.?? Requirement is to have 99.99%.. Is this a typo..
upvoted 2 times

  joshyz73 1 month ago


Not a typo. The article linked and the explanation say 99.995%, which is even greater than 99.99% (i.e., 99.990%), so it's all correct.
upvoted 1 times

  arseyam 1 month ago


- Azure SQL Database Business Critical or Premium tiers configured as Zone Redundant Deployments have an availability guarantee of at least
99.995%.

- Azure SQL Database Business Critical or Premium tiers not configured for Zone Redundant Deployments, General Purpose, Standard, or Basic
tiers, or Hyperscale tier with two or more replicas have an availability guarantee of at least 99.99%.

https://azure.microsoft.com/en-us/support/legal/sla/sql-database/v1_5/
upvoted 3 times

  glam 1 week, 3 days ago


D. an elastic pool that contains 20 Azure SQL databases
upvoted 1 times


Question #14 Topic 3

You have an app named App1 that uses two on-premises Microsoft SQL Server databases named DB1 and DB2.
You plan to migrate DB1 and DB2 to Azure.
You need to recommend an Azure solution to host DB1 and DB2. The solution must meet the following requirements:
✑ Support server-side transactions across DB1 and DB2.
✑ Minimize administrative effort to update the solution.
What should you recommend?

A. two Azure SQL databases in an elastic pool

B. two Azure SQL databases on different Azure SQL Database servers

C. two Azure SQL databases on the same Azure SQL Database managed instance

D. two SQL Server databases on an Azure virtual machine

Correct Answer: C
SQL Managed Instance enables system administrators to spend less time on administrative tasks because the service either performs them for
you or greatly simplifies those tasks.
Note: Azure SQL Managed Instance is designed for customers looking to migrate a large number of apps from an on-premises or IaaS, self-
built, or ISV provided environment to a fully managed PaaS cloud environment, with as low a migration effort as possible. Using the fully
automated Azure Data Migration Service, customers can lift and shift their existing SQL Server instance to SQL Managed Instance, which offers
compatibility with SQL Server and complete isolation of customer instances with native VNet support. With Software Assurance, you can
exchange your existing licenses for discounted rates on SQL Managed Instance using the Azure Hybrid Bene t for SQL Server. SQL Managed
Instance is the best migration destination in the cloud for SQL Server instances that require high security and a rich programmability surface.
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/managed-instance/sql-managed-instance-paas-overview

  xAlx 2 months ago


Correct
https://docs.microsoft.com/en-us/azure/azure-sql/managed-instance/sql-managed-instance-paas-overview#azure-active-directory-integration
upvoted 3 times

  uzairahm007 1 month, 4 weeks ago


Correct Answer in my view is D.
When both the database management system and client are under the same ownership (e.g. when SQL Server is deployed to a virtual machine),
transactions are available and the lock duration can be controlled.
Reference:
https://docs.particular.net/nservicebus/azure/understanding-transactionality-in-azure
upvoted 3 times

  Elecktrus 1 month, 3 weeks ago


And what about 'minimize administrative effort to update the solution'?
In answer D, you have to install SQL Server on the Azure VM (that is expensive and takes a lot of effort).
So answers C and D can both support server-side transactions, but answer C is less expensive and easier to deploy.
upvoted 11 times

  andyR 1 month, 3 weeks ago


agreed, Microsoft is looking for you to recognize lift and shift with reduced administrative effort - Managed Instance
upvoted 2 times

  nesith 1 month, 2 weeks ago


The solution needs to support server-side transactions; as it stands, this feature for Azure SQL and Managed Instance is in preview:
https://docs.microsoft.com/en-us/azure/azure-sql/database/elastic-transactions-overview, so I think the intended answer was D
upvoted 3 times

  arseyam 1 month, 1 week ago


Answer is C - From your own link :)
A server-side experience (code written in stored procedures or server-side scripts) using Transact-SQL is available for Managed Instance only.
upvoted 2 times

  zic04 1 month, 1 week ago


D for me
upvoted 1 times


  xaccan 12 hours, 15 minutes ago


you are wrong it is C
upvoted 1 times

  arseyam 1 month, 1 week ago


Answer is C
A server-side experience (code written in stored procedures or server-side scripts) using Transact-SQL is available for Managed Instance only.

https://docs.microsoft.com/en-us/azure/azure-sql/database/elastic-transactions-overview#common-scenarios
upvoted 2 times

  kopper2019 1 month, 1 week ago


yes makes sense
upvoted 1 times

  kopper2019 1 month, 1 week ago


so C is the answer
upvoted 1 times

  bipin24x7 3 weeks, 5 days ago


C is correct. Also, note that the requirement says "Minimize administrative effort to update the solution". If we set up a VM and SQL databases, the
effort is greater.
upvoted 2 times

  sydon 3 weeks, 1 day ago


D is the right answer.
There is a similar question in AZ-303.
The managed instance does not support transactions across two databases
upvoted 1 times

  bavoh95047 2 days, 13 hours ago


It does support transaction between two databases in T-SQL, but support for c#/.NET is in preview. Therefore Managed Instance (answer C)
meets the requirement of allowing server-side transaction between two databases.
upvoted 1 times

  RasiR 3 weeks ago


Support server-side transactions across DB1 and DB2 - C and D
Minimize administrative effort to update the solution - C

But, C is still in preview mode.

Ref:
https://docs.microsoft.com/en-us/azure/azure-sql/database/elastic-transactions-overview#transact-sql-development-experience
upvoted 1 times

  Blaaa 2 weeks, 5 days ago


Correct
upvoted 1 times

  glam 1 week, 3 days ago


C. two Azure SQL databases on the same Azure SQL Database managed instance
upvoted 2 times

Topic 4 - Question Set 4


Question #1 Topic 4

Your company purchases an app named App1.


You plan to run App1 on seven Azure virtual machines in an Availability Set. The number of fault domains is set to 3. The number of update
domains is set to 20.
You need to identify how many App1 instances will remain available during a period of planned maintenance.
How many App1 instances should you identify?

A. 1

B. 2

C. 6

D. 7

Correct Answer: C
Only one update domain is rebooted at a time. Here there are 7 update domains with one VM each (and 13 update domains with no VMs).
Reference:
https://docs.microsoft.com/en-us/azure/virtual-machines/windows/manage-availability
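The answer can be checked by replaying the placement rule: VMs are spread round-robin across update domains, and planned maintenance reboots one update domain at a time, so the instances still running equal the total minus the largest update-domain group. A minimal sketch:

```python
from collections import Counter

def available_during_planned_maintenance(vm_count, update_domain_count):
    # Azure assigns VMs to update domains round-robin.
    per_domain = Counter(i % update_domain_count for i in range(vm_count))
    # Planned maintenance reboots one update domain at a time; the worst
    # case takes down the most-populated domain.
    return vm_count - max(per_domain.values())

print(available_during_planned_maintenance(7, 20))  # 6: one VM per domain
print(available_during_planned_maintenance(7, 3))   # 4: one domain holds 3 VMs
```

With 20 update domains and only 7 VMs, no update domain holds more than one VM, so 6 instances always remain available; the fault domain count matters only for unplanned failures.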

  speedminer 5 months ago


Seems accurate based on link provided.
upvoted 8 times

  Yamaneko 3 months, 2 weeks ago


There's no info on how many VMs are running; how do you know only 7?
upvoted 1 times

  rbnrecio 3 months, 2 weeks ago


Totally agree with you, I think in this question there is not total info.
upvoted 1 times

  pentum7 3 months, 1 week ago


"You plan to run App1 on SEVEN Azure virtual machines in an Availability Set."
upvoted 16 times

  i_am_gopinath 1 week, 5 days ago


good catch on keyword "SEVEN"
upvoted 2 times

  Virendrak 3 months ago


The update domain count is set to 20, so only one machine reboots at a time, meaning 6 machines will always be available. Since there are 7 virtual
machines in the availability set, with 7 or more update domains only one machine is in each update domain.
upvoted 8 times

  TestUser911 2 months, 3 weeks ago


correct
upvoted 2 times

  gm20 2 months, 2 weeks ago


VM|FD|UD
VM0|0|0
VM1|1|1
VM2|2|2
VM3|0|3
VM4|1|4
VM5|2|5
VM6|0|6

In summary, we have Update Domain 0(UD0): 6 VM


Update Domain 1(UD1): 6 VM
Update Domain 2(UD2): 6 VM
upvoted 9 times

  Hanger_Man 2 months, 1 week ago


correct answer given-6
upvoted 4 times

  StressiOlli 2 months ago


Just look at how often 3 fits into 20 => 6. Simple mathematics. Just a mnemonic. :)
upvoted 4 times

  m9878 4 weeks ago


Fault domains only come into play during unplanned maintenance. The question asks how many machines will be available during planned
maintenance, so the answer is 6, not 3.
upvoted 2 times

  joshyz73 1 month ago


That's not how you come to the answer at all. The number of fault domains (which is how many physical racks the VMs would be spread out on)
is meaningless in this question. The answer would be 6 even if there were 200 update domains. There are 7 running VMs. As long as the number
of update domains is >= the number of the VMs, you will never have more than 1 VM offline at any given time.
upvoted 4 times

  ArifUK 1 month ago


why 3 ?
upvoted 1 times

  Mouminster 1 month ago


because fault domain is 3
upvoted 1 times

  AJ_123 2 weeks, 2 days ago


Given answer is correct
upvoted 2 times

  milind8451 1 week, 5 days ago


There are 20 update domains, and app runs on 7 VMs so if you distribute 7 VMs across 20 update domains then still you have 13 slots left, If 1
update domain is updated at once then still 6 VMs are running at any time. Answer is correct.
upvoted 1 times
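The counting argument in the comments above can be sanity-checked with a short script. This is an illustrative sketch that assumes round-robin placement of VMs into update domains, which is how availability sets assign them:

```python
# Place 7 VMs round-robin across 20 update domains (UDs), then check how
# many VMs remain available while any single UD is rebooted for maintenance.
vm_count, ud_count = 7, 20

# VM i lands in update domain i % ud_count.
placement = [i % ud_count for i in range(vm_count)]

# Planned maintenance takes down one update domain at a time; the worst
# case is the UD holding the most VMs.
worst_case_down = max(placement.count(ud) for ud in range(ud_count))

print(vm_count - worst_case_down)  # prints 6
```

With 20 update domains and only 7 VMs, every VM gets its own domain, so at most one VM is ever down during planned maintenance.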

  glam 1 week, 3 days ago


C. 6 .....
upvoted 1 times


Question #2 Topic 4

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that
might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Storage v2 account named storage1.
You plan to archive data to storage1.
You need to ensure that the archived data cannot be deleted for five years. The solution must prevent administrators from deleting the data.
Solution: You create an Azure Blob storage container, and you configure a legal hold access policy.
Does this meet the goal?

A. Yes

B. No

Correct Answer: B
Use an Azure Blob storage container, but use a time-based retention policy instead of a legal hold.
Note:
Immutable storage for Azure Blob storage enables users to store business-critical data objects in a WORM (Write Once, Read Many) state. This
state makes the data non-erasable and non-modifiable for a user-specified interval. For the duration of the retention interval, blobs can be
created and read, but cannot be modified or deleted. Immutable storage is available for general-purpose v2 and Blob storage accounts in all
Azure regions.
Note: Set retention policies and legal holds
1. Create a new container or select an existing container to store the blobs that need to be kept in the immutable state. The container must be
in a general-purpose v2 or Blob storage account.
2. Select Access policy in the container settings. Then select Add policy under Immutable blob storage.

Either -
3a. To enable legal holds, select Add Policy. Select Legal hold from the drop-down menu.

Or -
3b. To enable time-based retention, select Time-based retention from the drop-down menu.
4. Enter the retention interval in days (acceptable values are 1 to 146000 days).
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-immutable-storage
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-immutability-policies-manage
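The semantics the answer relies on can be modeled in a few lines of Python. This is a simplified illustration of the rules described above, not the Azure SDK: a locked time-based retention policy cannot be removed and blocks deletion until the interval elapses, while a legal hold blocks deletion only as long as its tags remain.

```python
# Simplified model of immutable blob storage semantics (illustration only,
# not the Azure SDK).
from dataclasses import dataclass, field

@dataclass
class ContainerPolicy:
    retention_days: int = 0          # time-based retention interval (0 = none)
    locked: bool = False             # a locked policy cannot be removed
    legal_hold_tags: set = field(default_factory=set)

    def can_delete_blob(self, blob_age_days: int) -> bool:
        if self.legal_hold_tags:     # any remaining tag keeps the hold active
            return False
        return blob_age_days >= self.retention_days

    def remove_retention_policy(self) -> None:
        if self.locked:
            raise PermissionError("a locked time-based retention policy cannot be removed")
        self.retention_days = 0

# Time-based retention: not even an administrator can delete early.
timed = ContainerPolicy(retention_days=5 * 365, locked=True)
assert not timed.can_delete_blob(blob_age_days=200)
assert timed.can_delete_blob(blob_age_days=5 * 365)

# Legal hold: clearing the tags re-enables deletion at any time.
hold = ContainerPolicy(legal_hold_tags={"case-123"})
assert not hold.can_delete_blob(blob_age_days=10)
hold.legal_hold_tags.clear()
assert hold.can_delete_blob(blob_age_days=10)
```

This is why the solution does not meet the goal: a legal hold alone cannot guarantee five years of retention, because an administrator who clears the tags can then delete the data.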

  speedminer 5 months ago


Seems like the very next question/answer seems to say this is correct.
upvoted 1 times

  speedminer 5 months ago


Well...I guess the steps indicate that you would first create a time based retention policy, then a legal hold....so I'm unsure if this step assumes
you'd make the time based retention policy first like the steps in https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-immutability-policies-manage?tabs=azure-portal
or just make a legal hold.
upvoted 1 times

  speedminer 5 months ago


Legal hold: Immutable storage for Azure Blob storage enables users to store sensitive information that is critical to litigation or business use
in a tamper-proof state for the desired duration until the hold is removed. This feature is not limited only to legal use cases but can also be
thought of as an event-based hold or an enterprise lock, where the need to protect data based on event triggers or corporate policy is
required.

Legal hold policy support: If the retention interval is not known, users can set legal holds to store immutable data until the legal hold is
cleared. When a legal hold policy is set, blobs can be created and read, but not modified or deleted. Each legal hold is associated with a
user-defined alphanumeric tag (such as a case ID, event name, etc.) that is used as an identifier string.

https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-immutable-storage
upvoted 2 times

  speedminer 5 months ago


Based on that info, it does seem that turning on a legal hold would prevent the data from being deleted until the legal hold policy was
removed, even without the time retention policy.

upvoted 1 times

  speedminer 5 months ago


This should accomplish the requirements. The legal hold policy when applied to the data will both store it/preserve it from being
deleted by anyone.
upvoted 1 times

  Jonnerzzz 3 months, 2 weeks ago


Hi Speedminer. The requirement is to ensure data retention for 5 years. The legal hold can be applied and removed - so the data
can then be deleted. As such, this approach does not meet the requirement and therefore the answer is B: No
upvoted 3 times

  mchint01 4 months ago


If a time interval is specified - go for a time based retention policy;
else go for a legal hold. Since time is specified here, the answer would be yes..

Time-based retention policy support: Users can set policies to store data for a specified interval. When a time-based retention policy is set, blobs
can be created and read, but not modified or deleted. After the retention period has expired, blobs can be deleted but not overwritten.

Legal hold policy support: If the retention interval is not known, users can set legal holds to store immutable data until the legal hold is cleared.
When a legal hold policy is set, blobs can be created and read, but not modified or deleted. Each legal hold is associated with a user-defined
alphanumeric tag (such as a case ID, event name, etc.) that is used as an identifier string.

https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-immutable-storage
upvoted 4 times

  certmonster 3 months, 2 weeks ago


So it should be NO, right, because time is specified?
upvoted 2 times

  pentum7 3 months, 1 week ago


correct
upvoted 1 times

  Vippsy 3 months, 2 weeks ago


The answer is No. Legal Hold will do the job, but Microsoft best practice says that if you know the time period then create a Time Based Retention
Policy. If you dont know how long the data needs to be kept without being able to modify or delete then create a Legal Hold.
upvoted 4 times

  BoxMan 2 months, 3 weeks ago


No is correct. The question clearly states that even Admins can't delete the data.

With a legal hold the tags can be removed and therefore the data can be modified/deleted.

With a time-based policy the data is immutable.

FAQs on this pose the question with clear responses:

https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-immutable-storage#faq

"Can I remove a locked time-based retention policy or legal hold?


Only unlocked time-based retention policies can be removed from a container. Once a time-based retention policy is locked, it cannot be removed;
only effective retention period extensions are allowed. Legal hold tags can be deleted. When all legal tags are deleted, the legal hold is removed."
upvoted 7 times

  nesith 1 month, 2 weeks ago


however, once the retention period is over, you can delete the blob but can't edit it
upvoted 1 times

  Blaaa 2 weeks, 5 days ago


Correct
upvoted 1 times

  glam 1 week, 3 days ago


B. No
Time-based retention policy support is required
upvoted 1 times


Question #3 Topic 4

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that
might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Storage v2 account named storage1.
You plan to archive data to storage1.
You need to ensure that the archived data cannot be deleted for five years. The solution must prevent administrators from deleting the data.
Solution: You create a file share and snapshots.
Does this meet the goal?

A. Yes

B. No

Correct Answer: B
Instead you could create an Azure Blob storage container, and you con gure a legal hold access policy.
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-immutable-storage

  speedminer 5 months ago


https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-immutability-policies-manage?tabs=azure-portal
And instead create a time-based retention policy....
upvoted 12 times

  Vippsy 3 months, 2 weeks ago


The answer is correct but the description given is wrong.
upvoted 6 times

  glam 1 week, 3 days ago


B. No
Time-based retention policy support required
upvoted 2 times


Question #4 Topic 4

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that
might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Storage v2 account named storage1.
You plan to archive data to storage1.
You need to ensure that the archived data cannot be deleted for five years. The solution must prevent administrators from deleting the data.
Solution: You create a file share, and you configure an access policy.
Does this meet the goal?

A. Yes

B. No

Correct Answer: B
Instead of a file share, an immutable Blob storage is required.
Time-based retention policy support: Users can set policies to store data for a specified interval. When a time-based retention policy is set,
blobs can be created and read, but not modified or deleted. After the retention period has expired, blobs can be deleted but not overwritten.
Note: Set retention policies and legal holds
1. Create a new container or select an existing container to store the blobs that need to be kept in the immutable state. The container must be
in a general-purpose v2 or Blob storage account.
2. Select Access policy in the container settings. Then select Add policy under Immutable blob storage.
3. To enable time-based retention, select Time-based retention from the drop-down menu.
4. Enter the retention interval in days (acceptable values are 1 to 146000 days).
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-immutable-storage
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-immutability-policies-manage

  Nian 1 month, 3 weeks ago


I would say yes - An access policy is required to enable the time-based rentention policy
upvoted 1 times

  milind8451 1 week, 5 days ago


Data retention Access policy can be enabled only on Blob storage, not on any other type of storage.
upvoted 1 times

  gisbern 1 month, 2 weeks ago


Blob container, not a file share.
upvoted 2 times

  glam 1 week, 3 days ago


B. No
Instead of a file share, an immutable Blob storage is required and Time-based retention policy support
upvoted 1 times


Question #5 Topic 4

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that
might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an on-premises Hyper-V cluster that hosts 20 virtual machines. Some virtual machines run Windows Server 2016 and some run Linux.
You plan to migrate the virtual machines to an Azure subscription.
You need to recommend a solution to replicate the disks of the virtual machines to Azure. The solution must ensure that the virtual machines
remain available during the migration of the disks.
Solution: You recommend implementing an Azure Storage account, and then running AzCopy.
Does this meet the goal?

A. Yes

B. No

Correct Answer: B
AzCopy only copies files, not disks.
Instead use Azure Site Recovery.
Reference:
https://docs.microsoft.com/en-us/azure/site-recovery/site-recovery-overview

  speedminer 5 months ago


https://docs.microsoft.com/en-us/azure/storage/common/storage-use-azcopy-v10
Seems accurate
upvoted 2 times

  Jonnerzzz 3 months, 2 weeks ago


The answer is correct, but use Azure Migrate instead of Azure Site Recovery (which is for BCDR purposes). This is a migration strategy and can be
implemented without disrupting operations. https://docs.microsoft.com/en-us/azure/migrate/tutorial-migrate-hyper-v#replicate-hyper-v-vms
upvoted 4 times

  RandomUser 2 months, 2 weeks ago


It might be nitpicking, but of course you can use AzCopy to move the disks to Azure while they are attached using Volume Shadow Copy. But in
contrast to using Azure Migrate, you will lose everything that happened after this snapshot.
upvoted 1 times

  glam 1 week, 3 days ago


B. No ....
upvoted 1 times


Question #6 Topic 4

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that
might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an on-premises Hyper-V cluster that hosts 20 virtual machines. Some virtual machines run Windows Server 2016 and some run Linux.
You plan to migrate the virtual machines to an Azure subscription.
You need to recommend a solution to replicate the disks of the virtual machines to Azure. The solution must ensure that the virtual machines
remain available during the migration of the disks.
Solution: You recommend implementing an Azure Storage account that has a file service and a blob service, and then using the Data Migration
Assistant.
Does this meet the goal?

A. Yes

B. No

Correct Answer: B
Data Migration Assistant is used to migrate SQL databases.
Instead use Azure Site Recovery.
Reference:
https://docs.microsoft.com/en-us/azure/site-recovery/site-recovery-overview

  speedminer 5 months ago


https://docs.microsoft.com/en-us/azure/cosmos-db/import-data
Seems accurate
upvoted 2 times

  Jonnerzzz 3 months, 2 weeks ago


Again, the answer is correct, but use Azure Migrate instead of Azure Site Recovery (which is for BCDR purposes). This is a migration strategy and
can be implemented without disrupting operations. https://docs.microsoft.com/en-us/azure/migrate/tutorial-migrate-hyper-v#replicate-hyper-v-vms
upvoted 9 times

  glam 1 week, 3 days ago


B. No ....
upvoted 1 times


Question #7 Topic 4

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that
might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an on-premises Hyper-V cluster that hosts 20 virtual machines. Some virtual machines run Windows Server 2016 and some run Linux.
You plan to migrate the virtual machines to an Azure subscription.
You need to recommend a solution to replicate the disks of the virtual machines to Azure. The solution must ensure that the virtual machines
remain available during the migration of the disks.
Solution: You recommend implementing a Recovery Services vault, and then using Azure Site Recovery.
Does this meet the goal?

A. Yes

B. No

Correct Answer: A
Site Recovery can replicate on-premises VMware VMs, Hyper-V VMs, physical servers (Windows and Linux), Azure Stack VMs to Azure.
Note: Site Recovery helps ensure business continuity by keeping business apps and workloads running during outages. Site Recovery replicates
workloads running on physical and virtual machines (VMs) from a primary site to a secondary location. When an outage occurs at your primary
site, you fail over to secondary location, and access apps from there. After the primary location is running again, you can fail back to it.
Reference:
https://docs.microsoft.com/en-us/azure/site-recovery/site-recovery-overview

  KumarPV 1 month, 3 weeks ago


Agreed. 'Yes' is right answer.
upvoted 2 times

  bingomutant 1 month, 2 weeks ago


It would do the job but I would say the answer is No as there are solutions which better satisfy the requirements - especially Azure Migrate.
upvoted 1 times

  WillCloud 1 week, 5 days ago


Should be Yes.
This article describes how to enable replication for on-premises VMware VMs, for disaster recovery to Azure using the Azure Site Recovery
service.
https://docs.microsoft.com/en-us/azure/site-recovery/vmware-azure-tutorial
upvoted 1 times

  buanilk 2 weeks, 4 days ago


Correct.
Azure site recovery supports managed disk & can be attached to VM during failover or migration from on-prem server.
upvoted 1 times

  milind8451 1 week, 5 days ago


Answer is correct but Azure Migrate best fits for migration while ASR is for BCP/DR but both can be used for replicating VMs from On-prem to
Azure.
upvoted 2 times

  bavoh95047 2 days, 13 hours ago


Yeah, in case there would be an option for Azure Migrate, that would be the best choice. But for the given choices, ASR seems the best fit. All in
all, Migrate uses ASR under the hood.
upvoted 1 times

  glam 1 week, 3 days ago


A. Yes .....
upvoted 1 times

  xaccan 12 hours ago


Before, Microsoft's advice was to use Azure Site Recovery for migrations. In the previous Azure exam (AZ-301), you would've been tested on ASR
migrations, instead of Azure Migrate. In the updated exam (AZ-304), they now test on Azure Migrate - so you can see that Azure Migrate is what
they're suggesting people use for migrations nowadays.

The answer is correct here (Azure Site Recovery) because the question is from previous exam 301 and not 304.
upvoted 1 times

  xaccan 11 hours, 57 minutes ago


By the way this question will not appear at all in the exam. it is old and for 304 they suggest to use azure migrate.
upvoted 1 times


Question #8 Topic 4

You are designing a storage solution that will use Azure Blob storage. The data will be stored in a cool access tier or an archive access tier based
on the access patterns of the data.
You identify the following types of infrequently accessed data:
✑ Telemetry data: Deleted after two years
✑ Promotional material: Deleted after 14 days
✑ Virtual machine audit data: Deleted after 200 days
A colleague recommends using the archive access tier to store the data.
Which statement accurately describes the recommendation?

A. Storage costs will be based on a minimum of 30 days.

B. Access to the data is guaranteed within five minutes.

C. Access to the data is guaranteed within 30 minutes.

D. Storage costs will be based on a minimum of 180 days.

Correct Answer: D
The following table shows a comparison of premium performance block blob storage, and the hot, cool, and archive access tiers.

Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-storage-tiers
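The 180-day minimum can be made concrete with a small billing sketch: a blob deleted before the minimum period is charged an early-deletion fee equivalent to storing it for the remaining days. The function below is illustrative, not an official pricing formula.

```python
# Archive-tier billing sketch: storage is billed for at least 180 days,
# so a blob deleted early is charged for the remaining days anyway.
ARCHIVE_MIN_DAYS = 180   # the cool tier minimum is 30 days

def billed_days(actual_days: int, minimum_days: int = ARCHIVE_MIN_DAYS) -> int:
    """Days of storage you pay for, given how long the blob actually lived."""
    return max(actual_days, minimum_days)

# The three data types from the question:
for name, lifetime in [("telemetry", 2 * 365), ("promotional", 14), ("vm-audit", 200)]:
    print(f"{name}: kept {lifetime} days, billed {billed_days(lifetime)} days")
```

Promotional material kept for only 14 days would still be billed for 180 days in the archive tier, which is why the data's access patterns and lifespan matter when choosing between cool and archive.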

  KumarPV 1 month, 3 weeks ago


Very good explanation!
upvoted 7 times

  glam 1 week, 3 days ago


D. Storage costs will be based on a minimum of 180 days.
upvoted 1 times

  Gilmour 10 hours, 44 minutes ago


Does not make sense. Data in the Cool storage tier should stay for at least 30 days. So if its lifespan is shorter than 30 days, as is the case with
Promotional Material, penalties will be incurred because the more expensive storage tier (Hot) should have been used.
upvoted 1 times


Question #9 Topic 4

You are planning to deploy an application named App1 that will run in containers on Azure Kubernetes Service (AKS) clusters. The AKS clusters
will be distributed across four Azure regions.
You need to recommend a storage solution for App1. Updated container images must be replicated automatically to all the AKS clusters.
Which storage solution should you recommend?

A. Azure Cache for Redis

B. Azure Content Delivery Network (CDN)

C. Premium SKU Azure Container Registry

D. geo-redundant storage (GRS) accounts

Correct Answer: C
Enable geo-replication for container images.
Best practice: Store your container images in Azure Container Registry and geo-replicate the registry to each AKS region.
To deploy and run your applications in AKS, you need a way to store and pull the container images. Container Registry integrates with AKS, so it
can securely store your container images or Helm charts. Container Registry supports multimaster geo-replication to automatically replicate
your images to Azure regions around the world.
Geo-replication is a feature of Premium SKU container registries.
Note:
When you use Container Registry geo-replication to pull images from the same region, the results are:
Faster: You pull images from high-speed, low-latency network connections within the same Azure region.
More reliable: If a region is unavailable, your AKS cluster pulls the images from an available container registry.
Cheaper: There's no network egress charge between datacenters.
Reference:
https://docs.microsoft.com/en-us/azure/aks/operator-best-practices-multi-region

  speedminer 5 months ago


https://docs.microsoft.com/en-us/azure/aks/operator-best-practices-multi-region
Seems accurate
upvoted 5 times

  speedminer 5 months ago


Geo-Replication is a premium SKU container registry feature
upvoted 4 times

  milind8451 1 week, 5 days ago


Azure Container Registry is used to store images for AKS so the answer is correct.
upvoted 1 times

  glam 1 week, 3 days ago


C. Premium SKU Azure Container Registry
upvoted 1 times


Question #10 Topic 4

You have an on-premises network and an Azure subscription. The on-premises network has several branch offices.
A branch office in Toronto contains a virtual machine named VM1 that is configured as a file server. Users access the shared files on VM1 from all
the offices.
You need to recommend a solution to ensure that the users can access the shared files as quickly as possible if the Toronto branch office is
inaccessible.
What should you include in the recommendation?

A. an Azure file share and Azure File Sync

B. a Recovery Services vault and Windows Server Backup

C. a Recovery Services vault and Azure Backup

D. Azure blob containers and Azure File Sync

Correct Answer: A


Use Azure File Sync to centralize your organization's file shares in Azure Files, while keeping the flexibility, performance, and compatibility of an
on-premises file server. Azure File Sync transforms Windows Server into a quick cache of your Azure file share.
You need an Azure file share in the same region that you want to deploy Azure File Sync.
C: Backups would be a slower solution.
Reference:
https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-deployment-guide

  speedminer 5 months ago


According to https://docs.microsoft.com/en-us/azure/storage/files/storage-sync-files-deployment-guide

It should be A
upvoted 38 times

  levianthan 4 months, 2 weeks ago


I'd pick A from the given options, but honestly, it's time to migrate to SharePoint Online. Access from any other office than Toronto must be a
nightmare anyway.
upvoted 9 times

  andyR 1 month, 3 weeks ago


mentioning SharePoint - sends a shiver down my spine
upvoted 7 times

  pentum7 2 months, 4 weeks ago


LOL, couldn't agree more; The CTO must be fired from this company
upvoted 5 times

  rngodev 3 months, 4 weeks ago


The answer is A
upvoted 12 times

  dwild 3 months ago


Answer is A final answer
upvoted 10 times

  KirubakaraSen 1 month ago


Answer not selected. Exam Topics support team, please check this.
upvoted 4 times

  milind8451 1 week, 4 days ago


Ans is A) an Azure file share and Azure File Sync. You need Azure File sync to sync your file share on-prem to Azure File share so users can access it
from Azure.
upvoted 1 times

  glam 1 week, 3 days ago


A. an Azure file share and Azure File Sync
upvoted 1 times


Question #11 Topic 4

DRAG DROP -
The developers at your company are building a static web app to support users sending text messages. The app must meet the following
requirements:
✑ Website latency must be consistent for users in different geographical regions.
✑ Users must be able to authenticate by using Twitter and Facebook.
✑ Code must include only HTML, native JavaScript, and jQuery.
✑ Costs must be minimized.
Which Azure service should you use to complete the architecture? To answer, drag the appropriate services to the correct locations. Each service
may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:

Correct Answer:

Box 1: Azure App Service plan (Basic)


With App Service you can authenticate your customers with Azure Active Directory, and integrate with Facebook, Twitter, Google.

Box 2: Azure Functions -


You can send SMS messages with Azure Functions with Javascript.
Reference:
https://docs.microsoft.com/en-us/azure/active-directory-b2c/partner-whoiam
https://www.codeproject.com/Articles/1368337/Implementing-SMS-API-using-Azure-Serverless-Functi

  GeorgeKir 2 months ago


Azure CDN not app service.
upvoted 17 times

  idris 2 months ago


Agree.
The key part here is "..a static web app to...". And also only client-side script is required. And latency across geographies expected. Should be
Azure CDN

upvoted 2 times

  pattasana 2 months ago


I'll go with Azure Logic app for the second box
upvoted 5 times

  uzairahm007 1 month, 4 weeks ago


I think Logic app is more drag and drop and the requirement dictates:
" Code must include only HTML, native JavaScript, and jQuery."
and you could develop Azure Functions in native Javascript
https://docs.microsoft.com/en-us/azure/azure-functions/functions-reference-node?tabs=v2
upvoted 10 times

  tmurfet 1 month, 3 weeks ago


Agreed, if there is code involved then it's Azure Function not Logic App.
upvoted 6 times

  Elecktrus 1 month, 3 weeks ago


Answer 1 is CDN, because of the minimized cost and latency across regions. In addition, you must use only HTML, JavaScript and jQuery (that is, client-side
code).
CDN supports html/javascript without problem - https://docs.microsoft.com/en-us/javascript/api/overview/azure/cdn?view=azure-node-latest#overview
upvoted 10 times

  shapgi 1 month, 2 weeks ago


Whats the final answer??
upvoted 2 times

  nesith 1 month, 2 weeks ago


Azure CDN, Function; the service needed to send SMS from a Logic App is still in public preview, and in any case it would mean additional cost
compared to using a function
upvoted 2 times

  Rayrichi 1 month, 2 weeks ago


1. Azure CDN because of the following:
-> Website latency must be consistent for users in different geographical regions.
-> Users must be able to authenticate by using Twitter and Facebook.
-> Code must include only HTML, native JavaScript, and jQuery.
-> Costs must be minimized.

2. Azure Logic app to Send SMS


-> Costs must be minimized.
upvoted 5 times

  Azure_Chief 1 month, 2 weeks ago


This website is hosted using storage (the box in the middle), so according to this documentation, you would need a CDN.
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-static-website#multi-region-website-hosting
upvoted 3 times

  jhoomtv 1 month, 1 week ago


but how will users authenticate? For example: you need to host your auth logic code somewhere unless you have a JavaScript auth function. Looks
like it does - https://techcommunity.microsoft.com/t5/azure-active-directory-identity/azure-ad-b2c-now-has-javascript-customization-and-many-more-new/ba-p/353595
upvoted 2 times

  WillCloud 1 week, 5 days ago


The diagram has "Azure AD B2C" already for this.
Tutorial: Add identity providers to your applications in Azure Active Directory B2C
https://docs.microsoft.com/en-us/azure/active-directory-b2c/tutorial-add-identity-providers
upvoted 1 times

  jhoomtv 1 month, 1 week ago


Azure Functions can send text message (SMS): https://docs.microsoft.com/en-us/learn/modules/send-location-over-sms-using-azure-functions-twilio/

App Logic can also do: https://docs.microsoft.com/en-us/azure/communication-services/quickstarts/telephony-sms/logic-app


upvoted 1 times

  tanzaman 1 month, 1 week ago


Answer should be CDN and Function
upvoted 4 times

  kopper2019 1 month, 1 week ago


it should be CDN and Logic App
upvoted 2 times

  fudu101 1 month, 1 week ago


How would the SMS service be triggered? That's key to choosing between Functions and Logic Apps.
upvoted 2 times

  fudu101 1 month, 1 week ago


Logic Apps for the second box makes sense: both can get triggered by Event Grid, and both can use Twilio to send SMS. The requirement says native
JavaScript, and when you write a Function that's not native JS. Also, in a Logic App no code is required to send the SMS, so Logic Apps ticks all the boxes here.
upvoted 2 times

  kellyrogers556 1 month ago


https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-static-website
upvoted 1 times

  fredam2020 4 weeks ago


remember there's cost minimized requirement, so correct
upvoted 1 times

  bbartek 2 weeks, 6 days ago


I don't think app service would be cheaper option than serving static app with CDN
upvoted 1 times

  bipin24x7 3 weeks, 5 days ago


Azure CDN, Function - The reason for Function is 1) It can be called easily with Http Trigger from website. With consumption plan, cost can be
reduced compared to Logic App.
upvoted 1 times

  bipin24x7 3 weeks, 5 days ago


Also, please review the link: https://docs.microsoft.com/en-us/learn/modules/send-location-over-sms-using-azure-functions-twilio/
upvoted 1 times

  zeeshankaizer 2 weeks, 6 days ago


What is cheaper, Logic Apps or Azure Functions? I guess that will decide the second part.
upvoted 1 times

  Blaaa 2 weeks, 5 days ago


CDN and function
upvoted 1 times

  SAMBIT 2 weeks, 4 days ago


Answer has to be: Function, Logic App. Function because you can configure authentication against B2C and pick Facebook or Twitter, and Logic App because it minimizes the complexity of implementing the SMS sending using connectors.
upvoted 2 times

  WillCloud 1 week, 5 days ago


1: CDN, to meet "Website latency must be consistent for users in different geographical regions"
2: Functions: to meet " Code must include only HTML, native JavaScript, and jQuery"

For " Users must be able to authenticate by using Twitter and Facebook.", actually the diagram has "Azure AD B2C" already for this.
https://docs.microsoft.com/en-us/azure/active-directory-b2c/tutorial-add-identity-providers
upvoted 2 times

  jonasis 1 week, 4 days ago


Agree with this, the question doesn't say anything about number of requests so we cannot compare costs of Logic Apps vs Functions. jQuery +
HTML + JS is the keyword here which points us to Function Apps
upvoted 1 times

  milind8451 1 week, 4 days ago


The boxes say that the website is already hosted in Azure Storage, and one statement says "Website latency must be consistent for users in different geographical regions", so Azure CDN is a perfect fit here. Azure App Service shouldn't be used for static websites, as Azure Storage is cheaper and can do the same job.

https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-static-website

For the 2nd box, practically both Azure Logic Apps and Azure Functions can be used to send SMS, but since we can trigger the SMS using code (statement - Code must include only HTML, native JavaScript, and jQuery) in Functions, that is the preferred solution in this scenario.
upvoted 3 times
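The static-website-on-Azure-Storage setup described in the comment above can be sketched with the Azure CLI; the storage account name below is a placeholder, not from the question:

```shell
# Enable static website hosting on an existing StorageV2 account
# (the account name is illustrative).
az storage blob service-properties update \
    --account-name mystorage1 \
    --static-website \
    --index-document index.html \
    --404-document error.html

# Print the resulting static website endpoint, which can then be
# used as the origin for an Azure CDN endpoint.
az storage account show \
    --name mystorage1 \
    --query primaryEndpoints.web \
    --output tsv
```

Uploading the HTML/JS/jQuery files to the `$web` container then makes the site publicly reachable at that endpoint.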

  glam 1 week, 3 days ago


Box 1: CDN
Box 2: Azure Functions
upvoted 2 times

  CeliaZhou 4 days, 11 hours ago


You can build and deploy a static website to Azure Storage: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-static-website-host
Then use CDN and Functions to guarantee latency across regions and send SMS.
upvoted 1 times


  FK2974 4 days, 5 hours ago


correct is below!!
1. CDN ==> used for authentication by using Twitter and Facebook.
2. Azure Function
upvoted 1 times


Question #12 Topic 4

You need to design a highly available Azure SQL database that meets the following requirements:
✑ Failover between replicas of the database must occur without any data loss.
✑ The database must remain available in the event of a zone outage.
✑ Costs must be minimized.
Which deployment option should you use?

A. Azure SQL Database Standard

B. Azure SQL Database Business Critical

C. Azure SQL Database Managed Instance Business Critical

D. Azure SQL Database Basic

Correct Answer: A
Standard geo-replication is available with Standard and Premium databases in the current Azure Management Portal and standard APIs.
Incorrect:
Not B: Business Critical service tier is designed for applications that require low-latency responses from the underlying SSD storage (1-2 ms in
average), fast recovery if the underlying infrastructure fails, or need to off-load reports, analytics, and read-only queries to the free of charge
readable secondary replica of the primary database.
Note: Azure SQL Database and Azure SQL Managed Instance are both based on SQL Server database engine architecture that is adjusted for
the cloud environment in order to ensure 99.99% availability even in the cases of infrastructure failures. There are three architectural models
that are used:
✑ General Purpose/Standard

Business Critical/Premium -

✑ Hyperscale
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/database/service-tier-business-critical

  ramjisriram 1 month, 2 weeks ago


Seems correct.
upvoted 1 times

  Jinder 2 weeks, 3 days ago


Given answer is correct. Refer below, copied for quick reference from the link (https://docs.microsoft.com/en-us/azure/azure-sql/database/high-availability-sla)

There are two high availability architectural models:

Standard availability model that is based on a separation of compute and storage. It relies on high availability and reliability of the remote
storage tier. This architecture targets budget-oriented business applications that can tolerate some performance degradation during
maintenance activities.
Premium availability model that is based on a cluster of database engine processes. It relies on the fact that there is always a quorum of
available database engine nodes. This architecture targets mission critical applications with high IO performance, high transaction rate and
guarantees minimal performance impact to your workload during maintenance activities.
upvoted 1 times

  shaiju 1 month, 1 week ago


Must be business critical as it supports zone redundancy and dynamic scaling
upvoted 7 times

  danwr81 1 month, 1 week ago


My understanding is that with standard, and no other info provided that the data is stored with LRS, no data loss but no zone redundancy. To get
inherent availability for zone outages you would need to use Business Critical which supports AZs. Geo-replication is more of a DR solution for
regional outages?
upvoted 2 times

  itbrpl 1 month, 1 week ago


But that would not be cost-effective, as for SQL Standard you also need the SQL server. Wouldn't it be better to select a managed instance? That sounds more Microsoft-style.
upvoted 1 times

  kopper2019 1 month, 1 week ago



Answer is B, I just tested.

Also check this table, which led me to the answer:
https://docs.microsoft.com/en-us/azure/azure-sql/database/service-tiers-general-purpose-business-critical#service-tier-comparison

Fast failover only applies to Business Critical, and the "Would you like to make this database zone redundant?" option is only available for Azure SQL Database, not for managed instances.
upvoted 9 times

  shapgi 1 month, 1 week ago


Option "B" will work, but to reduce the cost answer "A" is enough.
upvoted 2 times

  arseyam 1 month ago


The key here is "Failover between replicas of the database must occur without any data loss." which cannot be achieved by geo-replication
feature.
upvoted 2 times

  arseyam 1 month ago


Geo-replication is Asynchronous replication which means the replica is lagging behind the primary database for a small period of time
while a zone redundant database has 3 synchronous copies of the database distributed over three zones.
This means in the state of zone outage, only zone redundant database guarantees no data loss.
upvoted 2 times

  jhoomtv 1 month, 1 week ago


Standard DTU model doesn't support zone redundancy.
Premium DTU model supports zone redundancy - but that's not one of the options, so the answer is B, "Business Critical - vCore model".
upvoted 6 times

  Lexa 1 month ago


Using geo-redundancy the database will remain available in the event of a zone outage, so the conditions are satisfied.
upvoted 1 times

  ZW9 1 month, 1 week ago


Answer A is correct. General purpose support zone redundancy (preview)
https://docs.microsoft.com/en-us/azure/azure-sql/database/high-availability-sla#general-purpose-service-tier-zone-redundant-availability-preview
upvoted 1 times

  tmfahim 1 month ago


Can a preview feature be accepted as an answer?
upvoted 1 times

  nutouch 2 weeks, 1 day ago


we are not being tested with preview features.
upvoted 2 times

  DragonBlake 1 month, 1 week ago


The General Purpose service tier is a default service tier in Azure SQL Database and Azure SQL Managed Instance that is designed for most of
generic workloads. If you need a fully managed database engine with 99.99% SLA with storage latency between 5 and 10 ms that match SQL Server
on an Azure virtual machine in most of the cases, the General Purpose tier is the option for you
https://docs.microsoft.com/en-us/azure/azure-sql/database/service-tier-general-purpose
A is correct
upvoted 1 times

  Ziegler 3 weeks, 3 days ago


B is the correct answer based on this
Two service tiers are used by Azure SQL Database and Azure SQL Managed Instance, each with a different architectural model. These service tiers
are:

General purpose, which is designed for budget-oriented workloads.


Business critical, which is designed for low-latency workloads with high resiliency to failures and fast failovers.
Azure SQL Database has an additional service tier:

Hyperscale, which is designed for most business workloads, providing highly scalable storage, read scale-out, and fast database restore capabilities.
https://docs.microsoft.com/en-us/azure/azure-sql/database/service-tiers-general-purpose-business-critical#service-tier-comparison
upvoted 1 times

  Ivjo 3 weeks, 1 day ago


I would go with Azure SQL Database Business Critical ,as I haven't found any proof, that Standard SQL database supports zone redundancy
(General purpose is not equal to SQL standard database)
upvoted 3 times

  WillCloud 1 week, 5 days ago


B

According to https://docs.microsoft.com/en-us/azure/azure-sql/database/high-availability-sla:
1. Azure SQL Database Standard: General Purpose service tier zone redundant availability is under Preview status.


2. Azure SQL Database Business Critical: Premium and Business Critical service tier zone redundant availability.
3. Azure SQL Database Managed Instance Business Critical: This feature(Zone Redundant) is not available in SQL Managed Instance.
upvoted 2 times

  i_am_gopinath 1 week, 5 days ago


It seems Answer is B
https://azure.microsoft.com/en-us/blog/understanding-and-leveraging-azure-sql-database-sla/
upvoted 1 times

  milind8451 1 week, 4 days ago


There are 2 types of licensing for Azure SQL: DTU-based purchase (Basic, Standard, Premium) and vCore-based purchase (General Purpose, Hyperscale, Business Critical). If you go to an Azure DB in the portal and select the Configure blade, you can see these tiers, and there at the bottom you can check which of them supports zone redundancy. I don't find zone redundancy in the Basic and Standard versions, but it is available for General Purpose and Business Critical, so B) Business Critical is the right answer. Check yourself for any doubts.
upvoted 1 times

  glam 1 week, 3 days ago


B. Azure SQL Database Business Critical
upvoted 2 times

  CeliaZhou 4 days, 11 hours ago


From this article, it seems B is the correct answer.
https://docs.microsoft.com/en-us/azure/azure-sql/database/high-availability-sla
upvoted 1 times

  betamode 4 days, 9 hours ago


A is not the correct option because leveraging Geo-replication may lead to data loss as Geo-replication is asynchronous. As per the question,
requirement is to have zero data loss even after zone outage. Zero data loss is possible with Synchronous replication which is provided with option
B. Option B provides both Zone redundancy with zero data loss whereas option A provides zone redundancy (along with Geo redundancy) with
potential data loss. If there was no requirement of zero data loss then option A could have been the right choice.
upvoted 1 times
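A minimal sketch of deploying the Business Critical, zone-redundant database the comments converge on; the server, database, and resource-group names are assumptions:

```shell
# Business Critical vCore database with zone redundancy enabled:
# three synchronous replicas spread across availability zones, so
# failover loses no data (names are illustrative).
az sql db create \
    --resource-group rg-sql \
    --server sql-server-demo \
    --name salesdb \
    --edition BusinessCritical \
    --family Gen5 \
    --capacity 4 \
    --zone-redundant true
```

The smallest Business Critical compute size that fits keeps the "minimize costs" requirement in check while still meeting the zero-data-loss and zone-outage requirements.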

  FK2974 4 days, 5 hours ago


C is correct in my opinion, due to "failover between replicas of the database must occur without any data loss"!!
upvoted 1 times


Question #13 Topic 4

DRAG DROP -
Your company identifies the following business continuity and disaster recovery objectives for virtual machines that host sales, finance, and
reporting applications in the company's on-premises data center:
✑ The sales application must be able to fail over to a second on-premises data center.
✑ The finance application requires that data be retained for seven years. In the event of a disaster, the application must be able to run from
Azure. The recovery time objective (RTO) is 10 minutes.
✑ The reporting application must be able to recover point-in-time data at a daily granularity. The RTO is eight hours.
You need to recommend which Azure services meet the business continuity and disaster recovery objectives. The solution must minimize costs.
What should you recommend for each application? To answer, drag the appropriate services to the correct applications. Each service may be used
once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:

Correct Answer:

  andyR 2 months ago


Site Recovery
Site Recovery and Backup
Azure Backup only
upvoted 80 times

  airairo 1 month, 3 weeks ago


came in the exam last month.
upvoted 3 times

  StressiOlli 2 months ago


This is very correct.
upvoted 6 times

  Abbas 2 months ago


SALES - Azure Site Recovery Only
FINANCE - Azure Site Recovery and Azure Backup
REPORTING - Azure Backup Only
upvoted 17 times

  kiriexam 1 month, 3 weeks ago


Yes, correct.
upvoted 4 times

  Nian 1 month, 4 weeks ago


Yes - correct
upvoted 2 times

  tmurfet 1 month, 3 weeks ago


What is correct?
upvoted 1 times

  kiriexam 1 month, 3 weeks ago


RTO: ASR & Azure Backup < Azure Site Recovery < Azure Backup
upvoted 1 times

  soujora2 1 month, 2 weeks ago


Sales: Azure Site Recovery only
Finance: Azure Site Recovery and Azure Backup
Reporting: Azure Backup only
upvoted 5 times

  wgre 1 month, 2 weeks ago


ASR can replicate between Datacenters
upvoted 1 times

  ArifUK 1 month ago


So, can ASR fail over to an 'on-prem' data center?
upvoted 1 times

  SAMBIT 2 weeks, 4 days ago


oh yes
upvoted 1 times

  joshyz73 4 weeks, 1 day ago


Yes, but as of 12/31/2020, it's no longer supported for VMware. It appears to still be supported for Hyper-V.
https://docs.microsoft.com/en-us/azure/site-recovery/vmware-physical-secondary-support-matrix
and
https://docs.microsoft.com/en-us/azure/site-recovery/hyper-v-vmm-secondary-support-matrix
upvoted 1 times

  Prateek120896 2 weeks, 5 days ago


Answer mentioned is wrong
upvoted 2 times

  i_am_gopinath 1 week, 5 days ago


Do you mean the answer given by ExamTopics is wrong, or one posted in the discussion group here?
As per my study, the correct answer should be:
ASR
ASR and Backup
Azure Backup
upvoted 3 times

  EgorAivazov 1 week, 2 days ago


Moderators, the given answers are wrong. Please fix it.
Answers given by users andyR and Abbas are correct.
upvoted 1 times

  glam 1 week, 2 days ago


SALES - Azure Site Recovery Only
FINANCE - Azure Site Recovery and Azure Backup
REPORTING - Azure Backup Only
upvoted 4 times
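A minimal sketch tying the mapping above together: both Azure Backup and Azure Site Recovery are configured against a Recovery Services vault. The resource group, vault name, and location below are assumptions:

```shell
# Create the Recovery Services vault that both Azure Backup and
# Azure Site Recovery operate out of (names are illustrative).
az backup vault create \
    --resource-group rg-bcdr \
    --name vault-bcdr \
    --location eastus
```

Per-application protection (ASR replication for the sales and finance VMs, Backup policies for the finance and reporting data) is then configured against this vault.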


Question #14 Topic 4

You have an Azure Storage v2 account named storage1.


You plan to archive data to storage1.
You need to ensure that the archived data cannot be deleted for five years. The solution must prevent administrators from deleting the data.
What should you do?

A. You create an Azure Blob storage container, and you configure a legal hold access policy.

B. You create a file share and snapshots.

C. You create a file share, and you configure an access policy.

D. You create an Azure Blob storage container, and you configure a time-based retention policy and lock the policy.

Correct Answer: D
Time-based retention policy support: Users can set policies to store data for a specified interval. When a time-based retention policy is set,
blobs can be created and read, but not modified or deleted. After the retention period has expired, blobs can be deleted but not overwritten.
Note:
Immutable storage for Azure Blob storage enables users to store business-critical data objects in a WORM (Write Once, Read Many) state. This
state makes the data non-erasable and non-modifiable for a user-specified interval. For the duration of the retention interval, blobs can be
created and read, but cannot be modified or deleted. Immutable storage is available for general-purpose v2 and Blob storage accounts in all
Azure regions.
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-immutable-storage
Design Business Continuity
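The locked time-based retention policy from answer D can be sketched with the Azure CLI. The storage account name comes from the question; the container name and the way the etag is captured are assumptions:

```shell
# Container that will hold the archived data (container name is
# illustrative).
az storage container create \
    --account-name storage1 \
    --name archive

# Five years ≈ 1825 days of time-based retention; capture the
# policy etag returned on creation.
etag=$(az storage container immutability-policy create \
    --account-name storage1 \
    --container-name archive \
    --period 1825 \
    --query etag --output tsv)

# Lock the policy: once locked, nobody - including administrators -
# can shorten or delete it, only extend it.
az storage container immutability-policy lock \
    --account-name storage1 \
    --container-name archive \
    --if-match "$etag"
```

Until the policy is locked it is only "unlocked" (testable and removable), which is why locking is the step that actually prevents administrator deletion.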

  KumarPV 1 month, 3 weeks ago


Agreed. D is right answer
upvoted 13 times

  jhoomtv 1 month, 1 week ago


This contradicts Question set - Topic 4 - Question # 4
upvoted 1 times

  joshyz73 1 month ago


No it doesn't. That question says "you create a file share...". This one indicates a blob storage container.
upvoted 1 times

  fudu101 1 month, 1 week ago


https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-immutability-policies-manage?tabs=azure-portal
upvoted 1 times

  milind8451 1 week, 4 days ago


Right ans
upvoted 1 times

  glam 1 week, 2 days ago


D. You create an Azure Blob storage container, and you configure a time-based retention policy and lock the policy.
upvoted 1 times

Topic 5 - Question Set 5


Question #1 Topic 5

You deploy two instances of an Azure web app. One instance is in the East US Azure region and the other instance is in the West US Azure region.
The web app uses Azure Blob storage to deliver large files to end users.
You need to recommend a solution for delivering the files to the users. The solution must meet the following requirements:
✑ Ensure that the users receive files from the same region as the web app that they access.
✑ Ensure that the files only need to be uploaded once.
✑ Minimize costs.
What should you include in the recommendation?

A. Distributed File System (DFS)

B. read-access geo-redundant storage (RA-GRS)

C. Azure File Sync

D. geo-redundant storage (GRS)

Correct Answer: B

  speedminer 5 months ago


https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy
upvoted 1 times

  umby 2 weeks, 2 days ago


The given answer is correct. Thanks Speedminer for sharing the relevant document: "With GRS or GZRS, the data in the secondary region isn't
available for read or write access unless there is a failover to the secondary region. For read access to the secondary region, configure your
storage account to use read-access geo-redundant storage (RA-GRS) or read-access geo-zone-redundant storage (RA-GZRS)."
upvoted 2 times

  levianthan 4 months, 2 weeks ago


With GRS you don't get to read the other region unless you initiate failover. So RA-GRS.
upvoted 11 times

  cowodah814 3 months, 3 weeks ago


It does not meet the question's requirement:
Ensure that the users receive files from the same region as the web app that they access.
So C is the correct answer
upvoted 3 times

  cloudGala 3 months, 3 weeks ago


Question asking "Ensure that the users receive files from the same region as the web app that they access."
Correct answer B. read-access geo-redundant storage (RA-GRS)
upvoted 16 times

  Jinder 2 weeks, 3 days ago


BOTH GRS and RA_GRS are available only when failover occurs, otherwise, it's always the primary copy that is accessible. So both GRS and
RA_GRS do not hold true as answers.
Then DFS is not an Azure option.
The only left option is then Azure File Sync, but that is also requiring an Azure File Server.

Maybe incorrect options are given here mistakenly, similar cases have been seen while studying questions in this website, not many, just rarely.
Admin can clear the air on it.
upvoted 2 times

  MadEgg 2 weeks ago


That's not right Jinder. With your explanation there is no difference between GRS and RA_GRS.
See the link shared by speedminer:
https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy
"...When you enable read access to the secondary region, your data is available to be read at all times, including in a situation where the
primary region becomes unavailable. For read access to the secondary region, enable read-access geo-redundant storage (RA-GRS) or read-
access geo-zone-redundant storage (RA-GZRS)."
upvoted 2 times
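A minimal sketch of the RA-GRS pattern described above, with illustrative resource names: files are uploaded once via the primary endpoint, and the secondary region serves read-only traffic to the other web app:

```shell
# RA-GRS storage account: data is geo-replicated, and the secondary
# region is readable at all times (names are illustrative).
az storage account create \
    --resource-group rg-web \
    --name mystorage1 \
    --location eastus \
    --sku Standard_RAGRS \
    --kind StorageV2

# The read-only secondary blob endpoint (typically
# <account>-secondary.blob.core.windows.net) for the West US app:
az storage account show \
    --resource-group rg-web \
    --name mystorage1 \
    --query secondaryEndpoints.blob \
    --output tsv
```

Each web app is then pointed at the blob endpoint in its own region, so a file uploaded once to the primary is served locally on both sides.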

  SHABS78 3 months, 1 week ago


It seems the answer should be GRS, as there is no condition mentioned for redundancy, nor a requirement to be able to read information from another region! Please advise, guys?
upvoted 1 times

  Marshal_ 3 months ago


Since each app needs to read from its own region and can only be uploaded once -- GRS to spread the info and RA to let the apps read it
without failover.
upvoted 2 times

  certmonster 3 months, 1 week ago


go with B.
upvoted 4 times

  Mohan007 3 months, 1 week ago


Answer is B.
Refer - https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy
With GRS or GZRS, the data in the secondary region isn't available for read or write access unless there is a failover to the secondary region. For
read access to the secondary region, configure your storage account to use read-access geo-redundant storage (RA-GRS) or read-access geo-
zone-redundant storage (RA-GZRS).
upvoted 9 times

  uzairahm007 1 month, 4 weeks ago


There is no requirement to read from secondary region at all in the question
upvoted 1 times

  Rayrichi 1 month, 2 weeks ago


It says "Ensure that the users receive files from the same region as the web app that they access."
upvoted 4 times

  kopper2019 1 month, 1 week ago


yes B is correct
upvoted 2 times

  rizabeer 2 weeks, 5 days ago


I'm not sure if this is correct. My understanding is that the secondary region's data is only available for read/write after failover in the case of GRS, while with RA-GRS it is available for read access, but no writes unless failover occurs. From the question it does not look like the condition applies only during failover; we have to provide fully accessible storage in either region. Did I miss anything? Or are we assuming that one of the regions does not need to write to the files? But then I don't see any other options satisfying the ask either. Anyone able to clarify? I'd appreciate it.
upvoted 3 times

  MadEgg 2 weeks ago


In the question there are the requirements:
- upload only once (to which region is not important -> write access to the primary region)
- download from the same region as the web app that they access (the region is important -> read access from both primary and secondary
region)

=> RA_GRS
upvoted 2 times

  AIster77 2 weeks ago


I reckon that the keywords in this question are: large files, access files from same region as web app, minimize cost.

Based on this I would go with C - Azure Files Sync

In order to enable/create large file shares on Storage accounts (https://docs.microsoft.com/en-us/azure/storage/files/storage-files-how-to-create-large-file-share), the restrictions are to use LRS or ZRS on a large file share-enabled StorageV2 (general purpose v2) account. So this rules out GRS or RA-GRS in the choices. Also ticks off access files from the same region as web app.
Only Standard account supports this, not Premium (hence minimizing cost as well). When you enable this in the storage account setting, it further notes this - "Large file share storage accounts do not have the ability to convert to geo-redundant storage offerings and upgrade is permanent.".
upvoted 2 times

  i_am_gopinath 1 week, 5 days ago


wrong RA-GRS is the correct
upvoted 1 times

  glam 1 week, 2 days ago


B. read-access geo-redundant storage (RA-GRS)
upvoted 2 times

  Granwizzard 2 days, 2 hours ago


According to this article, it is possible to read and write to primary or secondary regions with RA-GRS or RA-GZRS using a Storage Client Library: https://docs.microsoft.com/en-us/azure/storage/common/geo-redundant-design.
So according to the docs, it is B.
Any thoughts?
upvoted 1 times


Question #2 Topic 5

You are developing a web application that provides streaming video to users. You configure the application to use continuous integration and
deployment.
The app must be highly available and provide a continuous streaming experience for users.
You need to recommend a solution that allows the application to store data in a geographical location that is closest to the user.
What should you recommend?

A. Azure Content Delivery Network (CDN)

B. Azure Redis Cache

C. Azure App Service Web Apps

D. Azure App Service Isolated

Correct Answer: A
Azure Content Delivery Network (CDN) is a global CDN solution for delivering high-bandwidth content. It can be hosted in Azure or any other
location. With Azure
CDN, you can cache static objects loaded from Azure Blob storage, a web application, or any publicly accessible web server, by using the
closest point of presence (POP) server. Azure CDN can also accelerate dynamic content, which cannot be cached, by leveraging various
network and routing optimizations.
Reference:
https://docs.microsoft.com/en-in/azure/cdn/
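The recommended CDN setup can be sketched with the Azure CLI; the profile, endpoint, resource-group, and origin names below are assumptions:

```shell
# CDN profile and an endpoint in front of the blob storage origin,
# so content is cached at the POP closest to each user
# (names are illustrative).
az cdn profile create \
    --resource-group rg-video \
    --name cdn-video \
    --sku Standard_Microsoft

az cdn endpoint create \
    --resource-group rg-video \
    --profile-name cdn-video \
    --name video-endpoint \
    --origin mystorage1.blob.core.windows.net
```

Users then request content through the endpoint hostname, and the CDN serves it from the nearest point of presence.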

  SHABS78 3 months, 1 week ago


Seems to be a correct answer.
upvoted 2 times

  coolwhiz 3 months, 1 week ago


Correct. CDN for Geo availability
https://docs.microsoft.com/en-us/azure/cdn/cdn-overview
upvoted 5 times

  glam 1 week, 2 days ago


A. Azure Content Delivery Network (CDN)
upvoted 1 times


Question #3 Topic 5

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that
might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You need to deploy resources to host a stateless web app in an Azure subscription. The solution must meet the following requirements:
✑ Provide access to the full .NET framework.
✑ Provide redundancy if an Azure region fails.
✑ Grant administrators access to the operating system to install custom application dependencies.
Solution: You deploy a virtual machine scale set that uses autoscaling.
Does this meet the goal?

A. Yes

B. No

Correct Answer: B
Instead, you should deploy two Azure virtual machines to two Azure regions, and you create a Traffic Manager profile.

  fesexit266 4 months, 2 weeks ago


VMSS only supports 1 region
upvoted 11 times

  SHABS78 3 months, 1 week ago


I agree
upvoted 1 times

  Ram_s 1 month, 4 weeks ago


They are talking about a stateless web app, so should we think of some containerization? I guess the answer is right, but a VM might not be the best way to go.
upvoted 2 times

  andyR 1 month, 4 weeks ago


you can't have OS level access on scale sets to install dependencies
upvoted 1 times

  nesith 1 month, 2 weeks ago


Yes you can, requirement for scale set are Identical VMs, the clincher for this question is scale set does not give you region outage support
upvoted 2 times

  glam 1 week, 2 days ago


B. No ........
upvoted 1 times


Question #4 Topic 5

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that
might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You need to deploy resources to host a stateless web app in an Azure subscription. The solution must meet the following requirements:
✑ Provide access to the full .NET framework.
✑ Provide redundancy if an Azure region fails.
✑ Grant administrators access to the operating system to install custom application dependencies.
Solution: You deploy two Azure virtual machines to two Azure regions, and you deploy an Azure Application Gateway.
Does this meet the goal?

A. Yes

B. No

Correct Answer: B
You need to deploy two Azure virtual machines to two Azure regions, but also create a Traffic Manager profile.

  Tombarc 5 months, 1 week ago


No is the right answer, App Gateway will balance the traffic between VMs deployed in the same region.
upvoted 10 times

  SHABS78 3 months, 1 week ago


I agree
upvoted 2 times

  fesexit266 4 months, 2 weeks ago


B is correct; Application Gateways are deployed in 1 region only
upvoted 3 times

  pentum7 3 months, 1 week ago


Azure Traffic Manager is a DNS-based traffic load balancer that enables you to distribute traffic optimally to services across global Azure regions,
while providing high availability and responsiveness.
upvoted 4 times

  TestUser911 2 months, 3 weeks ago


Application Gateway is a regional load-balancing service

https://docs.microsoft.com/en-us/azure/architecture/guide/technology-choices/load-balancing-overview
upvoted 4 times

  bipin24x7 3 weeks, 5 days ago


It does not support two different regions.
upvoted 1 times

  glam 1 week, 2 days ago


B. No .......
upvoted 1 times


Question #5 Topic 5

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that
might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You need to deploy resources to host a stateless web app in an Azure subscription. The solution must meet the following requirements:
✑ Provide access to the full .NET framework.
✑ Provide redundancy if an Azure region fails.
✑ Grant administrators access to the operating system to install custom application dependencies.
Solution: You deploy two Azure virtual machines to two Azure regions, and create a Traffic Manager profile.
Does this meet the goal?

A. Yes

B. No

Correct Answer: A

  pentum7 3 months, 1 week ago


Correct
Azure Traffic Manager is a DNS-based traffic load balancer that enables you to distribute traffic optimally to services across global Azure regions,
while providing high availability and responsiveness.

https://docs.microsoft.com/en-us/azure/traffic-manager/traffic-manager-overview
upvoted 12 times
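The Traffic Manager setup confirmed above can be sketched with the Azure CLI; the resource names, DNS label, and the public-IP resource ID variable are assumptions:

```shell
# DNS-based profile that routes each user to the closest healthy
# region (names and DNS label are illustrative).
az network traffic-manager profile create \
    --resource-group rg-web \
    --name tm-webapp \
    --routing-method Performance \
    --unique-dns-name webapp-demo-12345

# One endpoint per regional VM; repeat for the second region,
# passing each VM's public IP resource ID.
az network traffic-manager endpoint create \
    --resource-group rg-web \
    --profile-name tm-webapp \
    --name eastus-vm \
    --type azureEndpoints \
    --target-resource-id "$EASTUS_VM_PUBLIC_IP_ID"
```

If one region fails its health probes, Traffic Manager stops returning it in DNS answers, which is what provides the cross-region redundancy the scenario asks for.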


Question #6 Topic 5

HOTSPOT -
You plan to deploy a network-intensive application to several Azure virtual machines.
You need to recommend a solution that meets the following requirements:
✑ Minimizes the use of the virtual machine processors to transfer data
✑ Minimizes network latency
Which virtual machine size and feature should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:

Reference:
https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sizes-hpc#h-series

  velasko 2 months ago


Correct Answer:
- Compute optimized Standard_F8s
- Single root I/O virtualization (SR-IOV)
upvoted 14 times

  EgorAivazov 1 week, 2 days ago


Agree with velasko.

Correct Answer:
- Compute optimized Standard_F8s
- Single root I/O virtualization (SR-IOV)

Based on the Microsoft's documentation below:


https://docs.microsoft.com/en-us/azure/virtual-network/create-vm-accelerated-networking-cli

Explanation:
Accelerated networking enables single root I/O virtualization (SR-IOV) to a VM, greatly improving its networking performance. This high-performance path bypasses the host from the datapath, *reducing latency*, jitter, and CPU utilization, for use with the most demanding network workloads on supported VM types.

Supported VM instances
Accelerated Networking is supported on most general purpose and compute-optimized instance sizes with 2 or more vCPUs. These supported
series are: D/DSv2 and F/Fs

On instances that support hyperthreading, Accelerated Networking is supported on VM instances with 4 or more vCPUs. Supported series are:
D/Dsv3, D/Dsv4, Dd/Ddv4, Da/Dasv4, E/Esv3, E/Esv4, Ed/Edsv4, Ea/Easv4, Fsv2, Lsv2, Ms/Mms and Ms/Mmsv2.
upvoted 1 times

  EgorAivazov 2 days, 18 hours ago


Please disregard my answer. Accelerated networking (SR-IOV) won't meet the requirement. Correct answer is:
1. High Performance Compute Standard H16r
2. RDMA

Reference:

Most of the HPC VM sizes (HBv2, HB, HC, *H16r*, H16mr, A8 and A9) feature a network interface for remote direct memory access (RDMA)
connectivity.
upvoted 1 times

  cpalacios 1 month, 4 weeks ago


So the answer shown is totally wrong? If so, can you explain why?
upvoted 1 times

  Jinder 2 weeks, 3 days ago


The correct answer is F series and SR-IOV.
upvoted 1 times

  uzairahm007 1 month, 4 weeks ago


Can you explain the reasoning please?
upvoted 1 times

  uzairahm007 1 month, 4 weeks ago


refer to https://www.examtopics.com/exams/microsoft/az-301/view/30/
upvoted 3 times

  tmurfet 1 month, 3 weeks ago


"Minimizes the use of the virtual machine processors" means RDMA; RDMA is supported by H-class VMs, and "minimize network latency" + "network-intensive app" also points to RDMA + H16r.
upvoted 6 times

  EgorAivazov 2 days, 18 hours ago


Totally agree!
upvoted 1 times

  19042020 1 month, 2 weeks ago


so what are the final answers
upvoted 1 times

  shapgi 1 month, 1 week ago


Tough question that requires knowing what VM instance size supports SR-IOV vs. RDMA. Given the selection of VM types and features available,
the only combination that would work is:
* High performance compute Standard_H16r
* Remote Direct Memory Access (RDMA)

While it's true that SR-IOV would meet the requirement to minimize network latency, SR-IOV is not supported in any of the available VM sizes.
Therefore, the right combo is what I listed above.

Reference:- https://www.examtopics.com/exams/microsoft/az-301/view/30/
upvoted 6 times

  ArifUK 1 month ago


MS Doc reads (for the H series) "H-series features 56 Gb/sec Mellanox FDR InfiniBand in a non-blocking fat tree configuration for consistent RDMA
performance."

note: InfiniBand (IB) is a computer networking communications standard used in high-performance computing that features very high throughput
and very low latency. It is used for data interconnect both among and within computers.

So, mention of 'InfiniBand (IB) and RDMA suggests the answer above is correct.
upvoted 2 times

  ca 7787 2 weeks, 5 days ago


The Microsoft documentation mentions the below supported series of Virtual Machines for enabling Accelerated Networking.
These supported series are: D/DSv2 , F/Fs, D/Dsv3, E/Esv3, Fsv2, Lsv2, Ms/Mms and Ms/Mmsv2.

2- For Accelerated Networking, we need to enable “Single Root I/O Virtualization (SR-IOV)”
You are welcome. 100% sure about the answers.
upvoted 2 times

  nutouch 2 weeks, 1 day ago


There are two correct answers:
The given answer is correct and confirmed by the below link:
https://docs.microsoft.com/en-us/azure/virtual-machines/h-series?toc=/azure/virtual-machines/linux/toc.json&bc=/azure/virtual-machines/linux/breadcrumb/toc.json
The above link shows that Standard_H16r supports RDMA

However Standard_E16s_v3 and SR-IOV combination also works:


In this link: https://docs.microsoft.com/en-us/azure/virtual-network/create-vm-accelerated-networking-cli
under the section of "supported VM instances" stating that Hyperthread enabled E/Esv3 instances with 4 or more CPUs support Accelerated
Networking. Supported by this link which shows that E16s_v3 supports hyperthreading: https://docs.microsoft.com/en-us/azure/virtual-machines/ev3-esv3-series
upvoted 1 times

  milind8451 1 week, 4 days ago


To minimize the use of the virtual machine processors to transfer network data, you need VMs that support Accelerated Networking, and to enable this you need to enable Single Root I/O Virtualization (SR-IOV) on the VM. Accelerated Networking is supported on most general purpose and compute-optimized instance sizes with two or more virtual CPUs (vCPUs). These supported series are: Dv2/DSv2 and F/Fs.

Correct Ans - Compute optimized Standard_F8s and SR-IOV

https://docs.microsoft.com/en-us/azure/virtual-network/create-vm-accelerated-networking-powershell
upvoted 2 times


Question #7 Topic 5

You need to recommend a solution to deploy containers that run an application. The application has two tiers. Each tier is implemented as a
separate Docker
Linux-based image. The solution must meet the following requirements:
✑ The front-end tier must be accessible by using a public IP address on port 80.
✑ The backend tier must be accessible by using port 8080 from the front-end tier only.
✑ Both containers must be able to access the same Azure file share.
✑ If a container fails, the application must restart automatically.
✑ Costs must be minimized.
What should you recommend using to host the application?

A. Azure Kubernetes Service (AKS)

B. Azure Service Fabric

C. Azure Container instances

Correct Answer: C
Azure Container Instances enables a layered approach to orchestration, providing all of the scheduling and management capabilities required to
run a single container, while allowing orchestrator platforms to manage multi-container tasks on top of it.
Because the underlying infrastructure for container instances is managed by Azure, an orchestrator platform does not need to concern itself
with finding an appropriate host machine on which to run a single container.
Azure Container Instances can schedule both Windows and Linux containers with the same API.
Orchestration of container instances exclusively
Because they start quickly and bill by the second, an environment based exclusively on Azure Container Instances offers the fastest way to get
started and to deal with highly variable workloads.
Reference:
https://docs.microsoft.com/en-us/azure/container-instances/container-instances-overview
https://docs.microsoft.com/en-us/azure/container-instances/container-instances-orchestrator-relationship
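For reference, the requirements in this question can be sketched as a single ACI container-group YAML deployment (the names, image tags, region, and the storage-key placeholder below are illustrative assumptions, not from the question). Containers in one group share a local network, so the backend's port 8080 is reachable from the front end but is not exposed on the public IP, both containers mount the same Azure file share, and the restart policy handles automatic restarts:

```yaml
apiVersion: 2019-12-01
location: westeurope
name: two-tier-app
properties:
  osType: Linux
  restartPolicy: OnFailure        # restart a failed container automatically
  ipAddress:
    type: Public
    ports:
    - protocol: tcp
      port: 80                    # only the front-end port is exposed publicly
  containers:
  - name: frontend
    properties:
      image: myregistry.azurecr.io/frontend:latest
      ports:
      - port: 80
      resources:
        requests:
          cpu: 1.0
          memoryInGb: 1.5
      volumeMounts:
      - name: shared
        mountPath: /mnt/shared    # same Azure file share as the backend
  - name: backend
    properties:
      image: myregistry.azurecr.io/backend:latest
      ports:
      - port: 8080                # reachable from the front end via localhost only
      resources:
        requests:
          cpu: 1.0
          memoryInGb: 1.5
      volumeMounts:
      - name: shared
        mountPath: /mnt/shared
  volumes:
  - name: shared
    azureFile:
      shareName: appshare
      storageAccountName: mystorageaccount
      storageAccountKey: <storage-account-key>
```

A group like this could be deployed with `az container create --resource-group <rg> --file deploy.yaml`; since the whole group is billed per second with no cluster nodes, it also satisfies the cost requirement.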

  MaoLi 4 months, 3 weeks ago


“Azure Container Service will be retired on January 31, 2020, and is no longer recommended for new resources." go with AKS
upvoted 3 times

  StressiOlli 3 months ago


You have no clue.
upvoted 6 times

  Florent44 4 months, 2 weeks ago


Azure Container Instance is not the same thing as Azure Container Service so in this case the answer is ACI with container group :
https://docs.microsoft.com/en-us/azure/container-instances/container-instances-container-groups
upvoted 10 times

  Az_Sasi 4 months, 2 weeks ago


Why not AKS?
https://docs.microsoft.com/en-us/azure/container-instances/container-instances-overview
For scenarios where you need full container orchestration, including service discovery across multiple containers, automatic scaling, and
coordinated application upgrades, we recommend Azure Kubernetes Service (AKS).
upvoted 3 times

  ArifUK 1 month ago


says 'Costs must be minimized'. Besides, the bells and whistles that come with AKS are not a requirement in the given scenario. ACI is just fine.
upvoted 2 times

  Ragdoll 4 months, 2 weeks ago


ACI is cheaper than AKS. In this case, both solutions can be used technically, but cost efficiency has to be considered as well.
upvoted 6 times

  Tos_123 4 months ago


A
Based on Microsoft AKS is the recommended solution for Linux containers
https://docs.microsoft.com/en-us/dotnet/architecture/modernize-with-azure-containers/modernize-existing-apps-to-cloud-optimized/choosing-azure-compute-options-for-container-based-applications
upvoted 1 times


  Remco 4 months ago


I think Azure Container Instances is the better choice here, also based on the article you are referring to. The deciding factor is costs.
If there was some kind of elasticity in the question then I would agree with you. The article states that AKS is recommended for Linux in the case of microservices, and that means having elasticity.
upvoted 2 times

  SHABS78 3 months, 1 week ago


I believe it could be ACI
upvoted 2 times

  manderda 4 months ago


Why is Azure Service Fabric not possible?
upvoted 1 times

  ForYEO 2 months, 2 weeks ago


pricy, same as AKS
upvoted 2 times

  tmurfet 3 months, 1 week ago


Example points the way: https://docs.microsoft.com/en-us/azure/container-instances/container-instances-container-groups
upvoted 3 times

  BoxMan 2 months, 3 weeks ago


Pretty certain it is correct with C. The item about "restart automatically" is what AKS/Orchestrators do and ACI doesn't.

"Azure Container Instances enables a layered approach to orchestration, providing all of the scheduling and management capabilities required to
run a single container, while allowing orchestrator platforms to manage multi-container tasks on top of it." from:

https://docs.microsoft.com/en-us/azure/container-instances/container-instances-orchestrator-relationship
upvoted 1 times

  BoxMan 2 months, 3 weeks ago


A not C, my bad. AKS.
upvoted 1 times

  RandomUser 2 months, 2 weeks ago


"If a container fails, the application must restart automatically." Not sure this requirement makes sense when taken literally, but in that case
we'd reach a scenario that ACI can't handle. (Not sure AKS can but I see more chances over there.)
upvoted 1 times

  arseyam 2 months, 1 week ago


ACI supports restart policy (by default is set to "OnFailure").
https://docs.microsoft.com/en-us/azure/container-instances/container-instances-restart-policy
upvoted 2 times

  RandomUser 2 months, 2 weeks ago


The documentation likes to contradict:

https://docs.microsoft.com/de-de/azure/container-instances/container-instances-restart-policy

"When you create a container group in Azure Container Instances, you can specify one of three restart policy settings."
Always, Never, OnFailure
upvoted 3 times

  arseyam 2 months, 1 week ago


Azure Container Instances is the correct answer.
It has networking controls (public, private) and supports Restart policies set to "OnFailure" by default.
upvoted 5 times

  gcpjay 1 month, 3 weeks ago


The given answer is correct, but the reasoning is inadequate. We can group both containers in an Azure Container Instances group and both can access the same Azure Files share.

https://docs.microsoft.com/en-us/azure/container-instances/container-instances-container-groups
upvoted 2 times

  jhoomtv 1 month, 1 week ago


Given Answer is correct:
https://docs.microsoft.com/en-us/azure/container-instances/container-instances-overview
Persistent storage;
To retrieve and persist state with Azure Container Instances, we offer direct mounting of Azure Files shares backed by Azure Storage.
Secondly has restart policy;
https://docs.microsoft.com/en-us/azure/container-instances/container-instances-restart-policy
upvoted 1 times

  ArifUK 1 month ago


if anyone is interested, here's a comparison between ACI And AKS.


upvoted 1 times

  milind8451 1 week, 4 days ago


https://tutorialsdojo.com/azure-container-instances-aci-vs-azure-kubernetes-service-aks/
upvoted 1 times

  MumbaiIndians 1 month ago


Bro, you could have at least posted the link??
upvoted 2 times

Question #8 Topic 5

You architect a solution that calculates 3D geometry from height-map data.


You have the following requirements:
✑ Perform calculations in Azure.
✑ Each node must communicate data to every other node.
✑ Maximize the number of nodes to calculate multiple scenes as fast as possible.
✑ Require the least amount of effort to implement.
You need to recommend a solution.
Which two actions should you recommend? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Create a render farm that uses Azure Batch.

B. Create a render farm that uses virtual machines (VMs).

C. Enable parallel task execution on compute nodes.

D. Create a render farm that uses virtual machine (VM) scale sets.

E. Enable parallel file systems on Azure.

Correct Answer: AC

  andyR 2 months ago


correct
upvoted 1 times

  idris 2 months ago


Correct. These are all features of Azure Batch
https://docs.microsoft.com/en-us/azure/batch/batch-technical-overview
upvoted 2 times

  MadEgg 2 weeks ago


That's right. Thanks for the link.

At first I thought that in batch the nodes can't communicate with each other. But the page you share states that it is possible. They are called
"tightly coupled workloads".
upvoted 1 times

  buanilk 2 weeks, 3 days ago


both A & C are using rendering farm. I would say Rendering farm using azure batch & then enable parallel task.
upvoted 1 times


Question #9 Topic 5

Your company plans to publish APIs for its services by using Azure API Management.
You discover that service responses include the AspNet-Version header.
You need to recommend a solution to remove AspNet-Version from the response of the published APIs.
What should you include in the recommendation?

A. a new product

B. a modification to the URL scheme

C. a new policy

D. a new revision

Correct Answer: C
Set a new transformation policy to transform an API to strip response headers.
Reference:
https://docs.microsoft.com/en-us/azure/api-management/transform-api
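As a sketch, the transformation policy could be an outbound `set-header` policy like the one below (the exact header name emitted by the backend is an assumption taken from the question text; your service may emit it as X-AspNet-Version):

```xml
<policies>
    <inbound>
        <base />
    </inbound>
    <outbound>
        <base />
        <!-- Strip the AspNet-Version header from every published API response -->
        <set-header name="AspNet-Version" exists-action="delete" />
    </outbound>
</policies>
```

Applying this at the API (or global) scope removes the header without touching the backend service itself.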

  speedminer 5 months ago


Seems accurate, link is a tutorial that describes how do to what is asked in the question.

In this tutorial, you learn how to:

Transform an API to strip response headers


Replace original URLs in the body of the API response with APIM gateway URLs
Protect an API by adding rate limit policy (throttling)
Test the transformations
upvoted 8 times

  GenXCoder 1 month, 4 weeks ago


But it is not a policy. It is a transformation in APIM.
upvoted 1 times

  pentum7 3 months, 1 week ago


Correct
upvoted 1 times

  rizabeer 2 weeks, 4 days ago


@GenXCoder It is a transformation policy, so yes, the answer is correct. Here is the link for your reference: https://docs.microsoft.com/en-us/azure/api-management/transform-api#set-the-transformation-policy
upvoted 1 times


Question #10 Topic 5

You have an Azure subscription that contains a storage account.


An application sometimes writes duplicate files to the storage account.
You have a PowerShell script that identifies and deletes duplicate files in the storage account. Currently, the script is run manually after approval
from the operations manager.
You need to recommend a serverless solution that performs the following actions:
✑ Runs the script once an hour to identify whether duplicate files exist
✑ Sends an email notification to the operations manager requesting approval to delete the duplicate files
✑ Processes an email response from the operations manager specifying whether the deletion was approved
✑ Runs the script if the deletion was approved
What should you include in the recommendation?

A. Azure Logic Apps and Azure Functions

B. Azure Pipelines and Azure Service Fabric

C. Azure Logic Apps and Azure Event Grid

D. Azure Functions and Azure Batch

Correct Answer: A
You can schedule a PowerShell script with Azure Logic Apps.
When you want to run code that performs a specific job in your logic apps, you can create your own function by using Azure Functions. This service helps you create Node.js, C#, and F# functions so you don't have to build a complete app or infrastructure to run code. You can also call logic apps from inside Azure Functions. Azure Functions provides serverless computing in the cloud and is useful for performing tasks such as these.
Reference:
https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-azure-functions

  mchint01 4 months ago


correct
upvoted 4 times

  AhmedAL 3 months ago


Not sure about the answer. It's a PowerShell script that already exists, so I don't see the need for new coding. Wouldn't C complete the solution without new functions?
upvoted 2 times

  magazin80 3 months ago


Could it be because we need to process the manager's response?
upvoted 2 times

  pentum7 2 months, 4 weeks ago


Yes, since the PS script only does one part of the whole workflow "PowerShell script that identifies and deletes duplicate files in the storage
account". We do need to code to process the manager's response still
upvoted 2 times


Question #11 Topic 5

DRAG DROP -
You have an on-premises network that uses an IP address space of 172.16.0.0/16.
You plan to deploy 25 virtual machines to a new Azure subscription.
You identify the following technical requirements:
✑ All Azure virtual machines must be placed on the same subnet named Subnet1.
✑ All the Azure virtual machines must be able to communicate with all on-premises servers.
✑ The servers must be able to communicate between the on-premises network and Azure by using a site-to-site VPN.
You need to recommend a subnet design that meets the technical requirements.
What should you include in the recommendation? To answer, drag the appropriate network addresses to the correct subnets. Each network
address may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:

Correct Answer:

  andyR 2 months ago


correct
upvoted 3 times

  uzairahm007 1 month, 4 weeks ago


Answer is perfectly correct. " /28" is for the gateway subnet (recommended by Microsoft) and VMs are NOT INSTALLED in gateway subnet because
in that case they don't remain VMs, they become VPN Gateway Instances. On-premise subnet range is 172.16.0.0/16 hence this address space
CAN'T be used for VMs so remaining is 192.168.0.0/24 subnet range for VMs and 192.168.1.0/28 for gateway subnet. Hence answer is correct.
upvoted 4 times

  Elecktrus 1 month, 4 weeks ago


Is correct. Explanation (for people who don't know about networking):
1) The range for the new subnet can't overlap the on-premises subnet range. The on-premises network is 172.16.0.0/16, that is from 172.16.0.0 to 172.16.255.255, so the answers 172.16.0.0/16 and 172.16.1.0/28 are not valid (they overlap with the on-premises subnet).
2) The range 192.168.1.0/28 covers 192.168.1.0 to 192.168.1.15, only 16 addresses, and we need 25 IPs, so the only valid answer for Subnet1 is 192.168.0.0/24.
3) The range for the gateway can't overlap with on-premises either, and Microsoft recommends a /27 or /28, so the valid answer for the gateway subnet is 192.168.1.0/28.

Check the example: https://docs.microsoft.com/es-es/azure/vpn-gateway/tutorial-site-to-site-portal


upvoted 12 times
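The subnet arithmetic in this question can be checked with Python's standard ipaddress module. A quick sketch (the subtraction of 5 reflects the addresses Azure reserves in every subnet):

```python
import ipaddress

# Address spaces from the question and the answer choices
on_prem = ipaddress.ip_network("172.16.0.0/16")
subnet1 = ipaddress.ip_network("192.168.0.0/24")
gateway = ipaddress.ip_network("192.168.1.0/28")

# The Azure subnets must not overlap the on-premises space (or each other)
assert not subnet1.overlaps(on_prem)
assert not gateway.overlaps(on_prem)
assert not subnet1.overlaps(gateway)

# 172.16.1.0/28 is ruled out because it sits inside the on-premises range
assert ipaddress.ip_network("172.16.1.0/28").overlaps(on_prem)

# Subnet1 must fit 25 VMs; Azure reserves 5 addresses per subnet
print(subnet1.num_addresses - 5)   # 251 usable addresses in the /24
print(gateway.num_addresses - 5)   # 11 usable in the /28 -- too small for VMs,
                                   # but fine for a gateway subnet
```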

  jonasis 1 week, 5 days ago


Correct answer, but I just want to add that Microsoft and John Savill recommends /27 mask for GW subnet
https://docs.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-about-vpn-gateway-settings
upvoted 2 times


  Hfyr 2 days, 16 hours ago


Correct - and John Savill rules! :)
upvoted 1 times

Question #12 Topic 5

You are designing an Azure solution.


The network traffic for the solution must be securely distributed by providing the following features:
✑ HTTPS protocol
✑ Round robin routing
✑ SSL offloading
You need to recommend a load balancing option.
What should you recommend?

A. Azure Load Balancer

B. Azure Internal Load Balancer (ILB)

C. Azure Traffic Manager

D. Azure Application Gateway

Correct Answer: D
If you are looking for Transport Layer Security (TLS) protocol termination ("SSL offload") or per-HTTP/HTTPS request, application-layer processing, review Application Gateway.
Application Gateway is a layer 7 load balancer, which means it works only with web traffic (HTTP, HTTPS, WebSocket, and HTTP/2). It supports capabilities such as SSL termination, cookie-based session affinity, and round robin for load-balancing traffic. Load Balancer load-balances traffic at layer 4 (TCP or UDP).
Reference:
https://docs.microsoft.com/en-us/azure/application-gateway/application-gateway-faq

  emphasis 3 months, 3 weeks ago


the answer is correct
upvoted 12 times

  IvanDan 1 month, 3 weeks ago


Thanks for emphasizing the correct answer :)
upvoted 5 times

  bingomutant 1 month, 1 week ago


i would like to add emphasis also
upvoted 4 times

  milind8451 1 week, 4 days ago


Right ans. Azure LB doesn't support SSL offloading but App GW does.
upvoted 1 times


Question #13 Topic 5

Your company, named Contoso, Ltd, implements several Azure logic apps that have HTTP triggers. The logic apps provide access to an on-premises web service.
Contoso establishes a partnership with another company named Fabrikam, Inc.
Fabrikam does not have an existing Azure Active Directory (Azure AD) tenant and uses third-party OAuth 2.0 identity management to authenticate
its users.
Developers at Fabrikam plan to use a subset of the logic apps to build applications that will integrate with the on-premises web service of
Contoso.
You need to design a solution to provide the Fabrikam developers with access to the logic apps. The solution must meet the following
requirements:
✑ Requests to the logic apps from the developers must be limited to lower rates than the requests from the users at Contoso.
✑ The developers must be able to rely on their existing OAuth 2.0 provider to gain access to the logic apps.
✑ The solution must NOT require changes to the logic apps.
✑ The solution must NOT use Azure AD guest accounts.
What should you include in the solution?

A. Azure AD business-to-business (B2B)

B. Azure Front Door

C. Azure API Management

D. Azure AD Application Proxy

Correct Answer: C
API Management helps organizations publish APIs to external, partner, and internal developers to unlock the potential of their data and
services.
You can secure API Management using the OAuth 2.0 client credentials flow.
Incorrect Answers:
A: Azure Active Directory B2B uses guest users.
B: Azure Front Door is an Application Delivery Network (ADN) as a service, offering various layer 7 load-balancing capabilities for your
applications.
Azure Front Door supports HTTP, HTTPS and HTTP/2.
Applications can be authorized through OAuth 2.0.
D: Application Proxy is a feature of Azure AD that enables users to access on-premises web applications from a remote client. Application
Proxy includes both the
Application Proxy service which runs in the cloud, and the Application Proxy connector which runs on an on-premises server.
Application Proxy works with:
✑ Web applications that use Integrated Windows Authentication for authentication
✑ Web applications that use form-based or header-based access

Reference:
https://docs.microsoft.com/en-us/azure/api-management/api-management-key-concepts
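A sketch of how the differentiated rate limits could be applied: publish the logic apps through API Management products, and attach a stricter rate-limit policy to the product the Fabrikam developers subscribe to than to the Contoso product (the numbers below are illustrative assumptions, not from the question):

```xml
<!-- Policy attached to the product used by Fabrikam developers -->
<policies>
    <inbound>
        <base />
        <!-- Lower call rate than the policy on the Contoso product -->
        <rate-limit calls="100" renewal-period="60" />
    </inbound>
    <outbound>
        <base />
    </outbound>
</policies>
```

Authentication against the third-party OAuth 2.0 provider is then enforced at the APIM layer, so the logic apps themselves need no changes.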

  andyR 2 months ago


correct
upvoted 7 times

  vladodias 2 days, 18 hours ago


sounds correct to me
upvoted 1 times


Question #14 Topic 5

You need to design a solution that will execute custom C# code in response to an event routed to Azure Event Grid. The solution must meet the
following requirements:
✑ The executed code must be able to access the private IP address of a Microsoft SQL Server instance that runs on an Azure virtual machine.
✑ Costs must be minimized.
What should you include in the solution?

A. Azure Logic Apps in the integrated service environment

B. Azure Functions in the Dedicated plan and the Basic Azure App Service plan

C. Azure Logic Apps in the Consumption plan

D. Azure Functions in the Consumption plan

Correct Answer: D
When you create a function app in Azure, you must choose a hosting plan for your app. There are three basic hosting plans available for Azure
Functions:
Consumption plan, Premium plan, and Dedicated (App Service) plan.
For the Consumption plan, you don't have to pay for idle VMs or reserve capacity in advance.
Connect to private endpoints with Azure Functions
As enterprises continue to adopt serverless (and Platform-as-a-Service, or PaaS) solutions, they often need a way to integrate with existing
resources on a virtual network. These existing resources could be databases, file storage, message queues or event streams, or REST APIs.
Reference:
https://docs.microsoft.com/en-us/azure/azure-functions/functions-scale
https://techcommunity.microsoft.com/t5/azure-functions/connect-to-private-endpoints-with-azure-functions/ba-p/1426615

  Tombarc 5 months, 1 week ago


A is the correct answer, Consumption plan doesn't support vNet integration.

ref: https://docs.microsoft.com/en-us/azure/azure-functions/functions-networking-options#matrix-of-networking-features
upvoted 10 times

  AhmedAL 3 months ago


How do you add C# code to a Logic App? I think it needs to be a function app. You can also choose a web app, but not sure if you need it in this case.
upvoted 1 times

  Lexa 3 weeks, 5 days ago


You can call Azure Functions from Logic Apps. The answers are not the whole solutions, the question is "What should you include in the
solution?".
upvoted 1 times

  picaso123 4 months, 3 weeks ago


Yes, A seems the correct answer as the Basic App Service plan also doesn't support VNet connectivity. It's strange though, because the other 3 options apart from A don't support VNet connectivity, and option A is quite expensive; having an ISE will set you back tens of thousands of dollars a year and the question asks for a minimal-cost solution. Maybe they meant to say 'Standard App Service Plan' instead of Basic, but anyway, as per the options only A seems correct.
upvoted 5 times

  RandomUser 2 months, 2 weeks ago


A - you will need a Az Function on top of the LA to run custom C# code. A Az Fn connector is available for LAs.
B - this is a bummer: the Dedicated plan sure allows VNet integration and Az Fns will do the "custom C#" part but the "Basic" plan does not
allow for VNet integration. Did the question writer have a brain fart here?
C - No due to consumption plan.
D - No due to consumption plan. On top, you'd need an Az Function in the first place for the C# code.

Not sure what to make of this.


upvoted 3 times

  azure_kai 4 months, 2 weeks ago


Think B is the correct answer. If your app is in App Service Environment, then it's already in a VNet and doesn't require use of the VNet Integration
feature to reach resources in the same VNet.
https://docs.microsoft.com/en-us/azure/app-service/web-sites-integrate-with-vnet
upvoted 3 times

  Marang73 4 months ago


Basic Azure App Service plan doesn't support vnet integration so answer must be A. Nothing is mentioned about App Service Environment.
upvoted 4 times

  arseyam 2 months, 1 week ago


You are missing on a "dedicated app service plan" - The answer is B for sure.
https://docs.microsoft.com/en-us/azure/azure-functions/functions-scale#app-service-plan
https://docs.microsoft.com/en-us/azure/app-service/overview-hosting-plans
upvoted 1 times

  arseyam 2 months, 1 week ago


Sorry, I mixed between dedicated and isolated.
upvoted 1 times

  ingoo 4 months, 2 weeks ago


But A says logic apps; we need an Azure Function, so B is correct. An Azure Function hosted on a Basic App Service plan minimizes costs and can access the private IP.
upvoted 6 times

  Mopseri 4 months, 2 weeks ago


Dedicated plan support Vnet integration, so B correct:
https://docs.microsoft.com/en-us/azure/azure-functions/functions-scale#networking-features
upvoted 2 times

  manderda 4 months ago


But the Basic App Service plan does not support Vnet, so it is still wrong.
https://azure.microsoft.com/en-us/pricing/details/app-service/windows/
upvoted 1 times

  arseyam 2 months, 1 week ago


Don't mix between Basic App Service plan and "Dedicated" Basic App Service plan
upvoted 1 times

  mchint01 4 months ago


B is correct. We need Az Functions along with dedicated plan - so private IP is reachable.
upvoted 3 times

  Yamaneko 3 months, 2 weeks ago


https://docs.microsoft.com/en-us/azure/azure-functions/functions-scale#app-service-plan
upvoted 1 times

  mchint01 4 months ago


I will take it back. Seems D is correct, cost wise
upvoted 2 times

  Vippsy 3 months, 2 weeks ago


Answer is B.

To connect to Azure virtual machines using private IP address, we must do Virtual network integration.
Virtual network integration is supported only on Azure Functions Dedicated plan and Azure Logics Apps ISE.
Azure Functions dedicated plan is cheaper than Azure Logic Apps ISE.
https://docs.microsoft.com/en-us/azure/azure-functions/functions-scale#hosting-plans-comparison
upvoted 6 times

  RandomUser 2 months, 2 weeks ago


Yeah, but then option B also has the "Basic" plan which does *not* allow VNet integration...
upvoted 1 times

  uzairahm007 1 month, 4 weeks ago


You would not even have to use an Azure App Service plan, as the Azure Functions Dedicated plan fulfills all requirements, but the better option is D;
check out alekam's comments below.
upvoted 2 times

  trster 3 months, 2 weeks ago


The text quoted in the explanation "As enterprises continue to adopt serverless" comes from this site
https://www.michaelscollier.com/azure-functions-with-private-endpoints/
where they clearly state you need Functions Premium for private endpoints
upvoted 1 times

  testermar 3 months, 2 weeks ago


But guys, you cannot execute custom C# code with a logic app!
upvoted 1 times

  RandomUser 2 months, 2 weeks ago


You can. There is a (don't laugh) Azure Functions connector that allows you to run C# code. Don't shoot the messenger.
upvoted 1 times

  temporal111 3 months, 2 weeks ago


https://www.examtopics.com/exams/microsoft/az-304/custom-view/ 152/197
2/5/2021 AZ-304 Exam – Free Actual Q&As, Page 1 | ExamTopics

Correct

Private site access is available in the Premium, Consumption, and App Service plans when service endpoints are configured.
Service endpoints can be configured on a per-app basis under Platform features > Networking > Configure Access Restrictions > Add Rule. Virtual
networks can now be selected as a rule type.
For more information, see Virtual network service endpoints.

Links:

https://docs.microsoft.com/en-us/azure/azure-functions/functions-networking-options#matrix-of-networking-features

https://docs.microsoft.com/en-us/azure/azure-functions/functions-networking-options#inbound-ip-restrictions
upvoted 4 times

  tmurfet 3 months, 1 week ago


Agreed , D is correct. The following link explains how to create a private link and use consumption plan:
https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-private-site-access
upvoted 4 times

  alekam 3 months ago


On this page the private site access is from a virtual network to the function. "By using private site access, you can require that your function
code is only triggered from a specific virtual network."

But in this case we need access from the Function to a VM in a VNet and there it clearly says: "If a Functions app needs to access Azure
resources within the virtual network, or connected via service endpoints, then virtual network integration is needed."

And if you look at https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-vnet, at the very beginning it says "Create a function app in the Premium plan". Hence it can't be D, I would say that B is correct.
upvoted 4 times

  gcpjay 1 month, 3 weeks ago


Thanks for your comments. It really helps answer the question. The correct answer is B as it meets the requirement of VNet integration
with Azure Functions and also cheaper as compared to other options.
upvoted 2 times

  tmurfet 2 months, 3 weeks ago


Alekam I agree with you -- I take back my response. Cannot be D as no network integration. Cannot be B either as Basic Azure App
Service plan also doesn't support network integration. That only leaves one option that can work: Has to be A but with Logic App calling
Function to execute code. Of course that answer isn't actually provided.
upvoted 5 times

  BoxMan 2 months, 3 weeks ago


Correct, D for me. It is the simplest and cheapest option based on the need for only one operation i.e. not a Logic App.

Here’s a recent comparison article:

https://walkerscott.co/2020/03/azure-logic-apps-vs-azure-functions/
upvoted 4 times

  jhoomtv 1 month, 1 week ago


Access restrictions are for inbound traffic only.
upvoted 1 times

  canon123 2 months, 1 week ago


it's D
https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-private-site-access
upvoted 6 times

  tmurfet 1 month, 3 weeks ago


Not D -- see alekams comments. private site access only provides function inbound access, question requires outbound to SQL VM instance.
None of the answers are correct.
upvoted 2 times

  tmurfet 1 month, 3 weeks ago


IMO here's the answer as it ought to be stated:
B. Azure Functions in the Dedicated plan and the Standard Azure App Service plan.
upvoted 3 times

  jhoomtv 1 month, 1 week ago


A is correct, https://docs.microsoft.com/en-us/azure/logic-apps/connect-virtual-network-vnet-isolated-environment-overview
upvoted 1 times

  buanilk 2 weeks, 4 days ago


private endpoint connections r available in premium and app service plans so B is correct
upvoted 1 times

  milind8451 1 week, 4 days ago


I would have choosen D.


upvoted 1 times

  vladodias 2 days, 17 hours ago


Tricky... I think it is B, but I don't understand why the "Basic Azure App Service plan" was included??? Can someone please clarify???
A - Cannot be Logic Apps because it is C# code
B - Functions in the Dedicated Plan support VNET integration and Event Grid triggers
C - Cannot be Logic Apps because it is C# code
D - Functions don't support VNET integration in the Consumption plan
upvoted 1 times

  Niro 2 days, 14 hours ago


Answer seems to be correct on assumption we choose the cost effective and only need function to server direction connectivity.
https://docs.microsoft.com/en-us/azure/azure-functions/functions-networking-options
upvoted 1 times


Question #15 Topic 5

The developers at your company are building a containerized Python Django app.
You need to recommend a platform to host the app. The solution must meet the following requirements:
✑ Support autoscaling.
✑ Support continuous deployment from an Azure Container Registry.
✑ Provide built-in functionality to authenticate app users by using Azure Active Directory (Azure AD).
Which platform should you include in the recommendation?

A. Azure Container instances

B. an Azure App Service instance that uses containers

C. Azure Kubernetes Service (AKS)

Correct Answer: C
To keep up with application demands in Azure Kubernetes Service (AKS), you may need to adjust the number of nodes that run your workloads.
The cluster autoscaler component can watch for pods in your cluster that can't be scheduled because of resource constraints. When issues are
detected, the number of nodes in a node pool is increased to meet the application demand.
Azure Container Registry is a private registry for hosting container images. It integrates well with orchestrators like Azure Container Service,
including Docker
Swarm, DC/OS, and the new Azure Kubernetes service. Moreover, ACR provides capabilities such as Azure Active Directory-based
authentication, webhook support, and delete operations.
Reference:
https://docs.microsoft.com/en-us/azure/aks/cluster-autoscaler https://medium.com/velotio-perspectives/continuous-deployment-with-azure-
kubernetes-service-azure-container-registry-jenkins-ca337940151b
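The cluster autoscaler's core scale-up decision described above — add nodes when pending pods cannot be scheduled, up to the node-pool maximum — can be modeled as a simple capacity calculation. This is a toy sketch, not the real AKS autoscaler; the pod and node counts are hypothetical:

```python
import math

def nodes_to_add(pending_pods: int, pods_per_node: int,
                 current_nodes: int, max_nodes: int) -> int:
    """Toy model of a cluster-autoscaler scale-up decision:
    add just enough nodes to schedule the pending pods,
    without exceeding the node-pool maximum."""
    if pending_pods <= 0:
        return 0
    needed = math.ceil(pending_pods / pods_per_node)
    return min(needed, max_nodes - current_nodes)

# e.g. 25 unschedulable pods, 10 pods fit per node, pool at 2 of 10 nodes
print(nodes_to_add(25, 10, 2, 10))  # → 3
```

The real autoscaler simulates scheduling against node-pool VM sizes rather than using a flat pods-per-node figure, but the shape of the decision is the same.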

  picaso123 4 months, 3 weeks ago


Shouldn't it be B? as that only supports Application Azure AD authentication without writing any code in the application itself, don't think AKS
supports that and not sure about Azure Container Instances
upvoted 14 times

  Rickii 4 months, 3 weeks ago


it does support. Answer seems correct
upvoted 4 times

  BoxMan 2 months, 3 weeks ago


https://docs.microsoft.com/en-us/azure/aks/managed-aad
upvoted 1 times

  Florent44 4 months, 2 weeks ago


No because it support Azure Active Directory for authentication of users and administrator who manage the AKS not for the end users of the
application.
upvoted 4 times

  levianthan 4 months, 2 weeks ago


Indeed, and WebApps do support User auth through AAD and other IDPs
upvoted 2 times

  ingoo 4 months, 2 weeks ago


B is correct because the managed authentication
upvoted 5 times

  Ragdoll 4 months, 2 weeks ago


B, for sure.
upvoted 3 times

  fesexit266 4 months, 2 weeks ago


https://docs.microsoft.com/en-us/azure/app-service/deploy-ci-cd-custom-container

Should be B
upvoted 2 times

  Tos_123 4 months ago


B
For web applications Azure App is the recommended solution. It supports AAD authentication as well.


https://docs.microsoft.com/en-us/dotnet/architecture/modernize-with-azure-containers/modernize-existing-apps-to-cloud-optimized/choosing-
azure-compute-options-for-container-based-applications
upvoted 1 times

  jasonaw20 3 months, 2 weeks ago


I don't think it provides built-in "user authentication" with AAD? So answer should be C.
upvoted 2 times

  mchint01 4 months ago


B correct
upvoted 2 times

  ufuomaolori 3 months, 2 weeks ago


C seems correct.
https://docs.microsoft.com/en-us/azure/aks/intro-kubernetes
upvoted 6 times

  Sixty 3 months, 1 week ago


B looks correct

Dev blog running Python Django on Web Apps https://devblogs.microsoft.com/premier-developer/running-django-on-azure-web-apps-for-containers-with-docker/

Azure App Service supports AAD authentication https://docs.microsoft.com/en-us/azure/app-service/configure-authentication-provider-aad

Supports CD https://docs.microsoft.com/en-us/azure/app-service/deploy-continuous-deployment

Supports Autoscale https://docs.microsoft.com/en-us/azure/app-service/environment/app-service-environment-auto-scale


upvoted 8 times

  danarts6112 2 months, 3 weeks ago


Seems answer is correct.
Reference link: https://docs.microsoft.com/en-us/azure/aks/intro-kubernetes#identity-and-security-management
upvoted 1 times

  nexnexnex 3 weeks, 1 day ago


link describes access to resources, not to deployed apps
upvoted 1 times

  ForYEO 2 months, 2 weeks ago


AKS for single app? Isn't that overkill?
AppService is lightweight and pay as you go. CI + Container is supported as well as AAD integration
upvoted 2 times

  RandomUser 2 months, 2 weeks ago


Also, AKS can grant access to *container* based on AAD but is this already an app auth? B seems better.
upvoted 2 times

  salehz 1 month, 3 weeks ago


B correct
upvoted 3 times

  milind8451 1 week, 4 days ago


B) Azure App Services looks correct as it can be integrated with AAD for app level authentication while AKS can also be integrated with AAD but for
components of Kubernetes not the app itself.
upvoted 1 times


Question #16 Topic 5

You have an on-premises network to which you deploy a virtual appliance.


You plan to deploy several Azure virtual machines and connect the on-premises network to Azure by using a Site-to-Site connection.
All network traffic that will be directed from the Azure virtual machines to a specific subnet must flow through the virtual appliance.
You need to recommend solutions to manage network traffic.
Which two options should you recommend? Each correct answer presents a complete solution.

A. Configure Azure Traffic Manager.

B. Implement Azure ExpressRoute.

C. Configure a routing table.

D. Implement an Azure virtual network.

Correct Answer: BC
B: ExpressRoute lets you extend your on-premises networks into the Microsoft cloud over a private connection facilitated by a connectivity provider. With
ExpressRoute, you can establish connections to Microsoft cloud services, such as Microsoft Azure, Office 365, and Dynamics 365.
Connectivity can be from an any-to-any (IP VPN) network, a point-to-point Ethernet network, or a virtual cross-connection through a connectivity
provider at a co-location facility. ExpressRoute connections do not go over the public Internet, which allows them to offer
more reliability, faster speeds, lower latencies, and higher security than typical connections over the Internet.
C: Forced tunneling lets you redirect or "force" all Internet-bound traffic back to your on-premises location via a Site-to-Site VPN tunnel for
inspection and auditing.
This is a critical security requirement for most enterprise IT policies. Without forced tunneling, Internet-bound traffic from your VMs in Azure
always traverses from the Azure network infrastructure directly out to the Internet, without the option to inspect or audit the traffic.
Forced tunneling in Azure is configured via virtual network user-defined routes.
Reference:
https://docs.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-forced-tunneling-rm https://docs.microsoft.com/en-
us/azure/expressroute/expressroute-introduction
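A routing table steers traffic by matching the destination against address prefixes, and Azure selects the most specific matching route (longest prefix match). A simplified model of that selection — the prefixes and the appliance IP below are made up for illustration:

```python
import ipaddress

# Hypothetical user-defined route table: (address prefix, next hop)
routes = [
    ("0.0.0.0/0", "Internet"),
    ("10.1.0.0/16", "VirtualNetworkGateway"),      # on-premises ranges
    ("10.1.5.0/24", "VirtualAppliance 10.0.0.4"),  # force this subnet via the NVA
]

def next_hop(destination: str) -> str:
    """Longest-prefix match over the route table,
    mirroring how Azure picks among overlapping routes."""
    dest = ipaddress.ip_address(destination)
    matching = [(ipaddress.ip_network(prefix), hop) for prefix, hop in routes
                if dest in ipaddress.ip_network(prefix)]
    return max(matching, key=lambda m: m[0].prefixlen)[1]

print(next_hop("10.1.5.20"))  # → VirtualAppliance 10.0.0.4
print(next_hop("10.1.9.9"))   # → VirtualNetworkGateway
print(next_hop("8.8.8.8"))    # → Internet
```

In Azure the equivalent is a route table resource with a route whose next hop type is VirtualAppliance, associated with the VM subnet.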

  speedminer 5 months ago


Kind of a weirdly worded question....
Azure Express route fulfills the site to site connection piece....
Directing traffic using User Defined routes in a routing table is another piece...
But you'd need a virtual network too so it's a bit strange...
upvoted 1 times

  Jinder 2 weeks, 3 days ago


C and D. Not express route.
upvoted 2 times

  pentum7 2 months, 4 weeks ago


"You need to recommend solutions to manage network traffic." so VNet is really just the source or destination of the network traffic we need to
manage?
upvoted 1 times

  RandomUser 2 months, 2 weeks ago


And "each answer presents a complete solution"... Weird
upvoted 1 times

  pj11 4 months, 2 weeks ago


ExpressRoute is an option but not a necessity.

Correct Answer - C & D


upvoted 14 times

  beedle 2 months, 1 week ago


D DOES NOT MAKE ANY SENSE. If you follow this guy say bye bye to $$$
upvoted 1 times

  BoxMan 2 months, 3 weeks ago


After eliminating A and B as Traffic Manager is irrelevant and an Express Route isn't needed as they state you have VPN it's C and D.


C and D are also correct as you manage the traffic by directing it through the VPN and to the vNET. Not really super neat but understandable.
upvoted 2 times

  levianthan 4 months, 2 weeks ago


How does D route your traffic through an NVA? Not to mention that D is kind of required for all options.
upvoted 2 times

  ingoo 4 months, 2 weeks ago


it is B or C&D
upvoted 1 times

  bc5468521 3 months, 3 weeks ago


the question asks "Each correct answer presents a complete solution." so B and C is an independent complete solution.
upvoted 13 times

  airairo 2 months, 1 week ago


According to Whizlabs, the answers are: C, D.
upvoted 1 times

  diggity801 2 months, 1 week ago


Keyword: VIRTUAL APPLIANCE. Explain how an express route is a virtual appliance. It is not. They are implying you are using something like Azure Firewall or
maybe even a Meraki vMX100 if you wanted third party. C&D are the answers that make up a complete solution minus the virtual appliance which,
again, is not an Express Route because it is NOT A VIRTUAL APPLIANCE.
upvoted 1 times

  joshyz73 4 weeks ago


The virtual appliance, AKA firewall/router, is on-prem. Which means that it is the device on the business network side that connects to Azure. A
routing table works to achieve the goals, as you specify the gateway for traffic to the Azure VNET(s) as being that virtual
appliance/router/firewall. ExpressRoute should work too, since it advertises routes via BGP, so traffic destined for Azure VNET(s) would go
through that path. You'd just connect the on-prem side of the ExpressRoute circuit to the virtual appliance. Nothing says that ExpressRoute IS a
virtual appliance. The question wants all traffic to flow THROUGH the virtual appliance. Nothing in question is saying the choices ARE virtual
appliances.
upvoted 1 times

  ArifUK 1 month ago


the Virtual appliance is 'on-premises'. not in Azure.
upvoted 1 times

  Edhotp 2 months ago


for me correct answer are : C & D
because from my understanding, Site 2 Site means using VPN connection through public internet, so express route is not the right answer
upvoted 1 times

  salehz 1 month, 3 weeks ago


C and D
upvoted 1 times

  tmurfet 1 month, 3 weeks ago


Take a look at the two diagrams here:
https://docs.microsoft.com/en-us/azure/architecture/reference-architectures/hybrid-networking/expressroute#security-considerations
Scenario is similar to question and leads me to suggest B and C are correct -- remember that each is a complete solution.
upvoted 6 times

  ArifUK 1 month ago


I think B and C are the correct answers.

'You Plan to' means there are no connections exist, yet. So, we first do an Express Route and then UDR to force the tunnel.

mention of vNet is a moot point as you can't deploy VMs without having at least one vNet to begin with.
upvoted 2 times

  jva 3 weeks, 4 days ago


Weird question.

Why would you implement an express route if you already plan to use Site-to-Site connection
upvoted 2 times

  bbartek 2 weeks, 6 days ago


You can't have B and C at the same time, since with ExpressRoute (even if you also have site-to-site), forced tunneling is done by BGP, so no UDRs
are required.
upvoted 4 times

  arseyam 2 weeks, 2 days ago


Thank you, I was waiting for someone to figure out the trick :D
upvoted 1 times

  arseyam 2 weeks, 2 days ago


You cannot specify a virtual network gateway created as type ExpressRoute in a user-defined route because with ExpressRoute, you must use
BGP for custom routes.
https://docs.microsoft.com/en-us/azure/virtual-network/virtual-networks-udr-overview
upvoted 1 times

  rizabeer 2 weeks, 4 days ago


I will answer C and D, the only thing which is throwing off is the statement' "Each correct answer presents a complete solution"? either we ignore
this statement OR ignore "using Site-to-site connection". I guess I will assume that complete solution is a typo and will answer C&D. :)
upvoted 1 times

  aj 2 days, 7 hours ago


B&C Answer is correct, site-to-site connection will be ExpressRoute, sloppy language as usual in these exams.
upvoted 1 times

Question #17 Topic 5

You are developing a sales application that will contain several Azure cloud services and will handle different components of a transaction.
Different cloud services will process customer orders, billing, payment, inventory, and shipping.
You need to recommend a solution to enable the cloud services to asynchronously communicate transaction information by using REST
messages.
What should you include in the recommendation?

A. Azure Service Bus

B. Azure Blob storage

C. Azure Notification Hubs

D. Azure Application Gateway

Correct Answer: A
Service Bus is a transactional message broker and ensures transactional integrity for all internal operations against its message stores. All
transfers of messages inside of Service Bus, such as moving messages to a dead-letter queue or automatic forwarding of messages between
entities, are transactional.
Incorrect Answers:
C: Azure Notification Hubs is a massively scalable mobile push notification engine for quickly sending millions of notifications to iOS, Android,
Windows, or Kindle devices.
Reference:
https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-transactions
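Service Bus REST calls are authorized with a shared access signature (SAS) token passed in the Authorization header. A minimal sketch of generating one — the namespace, key name, and key below are placeholders, not real credentials:

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def servicebus_sas_token(resource_uri: str, key_name: str,
                         key: str, expiry: int) -> str:
    """Build a Service Bus SAS token: sign
    "<url-encoded-uri>\n<expiry>" with HMAC-SHA256 over the shared key."""
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    to_sign = f"{encoded_uri}\n{expiry}".encode()
    signature = base64.b64encode(
        hmac.new(key.encode(), to_sign, hashlib.sha256).digest()
    ).decode()
    return (
        f"SharedAccessSignature sr={encoded_uri}"
        f"&sig={urllib.parse.quote_plus(signature)}"
        f"&se={expiry}&skn={key_name}"
    )

# Placeholder values for illustration only
expiry = int(time.time()) + 3600
token = servicebus_sas_token(
    "https://contoso.servicebus.windows.net/orders",
    "RootManageSharedAccessKey", "fake-key", expiry)
print(token.startswith("SharedAccessSignature sr="))  # → True
```

A service would then POST the message body to https://<namespace>.servicebus.windows.net/<queue-or-topic>/messages with this token in the Authorization header, which is how the cloud services can exchange transaction information asynchronously over REST.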

  speedminer 5 months ago


https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-messaging-overview
Describes how Service bus users REST
upvoted 5 times

  pentum7 3 months, 1 week ago


Correct

" Service Bus offers a reliable and secure platform for asynchronous transfer of data and state." ... "Service Bus supports standard AMQP 1.0 and
HTTP/REST protocols."

https://docs.microsoft.com/en-us/azure/service-bus-messaging/service-bus-messaging-overview
upvoted 4 times


Question #18 Topic 5

You are designing a message application that will run on an on-premises Ubuntu virtual machine. The application will use Azure Storage queues.
You need to recommend a processing solution for the application to interact with the storage queues. The solution must meet the following
requirements:
✑ Create and delete queues daily.
✑ Be scheduled by using a CRON job.
✑ Upload messages every five minutes.
What should developers use to interact with the queues?

A. Azure CLI

B. AzCopy

C. Azure Data Factory

D. .NET Core

Correct Answer: D
Incorrect Answers:
A: The Azure CLI can be installed on Ubuntu and can manage queues, but .NET Core provides a full SDK for building an application that creates, deletes, and writes to queues programmatically.
B: AzCopy is a command-line utility that you can use to copy blobs or files to or from a storage account; it does not work with queues.
Reference:
https://docs.microsoft.com/en-us/azure/storage/queues/storage-tutorial-queues
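The scheduling requirements map directly onto two crontab entries on the Ubuntu VM, each invoking a .NET Core console app (the paths and assembly names below are hypothetical; the apps themselves would use the Azure Storage queues SDK):

```text
# Create/delete (rotate) the queues once a day at midnight
0 0 * * *    /usr/bin/dotnet /opt/service1/RotateQueues.dll
# Upload pending messages every five minutes
*/5 * * * *  /usr/bin/dotnet /opt/service1/UploadMessages.dll
```

This is a config fragment, not runnable on its own; the point is that cron handles the schedule while .NET Core handles the queue interaction.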

  speedminer 5 months ago


https://docs.microsoft.com/en-us/azure/storage/queues/storage-tutorial-queues?tabs=dotnet
Seems to indicate you can run .net on linux and interact with azure storage queues.
upvoted 6 times

  manderda 4 months ago


You can install Azure CLI on Ubuntu:
https://docs.microsoft.com/de-de/cli/azure/install-azure-cli-apt
So A should although be correct?
upvoted 1 times

  Yamaneko 3 months, 2 weeks ago


I would say answer is D which can insert message (upload message) https://docs.microsoft.com/en-us/azure/storage/queues/storage-dotnet-how-
to-use-queues?tabs=dotnet#insert-a-message-into-a-queue

which A can't https://docs.microsoft.com/en-us/cli/azure/storage/queue?view=azure-cli-latest


upvoted 16 times

  jhoomtv 1 month, 1 week ago


It has PUT - Adds a new message to the back of the message queue.
upvoted 1 times

  jhoomtv 1 month, 1 week ago


https://docs.microsoft.com/en-us/cli/azure/storage/message?view=azure-cli-latest
upvoted 1 times

  pentum7 3 months, 1 week ago


Agreed
upvoted 1 times

  Elecktrus 1 month, 4 weeks ago


I think that should be A, because:
- Azure CLI can be installed in Ubuntu: https://docs.microsoft.com/es-es/cli/azure/install-azure-cli-apt
- Azure CLI supports upload messages to storage queues https://docs.microsoft.com/en-us/cli/azure/storage/message?view=azure-cli-latest
- Azure CLI support create/delete queues https://docs.microsoft.com/en-us/cli/azure/storage/queue?view=azure-cli-latest
- And of course, can be scheduled by CRON https://docs.microsoft.com/es-es/azure/batch/quick-create-cli
upvoted 4 times

  ArifUK 1 month ago


All of these are good, however, only .net is a developer friendly 'development' language. probably that's why it'll be D - .net core on Linux.
upvoted 2 times

  PrashantGupta1616 1 month, 1 week ago


We can run .net core on linux and interact with azure storage queues. Complete details are mention in below URL.

For more info : storage-tutorial-queues-dotnet


upvoted 3 times

  jhoomtv 1 month, 1 week ago


I lost between A or D
upvoted 1 times

  Blaaa 2 weeks, 5 days ago


D is correct
upvoted 1 times

  milind8451 1 week, 4 days ago


Right ans.
upvoted 1 times


Question #19 Topic 5

You have a .NET web service named Service1 that has the following requirements:
✑ Must read and write temporary files to the local file system.
✑ Must write to the Application event log.
You need to recommend a solution to host Service1 in Azure. The solution must meet the following requirements:
✑ Minimize maintenance overhead.
✑ Minimize costs.
What should you include in the recommendation?

A. an App Service Environment

B. an Azure web app

C. an Azure virtual machine scale set

D. an Azure function

Correct Answer: C

  andyR 2 months ago


I think the requirement to write to the Application event log ties the solution to a VM
upvoted 5 times

  guybank 2 months ago


An azure web app lets you write to the application log and Local file system. This answer is wrong.
upvoted 2 times

  Elecktrus 1 month, 4 weeks ago


Answer is B - Azure Web App, because:
- Azure Web App can write Application log - https://docs.microsoft.com/es-es/azure/app-service/troubleshoot-diagnostic-logs
- Azure web App can read/write files to local file system - https://github.com/projectkudu/kudu/wiki/Understanding-the-Azure-App-Service-file-
system
- Azure web app is less expansive that a VM scale set (VM scale is thinked for a long number of VM)
upvoted 32 times

  tmurfet 1 month, 2 weeks ago


Answer is VM Scale Set. Take a look at discussion for #47 over here: https://www.examtopics.com/exams/microsoft/az-301/view/29/
upvoted 1 times

  ArifUK 1 month ago


The question has changed from 301 to 304 now. it's now 'Application event log' (as opposed to 'Windows event log in 301). So, answer B (as
explained by Elecktrus above) is correct.
upvoted 3 times

  freemanchen 1 month, 2 weeks ago


no, the quesion is different, one is application log, and the 29 is the windows event log. that's different
upvoted 3 times

  milind8451 1 week, 4 days ago


Answer is B as it fulfills all requirements as mentioned by Elecktrus. Thanks to him.
upvoted 2 times

  mrkhangtran 6 days, 9 hours ago


B. An Azure web app is enough
upvoted 1 times


Question #20 Topic 5

You are designing a microservices architecture that will support a web application.
The solution must meet the following requirements:
✑ Allow independent upgrades to each microservice.
✑ Deploy the solution on-premises and to Azure.
✑ Set policies for performing automatic repairs to the microservices.

✑ Support low-latency and hyper-scale operations.


You need to recommend a technology.

A. Azure Container Instance

B. Azure Virtual Machine Scale Set

C. Azure Service Fabric

D. Azure Logic App

Correct Answer: C
Azure Service Fabric is a distributed systems platform that makes it easy to package, deploy, and manage scalable and reliable microservices
and containers.
You can use Azure Service Fabric to create Service Fabric clusters on any virtual machines or computers running Windows Server.
Reference:
https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-overview

  andyR 2 months ago


correct
upvoted 6 times

  Jinder 2 weeks, 1 day ago


why not ACI?
upvoted 1 times

  jonasis 1 week, 5 days ago


Bc on-premises support on ASF
upvoted 2 times

  Jinder 12 hours, 49 minutes ago


thanks. logical.
upvoted 1 times

  milind8451 1 week, 4 days ago


Service fabric is supported for on-prem scenario so right answer.
upvoted 1 times


Question #21 Topic 5

Your company has the infrastructure shown in the following table.

The on-premises Active Directory domain syncs to Azure Active Directory (Azure AD).
Server1 runs an application named App1 that uses LDAP queries to verify user identities in the on-premises Active Directory domain.
You plan to migrate Server1 to a virtual machine in Subscription1.
A company security policy states that the virtual machines and services deployed to Subscription1 must be prevented from accessing the on-
premises network.
You need to recommend a solution to ensure that App1 continues to function after the migration. The solution must meet the security policy.
What should you include in the recommendation?

A. Azure AD Application Proxy

B. an Azure VPN gateway

C. Azure AD Domain Services (Azure AD DS)

D. the Active Directory Domain Services role on a virtual machine

Correct Answer: D
You can join a Windows Server virtual machine to an Azure Active Directory Domain Services managed domain.
Reference:
https://docs.microsoft.com/en-us/azure/active-directory-domain-services/join-windows-vm

  mmmore 2 months ago


Both C and D seem to be correct.
Implementing AADDS would meet the policy and would allow the application to make LDAP queries.
Running a VM as Domain Controller would also solve the problem. From the description of this question, I can't see any real befit for one or the
other solution. My personal preference would be to run AADDS, as it's a PaaS offering.
upvoted 2 times

  guybank 2 months ago


The server in question is Linux though...
upvoted 1 times

  uzairahm007 1 month, 1 week ago


https://docs.microsoft.com/en-us/azure/active-directory-domain-services/join-ubuntu-linux-vm
upvoted 1 times

  andyR 2 months ago


D - then Server1 can continue to query this VM with the AD services role in Azure using LDAP against replicated AD objects from On prem
upvoted 1 times

  Elecktrus 1 month, 4 weeks ago


Answer is C - AZure Domain Services - ADS is designed for this exactly. It support LDAP querys but don't use local AD, instead use Azure AD. And
the question says that ADConnect exists and both AZure AD and local AD are sync. -
https://docs.microsoft.com/es-es/azure/active-directory-domain-services/overview
upvoted 14 times

  Kristhyan 1 month, 3 weeks ago


Correct answer is C. Azure AD support LDAP query through ADDS. A new domain controller cannot sync with on premises AD because of the
security policy.
upvoted 11 times

  wgre 1 month, 1 week ago


C
https://docs.microsoft.com/en-us/azure/active-directory/fundamentals/auth-ldap
upvoted 1 times

  ArifUK 1 month ago


Definitely C. read this from Docs 'Azure AD DS integrates with your existing Azure AD tenant. This integration lets users sign in to service and
applications connected to the managed domain using their existing credentials'. from https://docs.microsoft.com/en-us/azure/active-directory-
domain-services/overview

upvoted 1 times

  kktpkktp 4 weeks, 1 day ago


Azure AD domain services is correct answer which is “c”
upvoted 1 times

  milind8451 1 week, 4 days ago


I would have choosen C) Azure AD DS as it is a manged service and solves the purpose, though D isn't wrong too.
upvoted 1 times

  vladodias 2 days, 16 hours ago


I'd go A. Azure AD Application Proxy...
Like most Azure AD hybrid agents, the Application Proxy Connector doesn't require you to open inbound connections through your firewall. User
traffic in step 3 terminates at the Application Proxy Service (in Azure AD). The Application Proxy Connector (on-premises) is responsible for the rest
of the communication.
https://docs.microsoft.com/en-us/azure/active-directory/manage-apps/application-proxy
upvoted 1 times

  bjoernhoefer 2 days, 14 hours ago


Must be C - because if you deploy an VM with Active Directory it must access the onprem AD for synchronization - and that is not permitted by the
security regulation!
upvoted 1 times

  Bullionaire 15 hours, 18 minutes ago


The correct answer is: " C " - Azure AD Domain Services (Azure AD DS)

The giveaway is in the question "A company security policy states that the virtual machines and services deployed to Subscription1 must be
prevented from accessing the on-premises network."
So this means a Domain Controller is out of the question as it will be restricted to communicate to onprem as it's a Virtual Machine. So the
alternate solution in the options is "C - Azure AD Domain Services (Azure AD DS).
upvoted 1 times


Question #22 Topic 5

HOTSPOT -
Your company deploys an Azure App Service Web App.
During testing the application fails under load. The application cannot handle more than 100 concurrent user sessions. You enable the Always On
feature. You also configure auto-scaling to increase instance counts from two to 10 based on HTTP queue length.
You need to improve the performance of the application.
Which solution should you use for each application scenario? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:

Box 1: Content Delivery Network -


A content delivery network (CDN) is a distributed network of servers that can efficiently deliver web content to users. CDNs store cached
content on edge servers in point-of-presence (POP) locations that are close to end users, to minimize latency.
Azure Content Delivery Network (CDN) offers developers a global solution for rapidly delivering high-bandwidth content to users by caching
their content at strategically placed physical nodes across the world. Azure CDN can also accelerate dynamic content, which cannot be cached,
by leveraging various network optimizations using CDN POPs. For example, route optimization to bypass Border Gateway Protocol (BGP).

Box 2: Azure Redis Cache -


Azure Cache for Redis is based on the popular software Redis. It is typically used as a cache to improve the performance and scalability of
systems that rely heavily on backend data-stores. Performance is improved by temporarily copying frequently accessed data to fast storage
located close to the application. With


Azure Cache for Redis, this fast storage is located in-memory with Azure Cache for Redis instead of being loaded from disk by a database.
Reference:
https://docs.microsoft.com/en-us/azure/azure-cache-for-redis/cache-overview
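As a concrete illustration of the caching pattern described above, the sketch below uses a plain in-memory dictionary as a stand-in for Azure Cache for Redis. All class and function names here are illustrative, not part of any Azure SDK:

```python
import time

# Minimal cache-aside sketch: read from the cache first, fall back to the
# backing store on a miss, then populate the cache for later reads.
class CacheAside:
    def __init__(self, backing_store, ttl_seconds=60):
        self.backing_store = backing_store  # e.g. a slow database query
        self.ttl = ttl_seconds
        self._cache = {}                    # key -> (value, expires_at)

    def get(self, key):
        entry = self._cache.get(key)
        if entry is not None and entry[1] > time.monotonic():
            return entry[0]                 # cache hit: skip the database
        value = self.backing_store(key)     # cache miss: load from the store
        self._cache[key] = (value, time.monotonic() + self.ttl)
        return value

calls = []
def slow_lookup(key):
    calls.append(key)                       # track backend round-trips
    return key.upper()

cache = CacheAside(slow_lookup)
cache.get("session1")   # miss: hits the backing store
cache.get("session1")   # hit: served from memory, no second round-trip
```

In a real deployment the dictionary would be replaced by calls to a Redis client against the Azure Cache for Redis endpoint; the control flow (check the cache, fall back to the database on a miss, then populate the cache) stays the same.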

  andyR 2 months ago


Azure Traffic Manager
CDN
upvoted 1 times

  Bhuvy 1 month ago


@andyR If you don't know it for sure, then don't comment it as if you know it. It wastes others' time. Much appreciated, thanks.
upvoted 7 times

  Name1s4Discussion 1 month, 3 weeks ago


People like you ruin this site. Stop posting dumb shit if you don't know. If you don't know the answer to this you shouldn't even be here. This
cert won't help you if you don't even know the most basic stuff.
upvoted 15 times

  andyR 1 month, 3 weeks ago


Thanks for patrolling the comments on 301 also:
Name1s4Discussion 1 month, 3 weeks ago
Best advice anytime you see this cloudcuckooland guy comment ignore it. He has no idea what he's talking about
upvoted 2 times

  Ziggybooboo 2 months ago


Traffic Manager does not store data, it directs to a region using DNS
upvoted 1 times

  andyR 2 months ago


was thinking TM could direct users based on closest store / region
upvoted 1 times

  gcpjay 1 month, 3 weeks ago


Given answers are correct.
upvoted 12 times

  kopper2019 1 month, 1 week ago


given answer is correct
upvoted 5 times

  milind8451 1 week, 4 days ago


Answers are correct.
upvoted 1 times

  vladodias 2 days, 16 hours ago


Given answers seems correct...
upvoted 1 times


Question #23 Topic 5

You use Azure virtual machines to run a custom application that uses an Azure SQL Database instance on the back end.
The IT department at your company recently enabled forced tunneling.
Since the con guration change, developers have noticed degraded performance when they access the database from the Azure virtual machine.
You need to recommend a solution to minimize latency when accessing the database. The solution must minimize costs.
What should you include in the recommendation?

A. Virtual Network (VNET) service endpoints

B. Azure virtual machines that run Microsoft SQL Server servers

C. Azure SQL Database Managed Instance

D. Always On availability groups

Correct Answer: A

  Pushkar00 1 month, 3 weeks ago


Correct answer - vnet end point will reduce latency
upvoted 5 times

  MadEgg 2 weeks ago


Correct answer
More information: https://docs.microsoft.com/de-de/azure/app-service/environment/forced-tunnel-support#configure-your-ase-with-service-endpoints
upvoted 2 times


Question #24 Topic 5

DRAG DROP -
You are planning an Azure solution that will host production databases for a high-performance application. The solution will include the following
components:
✑ Two virtual machines that will run Microsoft SQL Server 2016, will be deployed to different data centers in the same Azure region, and will be
part of an Always On availability group
✑ SQL Server data that will be backed up by using the Automated Backup feature of the SQL Server IaaS Agent Extension (SQLIaaSExtension)
You identify the storage priorities for various data types as shown in the following table.

Which storage type should you recommend for each data type? To answer, drag the appropriate storage types to the correct data types. Each
storage type may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
Select and Place:

Correct Answer:

  Ziggybooboo 2 months ago


Last box standard managed disk I think
upvoted 8 times

  shtech 4 weeks ago


No! LRS provides the lowest cost as required!
upvoted 3 times

  19042020 2 months ago



Agree, standard managed disk


upvoted 2 times

  hiraz007 1 month, 2 weeks ago


The standard managed disk is Incorrect. "SQL Server IaaS Agent for SQL Server 2016 supports backups directly to blob storages".
Given Answer is correct
upvoted 7 times

  she84516 1 month, 3 weeks ago


LRS storage for backups might still be on option:
SQL Server IaaS Agent for SQL Server 2016 supports backups directly to blob storages
https://docs.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/automated-backup
upvoted 2 times

  Kristhyan 1 month, 3 weeks ago


Answer is correct. Automated backup uses Azure Blob Storage.
upvoted 14 times


Question #25 Topic 5

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that
might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
Your company plans to deploy various Azure App Service instances that will use Azure SQL databases. The App Service instances will be deployed
at the same time as the Azure SQL databases.
The company has a regulatory requirement to deploy the App Service instances only to specific Azure regions. The resources for the App Service
instances must reside in the same region.
You need to recommend a solution to meet the regulatory requirement.
Solution: You recommend using the Regulatory compliance dashboard in Azure Security Center.
Does this meet the goal?

A. Yes

B. No

Correct Answer: B
The Regulatory compliance dashboard in Azure Security Center is not used for regional compliance.
Note: Instead, Azure Resource Policy Definitions can be used, which can be applied to a specific Resource Group with the App Service instances.
Note 2: In the Azure Security Center regulatory compliance blade, you can get an overview of key portions of your compliance posture with
respect to a set of supported standards. Currently supported standards are Azure CIS, PCI DSS 3.2, ISO 27001, and SOC TSP.
Reference:
https://docs.microsoft.com/en-us/azure/governance/policy/overview https://azure.microsoft.com/en-us/blog/regulatory-compliance-dashboard-in-azure-security-center-now-available/
Design Infrastructure
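As a sketch of the alternative the explanation mentions, an Azure Policy definition that restricts deployments to specific regions generally follows the shape of the built-in "Allowed locations" policy. The fragment below is illustrative rather than a verbatim copy of the built-in definition:

```json
{
  "properties": {
    "displayName": "Allowed locations",
    "mode": "Indexed",
    "parameters": {
      "listOfAllowedLocations": {
        "type": "Array",
        "metadata": {
          "description": "The list of regions where resources may be deployed."
        }
      }
    },
    "policyRule": {
      "if": {
        "not": {
          "field": "location",
          "in": "[parameters('listOfAllowedLocations')]"
        }
      },
      "then": {
        "effect": "deny"
      }
    }
  }
}
```

Assigning such a policy at the subscription or resource group scope denies any deployment outside the listed regions, which is what the regulatory requirement calls for.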

  GabrieleSolieri 1 month ago


looks like correct
upvoted 1 times

Topic 6 - Testlet 1


Question #1 Topic 6

Introductory Info
Case Study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However,
there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions
included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might
contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is
independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to
the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study -


To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study
before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem
statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the
subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.
Existing Environment. Active Directory Environment
The network contains two Active Directory forests named corp.fabrikam.com and rd.fabrikam.com. There are no trust relationships between the
forests.
Corp.fabrikam.com is a production forest that contains identities used for internal user and computer authentication.
Rd.fabrikam.com is used by the research and development (R&D) department only.
Existing Environment. Network Infrastructure
Each office contains at least one domain controller from the corp.fabrikam.com domain. The main office contains all the domain controllers for
the rd.fabrikam.com forest.
All the offices have a high-speed connection to the Internet.
An existing application named WebApp1 is hosted in the data center of the London office. WebApp1 is used by customers to place and track
orders. WebApp1 has a web tier that uses Microsoft Internet Information Services (IIS) and a database tier that runs Microsoft SQL Server 2016.
The web tier and the database tier are deployed to virtual machines that run on Hyper-V.
The IT department currently uses a separate Hyper-V environment to test updates to WebApp1.
Fabrikam purchases all Microsoft licenses through a Microsoft Enterprise Agreement that includes Software Assurance.
Existing Environment. Problem Statements
The use of WebApp1 is unpredictable. At peak times, users often report delays. At other times, many resources for WebApp1 are underutilized.

Requirements. Planned Changes -


Fabrikam plans to move most of its production workloads to Azure during the next few years.
As one of its first projects, the company plans to establish a hybrid identity model, facilitating an upcoming Microsoft Office 365 deployment.
All R&D operations will remain on-premises.
Fabrikam plans to migrate the production and test instances of WebApp1 to Azure and to use the S1 plan.
Requirements. Technical Requirements
Fabrikam identifies the following technical requirements:
Web site content must be easily updated from a single point.
User input must be minimized when provisioning new web app instances.
Whenever possible, existing on-premises licenses must be used to reduce cost.
Users must always authenticate by using their corp.fabrikam.com UPN identity.
Any new deployments to Azure must be redundant in case an Azure region fails.
Whenever possible, solutions must be deployed to Azure by using the Standard pricing tier of Azure App Service.
An email distribution group named IT Support must be notified of any issues relating to the directory synchronization services.
Directory synchronization between Azure Active Directory (Azure AD) and corp.fabrikam.com must not be affected by a link failure between Azure
and the on-premises network.
Requirements. Database Requirements
Fabrikam identifies the following database requirements:
Database metrics for the production instance of WebApp1 must be available for analysis so that database administrators can optimize the
performance settings.
To avoid disrupting customer access, database downtime must be minimized when databases are migrated.
Database backups must be retained for a minimum of seven years to meet compliance requirements.


Requirements. Security Requirements


Fabrikam identi es the following security requirements:
Company information including policies, templates, and data must be inaccessible to anyone outside the company.
Users on the on-premises network must be able to authenticate to corp.fabrikam.com if an Internet link fails.
Administrators must be able authenticate to the Azure portal by using their corp.fabrikam.com credentials.
All administrative access to the Azure portal must be secured by using multi-factor authentication.
The testing of WebApp1 updates must not be visible to anyone outside the company.

Question
What should you include in the identity management strategy to support the planned changes?

A. Move all the domain controllers from corp.fabrikam.com to virtual networks in Azure.

B. Deploy domain controllers for the rd.fabrikam.com forest to virtual networks in Azure.

C. Deploy domain controllers for corp.fabrikam.com to virtual networks in Azure.

D. Deploy a new Azure AD tenant for the authentication of new R&D projects.

Correct Answer: C
Directory synchronization between Azure Active Directory (Azure AD) and corp.fabrikam.com must not be affected by a link failure between
Azure and the on- premises network. (This requires domain controllers in Azure)
Users on the on-premises network must be able to authenticate to corp.fabrikam.com if an Internet link fails. (This requires domain controllers
on-premises)

  pentum7 3 months, 1 week ago


Correct
upvoted 4 times

  tmurfet 2 months, 3 weeks ago


Fuzzy logic in the published answer: having a DC in Azure doesn't help with sync when the Internet link goes down, as the server running Azure AD
Connect will be isolated. Besides which, every on-prem location has a corp DC, so on-premises AD directory replication will be cut off as well, a
more serious issue than sync not working, IMO. A better solution would involve highly available and redundant Internet links.
upvoted 1 times

  buanilk 2 weeks, 3 days ago


point is to have the active users availability in DC whenever internet link goes down.
upvoted 1 times

  andyR 2 months, 1 week ago


you just change your pwd on prem and the link goes - no synchronization to Azure
upvoted 1 times


Question #2 Topic 6

Introductory Info
Case Study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However,
there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions
included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might
contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is
independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to
the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study -


To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study
before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem
statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the
subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.
Existing Environment. Active Directory Environment
The network contains two Active Directory forests named corp.fabrikam.com and rd.fabrikam.com. There are no trust relationships between the
forests.
Corp.fabrikam.com is a production forest that contains identities used for internal user and computer authentication.
Rd.fabrikam.com is used by the research and development (R&D) department only.
Existing Environment. Network Infrastructure
Each office contains at least one domain controller from the corp.fabrikam.com domain. The main office contains all the domain controllers for
the rd.fabrikam.com forest.
All the offices have a high-speed connection to the Internet.
An existing application named WebApp1 is hosted in the data center of the London office. WebApp1 is used by customers to place and track
orders. WebApp1 has a web tier that uses Microsoft Internet Information Services (IIS) and a database tier that runs Microsoft SQL Server 2016.
The web tier and the database tier are deployed to virtual machines that run on Hyper-V.
The IT department currently uses a separate Hyper-V environment to test updates to WebApp1.
Fabrikam purchases all Microsoft licenses through a Microsoft Enterprise Agreement that includes Software Assurance.
Existing Environment. Problem Statements
The use of WebApp1 is unpredictable. At peak times, users often report delays. At other times, many resources for WebApp1 are underutilized.

Requirements. Planned Changes -


Fabrikam plans to move most of its production workloads to Azure during the next few years.
As one of its first projects, the company plans to establish a hybrid identity model, facilitating an upcoming Microsoft Office 365 deployment.
All R&D operations will remain on-premises.
Fabrikam plans to migrate the production and test instances of WebApp1 to Azure and to use the S1 plan.
Requirements. Technical Requirements
Fabrikam identifies the following technical requirements:
Web site content must be easily updated from a single point.
User input must be minimized when provisioning new web app instances.
Whenever possible, existing on-premises licenses must be used to reduce cost.
Users must always authenticate by using their corp.fabrikam.com UPN identity.
Any new deployments to Azure must be redundant in case an Azure region fails.
Whenever possible, solutions must be deployed to Azure by using the Standard pricing tier of Azure App Service.
An email distribution group named IT Support must be notified of any issues relating to the directory synchronization services.
Directory synchronization between Azure Active Directory (Azure AD) and corp.fabrikam.com must not be affected by a link failure between Azure
and the on-premises network.
Requirements. Database Requirements
Fabrikam identifies the following database requirements:
Database metrics for the production instance of WebApp1 must be available for analysis so that database administrators can optimize the
performance settings.
To avoid disrupting customer access, database downtime must be minimized when databases are migrated.
Database backups must be retained for a minimum of seven years to meet compliance requirements.


Requirements. Security Requirements


Fabrikam identifies the following security requirements:
Company information including policies, templates, and data must be inaccessible to anyone outside the company.
Users on the on-premises network must be able to authenticate to corp.fabrikam.com if an Internet link fails.
Administrators must be able authenticate to the Azure portal by using their corp.fabrikam.com credentials.
All administrative access to the Azure portal must be secured by using multi-factor authentication.
The testing of WebApp1 updates must not be visible to anyone outside the company.

Question
HOTSPOT -
To meet the authentication requirements of Fabrikam, what should you include in the solution? To answer, select the appropriate options in the
answer area.
NOTE: Each correct selection is worth one point.
Hot Area:

Correct Answer:

Box 1: 2 -
The network contains two Active Directory forests named corp.fabrikam.com and rd.fabrikam.com. There are no trust relationships between
the forests.

Box 2: 1 -

Box 3: 1 -
Scenario:
✑ Users on the on-premises network must be able to authenticate to corp.fabrikam.com if an Internet link fails.
✑ Administrators must be able authenticate to the Azure portal by using their corp.fabrikam.com credentials.
✑ All administrative access to the Azure portal must be secured by using multi-factor authentication.
Note:
Users must always authenticate by using their corp.fabrikam.com UPN identity.
The network contains two Active Directory forests named corp.fabrikam.com and rd.fabrikam.com. There are no trust relationships between
the forests.
Corp.fabrikam.com is a production forest that contains identities used for internal user and computer authentication.
Rd.fabrikam.com is used by the research and development (R&D) department only.
Design Identity and Security

  Edhotp 2 months ago


it should be 1,1,2
upvoted 5 times

  Jinder 12 hours, 43 minutes ago


correct
upvoted 1 times

  devianter81 2 months ago


1,1,1. only corp tenant will be migrated to azure. use mfa for all administrative access will require just 1 conditional access policy
upvoted 24 times

  KWISP 1 month, 3 weeks ago


Correct, "All R&D operations will remain on-premises." So no need to create a second tenant. 1.1.1.
upvoted 2 times

  tmurfet 1 month, 2 weeks ago


2.1.1., published answer is correct: "test instances of WebApp1 go to Azure" --- since there is no trust between corp and rd AD domains, there will
need to be a second rd tenant syncing with AD Connect in the rd on-prem AD domain. For dev it's a good practice to use a separate tenant.
upvoted 1 times

  joshyz73 1 month ago


The question never says that the test instances of WebApp1 are in the rd.fabrikam.com forest. I think you're assuming that "R&D" is the same as
"Dev/Test". In this case study, "R&D" is a separate department, and WebApp1 is not "theirs".
upvoted 5 times

  brummens 1 month, 2 weeks ago


1,2,1. only corp tenant will be migrated to azure, production and test instances of WebApp1 are migrated to Azure and use the AppService S1 plan
so 2 custom domains need to be added, use mfa for all administrative access will require just 1 conditional access policy
upvoted 2 times

  olivier_s 1 month, 1 week ago


"The testing of WebApp1 updates must not be visible to anyone outside the company." would suggest 1,1,1 (only corp tenant , only one custom
domain for prod , only 1 conditional access policy)
upvoted 3 times

  arseyam 2 weeks ago


I will go for 1,1,0 because enforcing MFA doesn't really require a conditional access policy and can be done from the Users blade in Azure AD.
Conditional Access Policies require Azure AD Premium P1 licenses and the question didn't mention anything about the license type.
upvoted 1 times

  MadEgg 2 weeks ago


Yes you can enforce MFA for specific users directly/manually in Azure AD but the question is about activating it for all admins (as a role). So I
think the answer they want to see is 1.
upvoted 1 times

  milind8451 1 week, 4 days ago


1,1,1 as RD domain will not be moved to Azure as mentioned.
upvoted 1 times

  Bullionaire 4 days, 11 hours ago


After a fair bit of digging I've came to the conclusion of:
1,1,1
1 tenant as its one organization


1 custom domain as only fabrikam are coming over into Azure


1 conditional access policy - as this is Microsoft best practices recommendations, see here: https://docs.microsoft.com/en-us/azure/active-directory/conditional-access/howto-conditional-access-policy-admin-mfa

Final = 1,1,1
upvoted 2 times

Topic 7 - Testlet 2


Question #1 Topic 7

Introductory Info
Case Study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However,
there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions
included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might
contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is
independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to
the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study -


To display the rst question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study
before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem
statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the
subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Overview -
Contoso, Ltd, is a US-based financial services company that has a main office in New York and a branch office in San Francisco.
Existing Environment. Payment Processing System
Contoso hosts a business-critical payment processing system in its New York data center. The system has three tiers: a front-end web app, a
middle-tier web API, and a back-end data store implemented as a Microsoft SQL Server 2014 database. All servers run Windows Server 2012 R2.
The front-end and middle-tier components are hosted by using Microsoft Internet Information Services (IIS). The application code is written in C#
and ASP.NET.
The middle-tier API uses the Entity Framework to communicate to the SQL Server database. Maintenance of the database is performed by using
SQL Server
Agent jobs.
The database is currently 2 TB and is not expected to grow beyond 3 TB.
The payment processing system has the following compliance-related requirements:
Encrypt data in transit and at rest. Only the front-end and middle-tier components must be able to access the encryption keys that protect the data
store.
Keep backups of the data in two separate physical locations that are at least 200 miles apart and can be restored for up to seven years.

Support blocking inbound and outbound traffic based on the source IP address, the destination IP address, and the port number.
Collect Windows security logs from all the middle-tier servers and retain the logs for a period of seven years.
Inspect inbound and outbound traffic from the front-end tier by using highly available network appliances.
Only allow all access to all the tiers from the internal network of Contoso.
Tape backups are configured by using an on-premises deployment of Microsoft System Center Data Protection Manager (DPM), and then shipped
offsite for long term storage.
Existing Environment. Historical Transaction Query System
Contoso recently migrated a business-critical workload to Azure. The workload contains a .NET web service for querying the historical transaction
data residing in
Azure Table Storage. The .NET web service is accessible from a client app that was developed in-house and runs on the client computers in the
New York office.
The data in the table storage is 50 GB and is not expected to increase.
Existing Environment. Current Issues
The Contoso IT team discovers poor performance of the historical transaction query system, as the queries frequently cause table scans.

Requirements. Planned Changes -


Contoso plans to implement the following changes:
Migrate the payment processing system to Azure.
Migrate the historical transaction data to Azure Cosmos DB to address the performance issues.
Requirements. Migration Requirements
Contoso identifies the following general migration requirements:
Infrastructure services must remain available if a region or a data center fails. Failover must occur without any administrative intervention.


Whenever possible, Azure managed services must be used to minimize management overhead.
Whenever possible, costs must be minimized.
Contoso identifies the following requirements for the payment processing system:
If a data center fails, ensure that the payment processing system remains available without any administrative intervention. The middle-tier and
the web front end must continue to operate without any additional configurations.
Ensure that the number of compute nodes of the front-end and the middle tiers of the payment processing system can increase or decrease
automatically based on CPU utilization.
Ensure that each tier of the payment processing system is subject to a Service Level Agreement (SLA) of 99.99 percent availability.
Minimize the effort required to modify the middle-tier API and the back-end tier of the payment processing system.
Payment processing system must be able to use grouping and joining tables on encrypted columns.

Generate alerts when unauthorized login attempts occur on the middle-tier virtual machines.
Ensure that the payment processing system preserves its current compliance status.
Host the middle tier of the payment processing system on a virtual machine
Contoso identifies the following requirements for the historical transaction query system:
Minimize the use of on-premises infrastructure services.
Minimize the effort required to modify the .NET web service querying Azure Cosmos DB.
Minimize the frequency of table scans.
If a region fails, ensure that the historical transaction query system remains available without any administrative intervention.
Requirements. Information Security Requirements
The IT security team wants to ensure that identity management is performed by using Active Directory. Password hashes must be stored
on-premises only.
Access to all business-critical systems must rely on Active Directory credentials. Any suspicious authentication attempts must trigger a
multi-factor authentication prompt automatically.

Question
You need to recommend a solution for protecting the content of the payment processing system.
What should you include in the recommendation?

A. Always Encrypted with deterministic encryption

B. Always Encrypted with randomized encryption

C. Transparent Data Encryption (TDE)

D. Azure Storage Service Encryption

Correct Answer: A

  speedminer 5 months ago


Seems correct, as randomized encryption would not allow for grouping/joining of data

https://docs.microsoft.com/en-us/sql/relational-databases/security/encryption/always-encrypted-database-engine?view=sql-server-ver15#selecting--deterministic-or-randomized-encryption
upvoted 16 times
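A toy sketch of why this matters: deterministic schemes map equal plaintexts to equal ciphertexts, so the database engine can still compare, group, and join on the encrypted column, while randomized schemes cannot. The code below uses keyed HMAC tokens purely as a stand-in; it is not the actual construction Always Encrypted uses, and all names are illustrative:

```python
import hashlib
import hmac
import os

KEY = b"column-encryption-key"  # illustrative key, not a real CEK

def deterministic_encrypt(plaintext: bytes) -> bytes:
    # Same key + same plaintext -> same ciphertext, so equality
    # comparisons (WHERE, GROUP BY, JOIN) still work on the stored bytes.
    return hmac.new(KEY, plaintext, hashlib.sha256).digest()

def randomized_encrypt(plaintext: bytes) -> bytes:
    # A fresh random nonce makes every ciphertext unique, which is
    # stronger against inference but breaks equality operations.
    nonce = os.urandom(16)
    return nonce + hmac.new(KEY, nonce + plaintext, hashlib.sha256).digest()

det_a = deterministic_encrypt(b"4111-1111-1111-1111")
det_b = deterministic_encrypt(b"4111-1111-1111-1111")
rnd_a = randomized_encrypt(b"4111-1111-1111-1111")
rnd_b = randomized_encrypt(b"4111-1111-1111-1111")

print(det_a == det_b)  # True: joins/grouping on this column are possible
print(rnd_a == rnd_b)  # False: equal plaintexts are indistinguishable
```

This is why the case study's requirement to group and join tables on encrypted columns forces deterministic encryption, at the cost of revealing which rows share a value.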

  SHABS78 3 months, 1 week ago


I believe it's the correct answer.
upvoted 2 times

  olivier_s 1 month, 1 week ago


from:
"Only the front-end and middle-tier components must be able to access the encryption keys that protect the data store."
and the ask to minimize the changes to the middle tier & backend tier , would trend to TDE with BYOC : https://docs.microsoft.com/en-
us/azure/azure-sql/database/transparent-data-encryption-byok-overview
upvoted 1 times

  wgre 1 month, 1 week ago


TDE does not encrypt data on transit
upvoted 1 times

  Blaaa 2 weeks, 4 days ago


Correct
upvoted 1 times
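To see why only deterministic encryption supports grouping and joining, here is a toy Python sketch of the two modes. This is not the real Always Encrypted cipher — an HMAC stands in for the deterministic AEAD scheme, and the key name is made up — but it shows the property the question hinges on: equal plaintexts must produce equal ciphertexts for the server to match rows it cannot decrypt.

```python
import hmac, hashlib, os

KEY = b"column-master-key"  # hypothetical stand-in for a column encryption key

def encrypt_deterministic(plaintext: str) -> bytes:
    # Same plaintext + same key -> same ciphertext, so equality-based
    # operations (joins, GROUP BY, point lookups) still work server-side.
    return hmac.new(KEY, plaintext.encode(), hashlib.sha256).digest()

def encrypt_randomized(plaintext: str) -> bytes:
    # A fresh random nonce is mixed in, so two encryptions of the same
    # value differ and the server cannot group or join on the column.
    nonce = os.urandom(16)
    digest = hmac.new(KEY, nonce + plaintext.encode(), hashlib.sha256).digest()
    return nonce + digest

ssn = "078-05-1120"
assert encrypt_deterministic(ssn) == encrypt_deterministic(ssn)  # joinable
assert encrypt_randomized(ssn) != encrypt_randomized(ssn)        # not joinable
```

The trade-off the linked documentation describes falls out of this: determinism enables equality matching but leaks which rows share a value, which is why randomized encryption is preferred when grouping and joining are not required.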

https://www.examtopics.com/exams/microsoft/az-304/custom-view/ 179/197
2/5/2021 AZ-304 Exam – Free Actual Q&As, Page 1 | ExamTopics


Question #2 Topic 7

Introductory Info
Case Study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However,
there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions
included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might
contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is
independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to
the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study -


To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study
before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem
statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the
subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Overview -
Contoso, Ltd, is a US-based financial services company that has a main office in New York and a branch office in San Francisco.
Existing Environment. Payment Processing System
Contoso hosts a business-critical payment processing system in its New York data center. The system has three tiers: a front-end web app, a
middle-tier web API, and a back-end data store implemented as a Microsoft SQL Server 2014 database. All servers run Windows Server 2012 R2.
The front-end and middle-tier components are hosted by using Microsoft Internet Information Services (IIS). The application code is written in C#
and ASP.NET.
The middle-tier API uses the Entity Framework to communicate to the SQL Server database. Maintenance of the database is performed by using
SQL Server
Agent jobs.
The database is currently 2 TB and is not expected to grow beyond 3 TB.
The payment processing system has the following compliance-related requirements:
Encrypt data in transit and at rest. Only the front-end and middle-tier components must be able to access the encryption keys that protect the data
store.
Keep backups of the data in two separate physical locations that are at least 200 miles apart and can be restored for up to seven years.

Support blocking inbound and outbound traffic based on the source IP address, the destination IP address, and the port number.
Collect Windows security logs from all the middle-tier servers and retain the logs for a period of seven years.
Inspect inbound and outbound traffic from the front-end tier by using highly available network appliances.
Only allow all access to all the tiers from the internal network of Contoso.
Tape backups are configured by using an on-premises deployment of Microsoft System Center Data Protection Manager (DPM), and then shipped
offsite for long term storage.
Existing Environment. Historical Transaction Query System
Contoso recently migrated a business-critical workload to Azure. The workload contains a .NET web service for querying the historical transaction
data residing in
Azure Table Storage. The .NET web service is accessible from a client app that was developed in-house and runs on the client computers in the
New York office.
The data in the table storage is 50 GB and is not expected to increase.
Existing Environment. Current Issues
The Contoso IT team discovers poor performance of the historical transaction query system, as the queries frequently cause table scans.

Requirements. Planned Changes -


Contoso plans to implement the following changes:
Migrate the payment processing system to Azure.
Migrate the historical transaction data to Azure Cosmos DB to address the performance issues.
Requirements. Migration Requirements
Contoso identifies the following general migration requirements:
Infrastructure services must remain available if a region or a data center fails. Failover must occur without any administrative intervention.


Whenever possible, Azure managed services must be used to minimize management overhead.
Whenever possible, costs must be minimized.
Contoso identifies the following requirements for the payment processing system:
If a data center fails, ensure that the payment processing system remains available without any administrative intervention. The middle-tier and
the web front end must continue to operate without any additional configurations.
Ensure that the number of compute nodes of the front-end and the middle tiers of the payment processing system can increase or decrease
automatically based on CPU utilization.
Ensure that each tier of the payment processing system is subject to a Service Level Agreement (SLA) of 99.99 percent availability.
Minimize the effort required to modify the middle-tier API and the back-end tier of the payment processing system.
Payment processing system must be able to use grouping and joining tables on encrypted columns.

Generate alerts when unauthorized login attempts occur on the middle-tier virtual machines.
Ensure that the payment processing system preserves its current compliance status.
Host the middle tier of the payment processing system on a virtual machine
Contoso identifies the following requirements for the historical transaction query system:
Minimize the use of on-premises infrastructure services.
Minimize the effort required to modify the .NET web service querying Azure Cosmos DB.
Minimize the frequency of table scans.
If a region fails, ensure that the historical transaction query system remains available without any administrative intervention.
Requirements. Information Security Requirements
The IT security team wants to ensure that identity management is performed by using Active Directory. Password hashes must be stored on-
premises only.
Access to all business-critical systems must rely on Active Directory credentials. Any suspicious authentication attempts must trigger a multi-
factor authentication prompt automatically.

Question
HOTSPOT -
You need to recommend a solution for the data store of the historical transaction query system.
What should you include in the recommendation? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:


Correct Answer:

Design Business Continuity

  andyR 2 months ago


correct
upvoted 1 times

  salehz 1 month, 3 weeks ago


Multiple tables fixed capacity
upvoted 3 times

  xAlx 1 month, 3 weeks ago


Correct
Table with fixed capacity:
in requirements - "The data in the Azure Table storage is 50 GB and is not expected to increase"
Additional read region:
availability zone or set doesn't help during region failure
upvoted 5 times

  wgre 1 month, 1 week ago


Correct
data stored on Azure Tables does not depend on complex server-side joins or other logic
upvoted 2 times

  Blaaa 2 weeks, 4 days ago


Correct
upvoted 1 times
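The "minimize the frequency of table scans" requirement comes down to key design: a query that supplies both the partition key and the row key is routed to a single row, while a filter on any other property must walk the whole table. A toy in-memory model (hypothetical keys and amounts, not the Azure Tables SDK) makes the difference concrete:

```python
# Toy model of a partitioned key-value table: rows are addressed by
# (PartitionKey, RowKey), mirroring Azure Table storage / Cosmos DB Table API.
table = {
    ("2020-01", "txn-001"): {"amount": 250},
    ("2020-01", "txn-002"): {"amount": 75},
    ("2020-02", "txn-003"): {"amount": 120},
}

def point_read(pk: str, rk: str):
    # Efficient path: the store routes straight to one partition, one row.
    return table.get((pk, rk))

def scan(predicate):
    # Expensive path: every row in every partition is examined -- the
    # "table scan" behaviour the case study wants to minimize.
    return [row for key, row in table.items() if predicate(row)]

assert point_read("2020-01", "txn-002") == {"amount": 75}
assert scan(lambda r: r["amount"] > 100) == [{"amount": 250}, {"amount": 120}]
```

Migrating to Cosmos DB does not remove this constraint by itself; choosing a partition key that matches the query pattern is what turns scans into point reads.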

Topic 8 - Testlet 3


Question #1 Topic 8

Introductory Info
Case Study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However,
there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions
included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might
contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is
independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to
the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study -


To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study
before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem
statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the
subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Overview -
Contoso, Ltd, is a US-based financial services company that has a main office in New York and a branch office in San Francisco.
Existing Environment. Payment Processing System
Contoso hosts a business-critical payment processing system in its New York data center. The system has three tiers: a front-end web app, a
middle-tier web API, and a back-end data store implemented as a Microsoft SQL Server 2014 database. All servers run Windows Server 2012 R2.
The front-end and middle-tier components are hosted by using Microsoft Internet Information Services (IIS). The application code is written in C#
and ASP.NET.
The middle-tier API uses the Entity Framework to communicate to the SQL Server database. Maintenance of the database is performed by using
SQL Server
Agent jobs.
The database is currently 2 TB and is not expected to grow beyond 3 TB.
The payment processing system has the following compliance-related requirements:
Encrypt data in transit and at rest. Only the front-end and middle-tier components must be able to access the encryption keys that protect the data
store.
Keep backups of the data in two separate physical locations that are at least 200 miles apart and can be restored for up to seven years.

Support blocking inbound and outbound traffic based on the source IP address, the destination IP address, and the port number.
Collect Windows security logs from all the middle-tier servers and retain the logs for a period of seven years.
Inspect inbound and outbound traffic from the front-end tier by using highly available network appliances.
Only allow all access to all the tiers from the internal network of Contoso.
Tape backups are configured by using an on-premises deployment of Microsoft System Center Data Protection Manager (DPM), and then shipped
offsite for long term storage.
Existing Environment. Historical Transaction Query System
Contoso recently migrated a business-critical workload to Azure. The workload contains a .NET web service for querying the historical transaction
data residing in
Azure Table Storage. The .NET web service is accessible from a client app that was developed in-house and runs on the client computers in the
New York office.
The data in the table storage is 50 GB and is not expected to increase.
Existing Environment. Current Issues
The Contoso IT team discovers poor performance of the historical transaction query system, as the queries frequently cause table scans.

Requirements. Planned Changes -


Contoso plans to implement the following changes:
Migrate the payment processing system to Azure.
Migrate the historical transaction data to Azure Cosmos DB to address the performance issues.
Requirements. Migration Requirements
Contoso identifies the following general migration requirements:
Infrastructure services must remain available if a region or a data center fails. Failover must occur without any administrative intervention.


Whenever possible, Azure managed services must be used to minimize management overhead.
Whenever possible, costs must be minimized.
Contoso identifies the following requirements for the payment processing system:
If a data center fails, ensure that the payment processing system remains available without any administrative intervention. The middle-tier and
the web front end must continue to operate without any additional configurations.
Ensure that the number of compute nodes of the front-end and the middle tiers of the payment processing system can increase or decrease
automatically based on CPU utilization.
Ensure that each tier of the payment processing system is subject to a Service Level Agreement (SLA) of 99.99 percent availability.
Minimize the effort required to modify the middle-tier API and the back-end tier of the payment processing system.
Payment processing system must be able to use grouping and joining tables on encrypted columns.

Generate alerts when unauthorized login attempts occur on the middle-tier virtual machines.
Ensure that the payment processing system preserves its current compliance status.
Host the middle tier of the payment processing system on a virtual machine
Contoso identifies the following requirements for the historical transaction query system:
Minimize the use of on-premises infrastructure services.
Minimize the effort required to modify the .NET web service querying Azure Cosmos DB.
Minimize the frequency of table scans.
If a region fails, ensure that the historical transaction query system remains available without any administrative intervention.
Requirements. Information Security Requirements
The IT security team wants to ensure that identity management is performed by using Active Directory. Password hashes must be stored on-
premises only.
Access to all business-critical systems must rely on Active Directory credentials. Any suspicious authentication attempts must trigger a multi-
factor authentication prompt automatically.

Question
You need to recommend a backup solution for the data store of the payment processing system.
What should you include in the recommendation?

A. Microsoft System Center Data Protection Manager (DPM)

B. Azure Backup Server

C. Azure SQL long-term backup retention

D. Azure Managed Disks

Correct Answer: C
Reference:
https://docs.microsoft.com/en-us/azure/sql-database/sql-database-long-term-backup-retention-configure
Design Business Continuity

  David_986969 4 months, 1 week ago


Why is this the answer?
upvoted 1 times

  David_986969 4 months, 1 week ago


Like it says, "Database backups must be retained for a minimum of seven years to meet compliance requirements," so Azure Backup won't work;
you will have to use long-term retention, which can hold the data for up to seven years. So the answer is correct.
upvoted 13 times

  pentum7 3 months, 1 week ago


"In Azure SQL Database, you can configure a database with a long-term backup retention policy (LTR) to automatically retain the database
backups in separate Azure Blob storage containers for up to 10 years."
https://docs.microsoft.com/en-us/azure/azure-sql/database/long-term-backup-retention-configure
upvoted 7 times

  arseyam 1 week, 6 days ago


The answers are confusing, it should have contained Azure Recovery Services Vault as an option not the backup policy type

Backup SQL Server on Azure VM


https://docs.microsoft.com/en-us/azure/backup/backup-sql-server-database-azure-vms#database-naming-guidelines-for-azure-backup

Backup policies
https://docs.microsoft.com/en-us/azure/backup/backup-architecture#sql-server-backup-types


Retention for "monthly", "yearly" backup points is referred to as Long Term Retention (LTR)
upvoted 1 times
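LTR policies express retention as ISO 8601 durations (for example, "P7Y" keeps a yearly backup for seven years, within the documented 10-year maximum). A minimal sketch of how such a yearly policy maps a backup date to its expiry — the parser handles only the "PnY" subset and the function names are made up for illustration:

```python
from datetime import date

def parse_year_retention(iso: str) -> int:
    # Minimal parser for the "PnY" subset of ISO 8601 durations used by
    # yearly LTR policies (e.g. "P7Y" = retain yearly backups 7 years).
    if not (iso.startswith("P") and iso.endswith("Y")):
        raise ValueError(f"unsupported duration: {iso}")
    return int(iso[1:-1])

def expiry(backup_taken: date, policy: str) -> date:
    # The backup becomes eligible for deletion this many years later.
    years = parse_year_retention(policy)
    return backup_taken.replace(year=backup_taken.year + years)

assert parse_year_retention("P7Y") == 7
assert expiry(date(2021, 2, 5), "P7Y") == date(2028, 2, 5)
```

This is why the seven-year compliance requirement rules out the default automated backups (at most weeks of retention) and points to a long-term retention policy instead.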

Topic 9 - Testlet 4


Question #1 Topic 9

Introductory Info
Case Study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However,
there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions
included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might
contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is
independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to
the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study -


To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study
before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem
statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the
subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.
Existing Environment. Active Directory Environment
The network contains two Active Directory forests named corp.fabrikam.com and rd.fabrikam.com. There are no trust relationships between the
forests.
Corp.fabrikam.com is a production forest that contains identities used for internal user and computer authentication.
Rd.fabrikam.com is used by the research and development (R&D) department only.
Existing Environment. Network Infrastructure
Each office contains at least one domain controller from the corp.fabrikam.com domain. The main office contains all the domain controllers for
the rd.fabrikam.com forest.
All the offices have a high-speed connection to the Internet.
An existing application named WebApp1 is hosted in the data center of the London office. WebApp1 is used by customers to place and track
orders. WebApp1 has a web tier that uses Microsoft Internet Information Services (IIS) and a database tier that runs Microsoft SQL Server 2016.
The web tier and the database tier are deployed to virtual machines that run on Hyper-V.
The IT department currently uses a separate Hyper-V environment to test updates to WebApp1.
Fabrikam purchases all Microsoft licenses through a Microsoft Enterprise Agreement that includes Software Assurance.
Existing Environment. Problem Statements
The use of WebApp1 is unpredictable. At peak times, users often report delays. At other times, many resources for WebApp1 are underutilized.

Requirements. Planned Changes -


Fabrikam plans to move most of its production workloads to Azure during the next few years.
As one of its first projects, the company plans to establish a hybrid identity model, facilitating an upcoming Microsoft Office 365 deployment.
All R&D operations will remain on-premises.
Fabrikam plans to migrate the production and test instances of WebApp1 to Azure and to use the S1 plan.
Requirements. Technical Requirements
Fabrikam identifies the following technical requirements:
Web site content must be easily updated from a single point.
User input must be minimized when provisioning new web app instances.
Whenever possible, existing on-premises licenses must be used to reduce cost.
Users must always authenticate by using their corp.fabrikam.com UPN identity.
Any new deployments to Azure must be redundant in case an Azure region fails.
Whenever possible, solutions must be deployed to Azure by using the Standard pricing tier of Azure App Service.
An email distribution group named IT Support must be notified of any issues relating to the directory synchronization services.
Directory synchronization between Azure Active Directory (Azure AD) and corp.fabrikam.com must not be affected by a link failure between Azure
and the on-premises network.
Requirements. Database Requirements
Fabrikam identifies the following database requirements:
Database metrics for the production instance of WebApp1 must be available for analysis so that database administrators can optimize the
performance settings.
To avoid disrupting customer access, database downtime must be minimized when databases are migrated.
Database backups must be retained for a minimum of seven years to meet compliance requirements.


Requirements. Security Requirements


Fabrikam identifies the following security requirements:
Company information including policies, templates, and data must be inaccessible to anyone outside the company.
Users on the on-premises network must be able to authenticate to corp.fabrikam.com if an Internet link fails.
Administrators must be able to authenticate to the Azure portal by using their corp.fabrikam.com credentials.
All administrative access to the Azure portal must be secured by using multi-factor authentication.
The testing of WebApp1 updates must not be visible to anyone outside the company.

Question
You need to recommend a solution to meet the database retention requirement.
What should you recommend?

A. Configure geo-replication of the database.

B. Configure a long-term retention policy for the database.

C. Configure Azure Site Recovery.

D. Use automatic Azure SQL Database backups.

Correct Answer: B
Design Infrastructure

  speedminer 5 months ago


https://docs.microsoft.com/en-us/azure/azure-sql/database/long-term-retention-overview
Seems accurate
upvoted 2 times

  pentum7 3 months, 1 week ago


Agreed

In Azure SQL Database, you can configure a database with a long-term backup retention policy (LTR) to automatically retain the database
backups in separate Azure Blob storage containers for up to 10 years.
upvoted 4 times

  Blaaa 2 weeks, 4 days ago


Correct
upvoted 1 times

Topic 10 - Testlet 5


Question #1 Topic 10

Introductory Info
Case Study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However,
there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions
included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might
contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is
independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to
the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study -


To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study
before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem
statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the
subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.

Overview -
Contoso, Ltd, is a US-based financial services company that has a main office in New York and a branch office in San Francisco.
Existing Environment. Payment Processing System
Contoso hosts a business-critical payment processing system in its New York data center. The system has three tiers: a front-end web app, a
middle-tier web API, and a back-end data store implemented as a Microsoft SQL Server 2014 database. All servers run Windows Server 2012 R2.
The front-end and middle-tier components are hosted by using Microsoft Internet Information Services (IIS). The application code is written in C#
and ASP.NET.
The middle-tier API uses the Entity Framework to communicate to the SQL Server database. Maintenance of the database is performed by using
SQL Server
Agent jobs.
The database is currently 2 TB and is not expected to grow beyond 3 TB.
The payment processing system has the following compliance-related requirements:
Encrypt data in transit and at rest. Only the front-end and middle-tier components must be able to access the encryption keys that protect the data
store.
Keep backups of the data in two separate physical locations that are at least 200 miles apart and can be restored for up to seven years.

Support blocking inbound and outbound traffic based on the source IP address, the destination IP address, and the port number.
Collect Windows security logs from all the middle-tier servers and retain the logs for a period of seven years.
Inspect inbound and outbound traffic from the front-end tier by using highly available network appliances.
Only allow all access to all the tiers from the internal network of Contoso.
Tape backups are configured by using an on-premises deployment of Microsoft System Center Data Protection Manager (DPM), and then shipped
offsite for long term storage.
Existing Environment. Historical Transaction Query System
Contoso recently migrated a business-critical workload to Azure. The workload contains a .NET web service for querying the historical transaction
data residing in
Azure Table Storage. The .NET web service is accessible from a client app that was developed in-house and runs on the client computers in the
New York office.
The data in the table storage is 50 GB and is not expected to increase.
Existing Environment. Current Issues
The Contoso IT team discovers poor performance of the historical transaction query system, as the queries frequently cause table scans.

Requirements. Planned Changes -


Contoso plans to implement the following changes:
Migrate the payment processing system to Azure.
Migrate the historical transaction data to Azure Cosmos DB to address the performance issues.
Requirements. Migration Requirements
Contoso identifies the following general migration requirements:
Infrastructure services must remain available if a region or a data center fails. Failover must occur without any administrative intervention.


Whenever possible, Azure managed services must be used to minimize management overhead.
Whenever possible, costs must be minimized.
Contoso identifies the following requirements for the payment processing system:
If a data center fails, ensure that the payment processing system remains available without any administrative intervention. The middle-tier and
the web front end must continue to operate without any additional configurations.
Ensure that the number of compute nodes of the front-end and the middle tiers of the payment processing system can increase or decrease
automatically based on CPU utilization.
Ensure that each tier of the payment processing system is subject to a Service Level Agreement (SLA) of 99.99 percent availability.
Minimize the effort required to modify the middle-tier API and the back-end tier of the payment processing system.
Payment processing system must be able to use grouping and joining tables on encrypted columns.

Generate alerts when unauthorized login attempts occur on the middle-tier virtual machines.
Ensure that the payment processing system preserves its current compliance status.
Host the middle tier of the payment processing system on a virtual machine
Contoso identifies the following requirements for the historical transaction query system:
Minimize the use of on-premises infrastructure services.
Minimize the effort required to modify the .NET web service querying Azure Cosmos DB.
Minimize the frequency of table scans.
If a region fails, ensure that the historical transaction query system remains available without any administrative intervention.
Requirements. Information Security Requirements
The IT security team wants to ensure that identity management is performed by using Active Directory. Password hashes must be stored on-
premises only.
Access to all business-critical systems must rely on Active Directory credentials. Any suspicious authentication attempts must trigger a multi-
factor authentication prompt automatically.

Question
You need to recommend a compute solution for the middle tier of the payment processing system.
What should you include in the recommendation?

A. virtual machine scale sets

B. availability sets

C. Azure Kubernetes Service (AKS)

D. Function App

Correct Answer: A
Design Infrastructure

  David_986969 4 months, 1 week ago


This would not be compliant with the requirement "If a data center fails, ensure that the payment processing system remains available without any
administrative intervention. The middle-tier and the web front end must continue to operate without any additional configurations."
upvoted 3 times

  Marang73 4 months ago


SLA of 99.99% with scale sets: https://docs.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-use-availability-zones
upvoted 2 times

  Tos_123 4 months ago


A
it fits the requirement:
"Ensure that the number of compute nodes of the front-end and the middle tiers of the payment processing system can increase or decrease
automatically based on CPU utilization."
upvoted 4 times

  alekam 3 months ago


Additionally it says "Host the middle tier of the payment processing system on a virtual machine", which also points in the direction of A
upvoted 4 times

  Galexy 3 months ago


To me, the key points:
-middle tiers of the payment processing system can increase or decrease automatically based on CPU utilization
-Minimize the effort required to modify the middle-tier API 


-The middle-tier and the web front end must continue to operate without any additional configurations
All leading to A
upvoted 7 times

  pentum7 2 months, 4 weeks ago


"To protect your virtual machine scale sets from datacenter-level failures, you can create a scale set across Availability Zones. Azure regions that
support Availability Zones have a minimum of three separate zones, each with their own independent power source, network, and cooling." For
more information:
https://docs.microsoft.com/en-us/azure/virtual-machine-scale-sets/virtual-machine-scale-sets-use-availability-zones
upvoted 3 times

  Hanger_Man 2 months, 1 week ago


'A' - Correct Answer
upvoted 3 times
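The zone-redundant, CPU-driven scaling the comments above describe can be sketched with the Azure CLI. The resource group, scale set name, image, SKU, and thresholds below are illustrative assumptions, not values from the case study:

```shell
# Spread the scale set across three Availability Zones so a datacenter
# failure does not require administrative intervention.
az vmss create \
  --resource-group payment-rg \
  --name payment-midtier \
  --image Ubuntu2204 \
  --vm-sku Standard_D2s_v3 \
  --zones 1 2 3 \
  --instance-count 3 \
  --upgrade-policy-mode automatic

# Autoscale profile: keep between 3 and 10 instances.
az monitor autoscale create \
  --resource-group payment-rg \
  --resource payment-midtier \
  --resource-type Microsoft.Compute/virtualMachineScaleSets \
  --name payment-autoscale \
  --min-count 3 --max-count 10 --count 3

# Scale out when average CPU exceeds 70% over 5 minutes...
az monitor autoscale rule create \
  --resource-group payment-rg \
  --autoscale-name payment-autoscale \
  --condition "Percentage CPU > 70 avg 5m" \
  --scale out 1

# ...and back in when it drops below 30%.
az monitor autoscale rule create \
  --resource-group payment-rg \
  --autoscale-name payment-autoscale \
  --condition "Percentage CPU < 30 avg 5m" \
  --scale in 1
```

An availability set (answer B) would give fault-domain redundancy within one datacenter but has no built-in autoscaling, which is why the discussion converges on scale sets.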

Topic 11 - Testlet 6


Question #1 Topic 11

Introductory Info
Case Study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However,
there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions
included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might
contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is
independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to
the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study -


To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study
before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem
statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the
subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.
Existing Environment. Active Directory Environment
The network contains two Active Directory forests named corp.fabrikam.com and rd.fabrikam.com. There are no trust relationships between the
forests.
Corp.fabrikam.com is a production forest that contains identities used for internal user and computer authentication.
Rd.fabrikam.com is used by the research and development (R&D) department only.
Existing Environment. Network Infrastructure
Each office contains at least one domain controller from the corp.fabrikam.com domain. The main office contains all the domain controllers for
the rd.fabrikam.com forest.
All the offices have a high-speed connection to the Internet.
An existing application named WebApp1 is hosted in the data center of the London office. WebApp1 is used by customers to place and track
orders. WebApp1 has a web tier that uses Microsoft Internet Information Services (IIS) and a database tier that runs Microsoft SQL Server 2016.
The web tier and the database tier are deployed to virtual machines that run on Hyper-V.
The IT department currently uses a separate Hyper-V environment to test updates to WebApp1.
Fabrikam purchases all Microsoft licenses through a Microsoft Enterprise Agreement that includes Software Assurance.
Existing Environment. Problem Statements
The use of WebApp1 is unpredictable. At peak times, users often report delays. At other times, many resources for WebApp1 are underutilized.

Requirements. Planned Changes -


Fabrikam plans to move most of its production workloads to Azure during the next few years.
As one of its first projects, the company plans to establish a hybrid identity model, facilitating an upcoming Microsoft Office 365 deployment.
All R&D operations will remain on-premises.
Fabrikam plans to migrate the production and test instances of WebApp1 to Azure and to use the S1 plan.
Requirements. Technical Requirements
Fabrikam identifies the following technical requirements:
Web site content must be easily updated from a single point.
User input must be minimized when provisioning new web app instances.
Whenever possible, existing on-premises licenses must be used to reduce cost.
Users must always authenticate by using their corp.fabrikam.com UPN identity.
Any new deployments to Azure must be redundant in case an Azure region fails.
Whenever possible, solutions must be deployed to Azure by using the Standard pricing tier of Azure App Service.
An email distribution group named IT Support must be notified of any issues relating to the directory synchronization services.
Directory synchronization between Azure Active Directory (Azure AD) and corp.fabrikam.com must not be affected by a link failure between Azure
and the on- premises network.
Requirements. Database Requirements
Fabrikam identifies the following database requirements:
Database metrics for the production instance of WebApp1 must be available for analysis so that database administrators can optimize the
performance settings.
To avoid disrupting customer access, database downtime must be minimized when databases are migrated.
Database backups must be retained for a minimum of seven years to meet compliance requirements.


Requirements. Security Requirements


Fabrikam identifies the following security requirements:
Company information including policies, templates, and data must be inaccessible to anyone outside the company.
Users on the on-premises network must be able to authenticate to corp.fabrikam.com if an Internet link fails.
Administrators must be able to authenticate to the Azure portal by using their corp.fabrikam.com credentials.
All administrative access to the Azure portal must be secured by using multi-factor authentication.
The testing of WebApp1 updates must not be visible to anyone outside the company.

Question
You need to recommend a strategy for migrating the database content of WebApp1 to Azure.
What should you include in the recommendation?

A. Use Azure Site Recovery to replicate the SQL servers to Azure.

B. Copy the BACPAC file that contains the Azure SQL database files to Azure Blob storage.

C. Use SQL Server transactional replication.

D. Copy the VHD that contains the Azure SQL database files to Azure Blob storage.

Correct Answer: D
Before you upload a Windows virtual machine (VM) from on-premises to Azure, you must prepare the virtual hard disk (VHD or VHDX).
Scenario: WebApp1 has a web tier that uses Microsoft Internet Information Services (IIS) and a database tier that runs Microsoft SQL Server
2016. The web tier and the database tier are deployed to virtual machines that run on Hyper-V.
Reference:
https://docs.microsoft.com/en-us/azure/virtual-machines/windows/prepare-for-upload-vhd-image

  speedminer 5 months ago


Seems like a better link to reference for this scenario
https://docs.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/migrate-to-vm-from-sql-server
upvoted 1 times

  spam4pl 4 months, 4 weeks ago


One of technical requirements states following:
"Whenever possible, solutions must be deployed to Azure by using the Standard pricing tier of Azure App Service."
This suggests using SQL Server transactional replication (option C) with Azure App Service instead of a VM with SQL (option D).
upvoted 5 times

  Rickii 4 months, 3 weeks ago


But as the requirement is to bring own license the answer is D
upvoted 1 times

  nexnexnex 3 weeks, 1 day ago


Hybrid benefit allows reduced prices for any SQL deployment
upvoted 1 times

  Az_Sasi 4 months, 2 weeks ago


Looks like answer C should be correct for the given scenario -
Req from scenario - To avoid disrupting customer access, database downtime must be minimized when databases are migrated.

https://docs.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/migrate-to-vm-from-sql-server
Choose a migration method
For best data transfer performance, migrate the database files into the Azure VM using a compressed backup file.

To minimize downtime during the database migration process, use either the AlwaysOn option or the transactional replication option.
upvoted 11 times

  NickGR 1 month, 2 weeks ago


You are correct. For this scenario, to achieve zero downtime or just keep it to a minimum, you need transactional replication.
Here is a great explanation:
https://azure.microsoft.com/en-gb/blog/migrating-to-azure-sql-database-with-zero-downtime-for-read-only-workloads/
upvoted 2 times

  multcloud 4 months, 1 week ago


C is the answer, refer az 301 question
upvoted 14 times

  Yamaneko 3 months, 2 weeks ago


C is correct
https://docs.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/migrate-to-vm-from-sql-server#choose-a-migration-method

upvoted 7 times

  rhr 3 months, 1 week ago


The correct answer is D

Use when bringing your own SQL Server license, when migrating a database that you'll run on an older version of SQL Server, or when migrating
system and user databases together as part of the migration of database dependent on other user databases and/or system databases.
upvoted 2 times

  Edgecrusher77 3 months ago


I would go for C. Use SQL Server transactional replication
upvoted 1 times

  pentum7 2 months, 4 weeks ago


"For best data transfer performance, migrate the database files into the Azure VM using a compressed backup file.

To minimize downtime during the database migration process, use either the AlwaysOn option or the transactional replication option."
upvoted 3 times

  cor 2 months, 3 weeks ago


D: Convert to a VM, upload to a URL, and deploy as a new VM
Use this method to migrate all system and user databases in an on-premises SQL Server instance to an Azure virtual machine. Use the following
general steps to migrate an entire SQL Server instance using this manual method:

Convert physical or virtual machines to Hyper-V VHDs.


Upload VHD files to Azure Storage by using the Add-AzureVHD cmdlet.
Deploy a new virtual machine by using the uploaded VHD.
https://docs.microsoft.com/en-us/azure/azure-sql/virtual-machines/windows/migrate-to-vm-from-sql-server#choose-a-migration-method
upvoted 2 times

  Hanger_Man 2 months, 1 week ago


Correct Answer-'C'
upvoted 2 times

  wgre 1 month, 1 week ago


Why no ASR? it supports SQL
upvoted 1 times

  ArifUK 1 month ago


very good question, why are we not considering ASR as the migration method here?
answer C suggests setting up the transactional replication, which may take multiple change windows.

answer D is plain old style, IMHO.


upvoted 1 times

  rizabeer 3 weeks, 1 day ago


Option D is an offline option. That means downtime, so it is incorrect. The only online options I know of are: Always On with the target VM added as a replica,
OR SQL Server transactional replication using the on-prem VM as the publisher and the target VM as the subscriber. Since we have the transactional
replication option, that is C. So that's the correct answer.
upvoted 2 times

  Blaaa 2 weeks, 4 days ago


C is correct
upvoted 3 times
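Since the discussion converges on transactional replication, here is a rough sketch of how the on-premises publisher side is typically configured with the replication system stored procedures. The server, database, publication, and article names are hypothetical, and a distributor is assumed to be configured already:

```shell
# Run against the on-premises SQL Server 2016 publisher (names assumed).
sqlcmd -S OnPremSql -d WebApp1Db -Q "
  EXEC sp_replicationdboption @dbname = N'WebApp1Db',
       @optname = N'publish', @value = N'true';
  EXEC sp_addpublication @publication = N'WebApp1Pub',
       @status = N'active';
  EXEC sp_addarticle @publication = N'WebApp1Pub',
       @article = N'Orders', @source_object = N'Orders';
"
# The Azure-hosted database is then added as a subscriber (sp_addsubscription),
# kept in sync continuously, and the application is cut over once the
# subscriber has caught up, so downtime is limited to the cutover window.
```

This continuous-sync-then-cutover pattern is what satisfies the "database downtime must be minimized" requirement, in contrast to the offline VHD copy in option D.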


Question #2 Topic 11

Introductory Info
Case Study -
This is a case study. Case studies are not timed separately. You can use as much exam time as you would like to complete each case. However,
there may be additional case studies and sections on this exam. You must manage your time to ensure that you are able to complete all questions
included on this exam in the time provided.
To answer the questions included in a case study, you will need to reference information that is provided in the case study. Case studies might
contain exhibits and other resources that provide more information about the scenario that is described in the case study. Each question is
independent of the other questions in this case study.
At the end of this case study, a review screen will appear. This screen allows you to review your answers and to make changes before you move to
the next section of the exam. After you begin a new section, you cannot return to this section.

To start the case study -


To display the first question in this case study, click the Next button. Use the buttons in the left pane to explore the content of the case study
before you answer the questions. Clicking these buttons displays information such as business requirements, existing environment, and problem
statements. If the case study has an All Information tab, note that the information displayed is identical to the information displayed on the
subsequent tabs. When you are ready to answer a question, click the Question button to return to the question.
Existing Environment. Active Directory Environment
The network contains two Active Directory forests named corp.fabrikam.com and rd.fabrikam.com. There are no trust relationships between the
forests.
Corp.fabrikam.com is a production forest that contains identities used for internal user and computer authentication.
Rd.fabrikam.com is used by the research and development (R&D) department only.
Existing Environment. Network Infrastructure
Each office contains at least one domain controller from the corp.fabrikam.com domain. The main office contains all the domain controllers for
the rd.fabrikam.com forest.
All the offices have a high-speed connection to the Internet.
An existing application named WebApp1 is hosted in the data center of the London office. WebApp1 is used by customers to place and track
orders. WebApp1 has a web tier that uses Microsoft Internet Information Services (IIS) and a database tier that runs Microsoft SQL Server 2016.
The web tier and the database tier are deployed to virtual machines that run on Hyper-V.
The IT department currently uses a separate Hyper-V environment to test updates to WebApp1.
Fabrikam purchases all Microsoft licenses through a Microsoft Enterprise Agreement that includes Software Assurance.
Existing Environment. Problem Statements
The use of WebApp1 is unpredictable. At peak times, users often report delays. At other times, many resources for WebApp1 are underutilized.

Requirements. Planned Changes -


Fabrikam plans to move most of its production workloads to Azure during the next few years.
As one of its first projects, the company plans to establish a hybrid identity model, facilitating an upcoming Microsoft Office 365 deployment.
All R&D operations will remain on-premises.
Fabrikam plans to migrate the production and test instances of WebApp1 to Azure and to use the S1 plan.
Requirements. Technical Requirements
Fabrikam identifies the following technical requirements:
Web site content must be easily updated from a single point.
User input must be minimized when provisioning new web app instances.
Whenever possible, existing on-premises licenses must be used to reduce cost.
Users must always authenticate by using their corp.fabrikam.com UPN identity.
Any new deployments to Azure must be redundant in case an Azure region fails.
Whenever possible, solutions must be deployed to Azure by using the Standard pricing tier of Azure App Service.
An email distribution group named IT Support must be notified of any issues relating to the directory synchronization services.
Directory synchronization between Azure Active Directory (Azure AD) and corp.fabrikam.com must not be affected by a link failure between Azure
and the on- premises network.
Requirements. Database Requirements
Fabrikam identifies the following database requirements:
Database metrics for the production instance of WebApp1 must be available for analysis so that database administrators can optimize the
performance settings.
To avoid disrupting customer access, database downtime must be minimized when databases are migrated.
Database backups must be retained for a minimum of seven years to meet compliance requirements.


Requirements. Security Requirements


Fabrikam identifies the following security requirements:
Company information including policies, templates, and data must be inaccessible to anyone outside the company.
Users on the on-premises network must be able to authenticate to corp.fabrikam.com if an Internet link fails.
Administrators must be able to authenticate to the Azure portal by using their corp.fabrikam.com credentials.
All administrative access to the Azure portal must be secured by using multi-factor authentication.
The testing of WebApp1 updates must not be visible to anyone outside the company.

Question
You need to recommend a notification solution for the IT Support distribution group.
What should you include in the recommendation?

A. a SendGrid account with advanced reporting

B. Azure AD Connect Health

C. Azure Network Watcher

D. an action group

Correct Answer: B
Reference:
https://docs.microsoft.com/en-us/azure/active-directory/hybrid/how-to-connect-health-operations

  mchint01 3 months, 4 weeks ago


Correct
upvoted 4 times

  Hanger_Man 2 months, 1 week ago


correct
upvoted 3 times

  Blaaa 2 weeks, 4 days ago


Correct
upvoted 1 times

Topic 12 - More Questions.


Question #1 Topic 12

You plan to deploy an application named App1 that will run on five Azure virtual machines. Additional virtual machines will be deployed later to run
App1.
You need to recommend a solution to meet the following requirements for the virtual machines that will run App1:
✑ Ensure that the virtual machines can authenticate to Azure Active Directory (Azure AD) to gain access to an Azure key vault, Azure Logic Apps
instances, and an Azure SQL database.
✑ Avoid assigning new roles and permissions for Azure services when you deploy additional virtual machines.
✑ Avoid storing secrets and certificates on the virtual machines.
Which type of identity should you include in the recommendation?

A. a service principal that is configured to use a certificate

B. a system-assigned managed identity

C. a service principal that is configured to use a client secret

D. a user-assigned managed identity

Correct Answer: D
Managed identities for Azure resources is a feature of Azure Active Directory.
User-assigned managed identity can be shared. The same user-assigned managed identity can be associated with more than one Azure
resource.
Incorrect Answers:
B: System-assigned managed identity cannot be shared. It can only be associated with a single Azure resource.
Reference:
https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/overview

  Tombarc 5 months, 1 week ago


correct answer
upvoted 7 times

  razvi 5 months ago


A and C are wrong because they are excluded by the text "Avoid storing secrets and certificates on the virtual machines" in the question.
upvoted 6 times

  levianthan 4 months, 2 weeks ago


And B is wrong because you need the same identity for all VMs.
upvoted 5 times

  MRBEAST 3 months ago


correct
upvoted 3 times

  HST_ENG 2 months, 2 weeks ago


https://docs.microsoft.com/en-us/azure/active-directory/managed-identities-azure-resources/overview

Answer is correct...please check table with heading "The table below shows the differences between the two types of managed identities."
upvoted 1 times

  Blaaa 2 weeks, 4 days ago


Correct
upvoted 1 times
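The shared user-assigned identity pattern can be sketched with the Azure CLI. The resource group, vault, identity, and VM names below are assumptions for illustration:

```shell
# Create one user-assigned managed identity for App1.
az identity create --resource-group app1-rg --name app1-identity

# Grant it access once, e.g. to the key vault (vault name assumed);
# similar one-time grants would cover the Logic Apps and SQL resources.
az keyvault set-policy --name app1-vault \
  --object-id "$(az identity show -g app1-rg -n app1-identity \
                   --query principalId -o tsv)" \
  --secret-permissions get list

# Attach the same identity to every VM that runs App1; VMs deployed
# later just repeat this step and inherit the existing permissions,
# with no new role assignments needed.
for vm in app1-vm1 app1-vm2 app1-vm3 app1-vm4 app1-vm5; do
  az vm identity assign --resource-group app1-rg --name "$vm" \
    --identities app1-identity
done
```

A system-assigned identity would instead create one principal per VM, forcing a new grant for every VM added later, which is the distinction the comments point out.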

