AI-100.prepaway.premium.2923.exam.156q_TLkzTyY
156q
Number: AI-100
Passing Score: 800
Time Limit: 120 min
File Version: 7.0
AI-100
Version 7.0
Analyze solution requirements
Question Set 1
QUESTION 1
HOTSPOT
You are designing an application to parse images of business forms and upload the data to a database.
The upload process will occur once a week.
You need to recommend which services to use for the application. The solution must minimize
infrastructure costs.
Which services should you recommend? To answer, select the appropriate options in the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Box 1: Azure Cognitive Services
Azure Cognitive Services include image-processing algorithms to smartly identify, caption, index, and
moderate your pictures and videos.
Incorrect: The Azure Linguistic Analytics API provides advanced natural language processing over raw text, not image processing.
Box 2: Azure Data Factory
Azure Data Factory provides access to on-premises data in SQL Server and cloud data in Azure Storage (Blob and Tables) and Azure SQL Database.
References:
https://azure.microsoft.com/en-us/services/cognitive-services/
https://www.jamesserra.com/archive/2014/11/what-is-azure-data-factory/
QUESTION 2
HOTSPOT
You plan to deploy an Azure Data Factory pipeline that will perform the following:
You need to recommend which technologies the pipeline should use. The solution must minimize custom
code.
What should you include in the recommendation? To answer, select the appropriate options in the answer
area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Incorrect:
Not Azure-SSIS Integration Runtime, as you would need to write custom code.
Not Azure API Management: Use Azure API Management as a turnkey solution for publishing APIs to external and internal customers.
References:
https://docs.microsoft.com/en-us/azure/data-factory/concepts-integration-runtime
https://docs.microsoft.com/en-us/azure/logic-apps/logic-apps-examples-and-scenarios
QUESTION 3
HOTSPOT
You need to build an interactive website that will accept uploaded images, and then ask a series of
predefined questions based on each image.
Which services should you use? To answer, select the appropriate options in the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
References:
https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/
QUESTION 4
You are designing an AI solution that will analyze millions of pictures by using an Azure HDInsight Hadoop cluster.
You need to recommend a solution for storing the pictures. The solution must minimize costs.
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Data Lake Storage is generally a bit more expensive than Blob storage, although the two are in a close price range. Blob storage has more options for pricing depending on factors such as how frequently you need to access your data (cool vs. hot access tiers).
Reference:
http://blog.pragmaticworks.com/azure-data-lake-vs-azure-blob-storage-in-data-warehousing
QUESTION 5
You are configuring data persistence for a Microsoft Bot Framework application. The application requires a
structured NoSQL cloud data store.
You need to identify a storage solution for the application. The solution must minimize costs.
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Table storage is a NoSQL key-value store for rapid development using massive semi-structured datasets. You can develop applications on Cosmos DB by using popular NoSQL APIs.
Azure Table storage is aimed at high capacity in a single region (with an optional read-only secondary region but no failover), indexing by partition key/row key, and storage-optimized pricing. The Azure Cosmos DB Table API aims for high throughput (single-digit millisecond latency), global distribution (multiple failover regions), SLA-backed predictable performance with automatic indexing of every attribute/property, and a pricing model focused on throughput.
References:
https://db-engines.com/en/system/Microsoft+Azure+Cosmos+DB%3BMicrosoft+Azure+Table+Storage
QUESTION 6
You have an Azure Machine Learning model that is deployed to a web service.
You plan to publish the web service by using the name ml.contoso.com.
You need to recommend a solution to ensure that access to the web service is encrypted.
Which three actions should you recommend? Each correct answer presents part of the solution.
Explanation/Reference:
The process of securing a new web service or an existing one is as follows:
1. Get a domain name.
2. Get an SSL certificate.
3. Deploy or update the web service with SSL enabled.
4. Update your DNS to point to the web service.
Note: To deploy (or re-deploy) the service with SSL enabled, set the ssl_enabled parameter to True,
wherever applicable. Set the ssl_certificate parameter to the value of the certificate file and the ssl_key to
the value of the key file.
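For illustration, a minimal sketch of an SSL-enabled deployment configuration using the Azure Machine Learning SDK for Python (azureml-core); the AKS target name, certificate and key file paths, and the workspace config are placeholder assumptions, and parameter names may vary by SDK version.

# Sketch only: assumes the azureml-core SDK and an existing AKS compute target
# named "aks-target" in the workspace. File paths and names are placeholders.
from azureml.core import Workspace
from azureml.core.compute import AksCompute
from azureml.core.webservice import AksWebservice

ws = Workspace.from_config()                      # loads config.json for the workspace
aks_target = AksCompute(ws, "aks-target")         # existing AKS inference cluster

# Enable SSL for the scoring endpoint; the cert and key correspond to ml.contoso.com.
deployment_config = AksWebservice.deploy_configuration(
    ssl_enabled=True,
    ssl_cert_pem_file="cert.pem",                 # certificate issued for ml.contoso.com
    ssl_key_pem_file="key.pem",                   # private key for the certificate
    ssl_cname="ml.contoso.com",
)
# deployment_config would then be passed to Model.deploy(...) together with the model,
# inference configuration, and aks_target. A DNS CNAME record for ml.contoso.com must
# also point at the public endpoint of the deployed service.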
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-secure-web-service
QUESTION 7
Your company recently deployed several hardware devices that contain sensors.
The sensors generate new data on an hourly basis. The data generated is stored on-premises and retained
for several years.
During the past two months, the sensors generated 300 GB of data.
You plan to move the data to Azure and then perform advanced analytics on the data.
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/data-storage
QUESTION 8
You plan to design an application that will use data from Azure Data Lake and perform sentiment analysis
by using Azure Machine Learning algorithms.
The developers of the application use a mix of Windows- and Linux-based environments. The developers
contribute to shared GitHub repositories.
You need all the developers to use the same tool to develop the application.
What is the best tool to use? More than one answer choice may achieve the goal.
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
References:
https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/machine-learning/studio/algorithm-choice.md
QUESTION 9
You have several AI applications that use an Azure Kubernetes Service (AKS) cluster. The cluster supports
a maximum of 32 nodes.
You discover that occasionally and unpredictably, the applications require more than 32 nodes.
Correct Answer: AB
Section: (none)
Explanation
Explanation/Reference:
Explanation:
B: To keep up with application demands in Azure Kubernetes Service (AKS), you may need to adjust the
number of nodes that run your workloads. The cluster autoscaler component can watch for pods in your
cluster that can't be scheduled because of resource constraints. When issues are detected, the number of
nodes is increased to meet the application demand. Nodes are also regularly checked for a lack of running
pods, with the number of nodes then decreased as needed. This ability to automatically scale up or down
the number of nodes in your AKS cluster lets you run an efficient, cost-effective cluster.
A: You can also use the horizontal pod autoscaler to automatically adjust the number of pods that run your
application.
Reference:
https://docs.microsoft.com/en-us/azure/aks/cluster-autoscaler
QUESTION 10
You deploy an infrastructure for a big data workload.
You need to run Azure HDInsight and Microsoft Machine Learning Server. You plan to set the RevoScaleR
compute contexts to run rx function calls in parallel.
What are three compute contexts that you can use for Machine Learning Server? Each correct answer
presents a complete solution.
A. SQL
B. Spark
C. local parallel
D. HBase
E. local sequential
Explanation/Reference:
Explanation:
Remote computing is available for specific data sources on selected platforms. The product documentation lists the supported combinations.
RxInSqlServer, sqlserver: Remote compute context. The target server is a single database node (SQL Server 2016 R Services or SQL Server 2017 Machine Learning Services). Computation is parallel, but not distributed.
RxSpark, spark: Remote compute context. The target is a Spark cluster on Hadoop.
RxLocalParallel, localpar: This compute context is often used to enable controlled, distributed computations that rely on instructions you provide rather than a built-in scheduler on Hadoop. You can use this compute context for manual distributed computing.
References:
https://docs.microsoft.com/en-us/machine-learning-server/r/concept-what-is-compute-context
QUESTION 11
Your company has 1,000 AI developers who are responsible for provisioning environments in Azure.
You need to control the type, size, and location of the resources that the developers can provision.
Correct Answer: E
Section: (none)
Explanation
Explanation/Reference:
Explanation:
When an application needs access to deploy or configure resources through Azure Resource Manager in
Azure Stack, you create a service principal, which is a credential for your application. You can then delegate
only the necessary permissions to that service principal.
References:
https://docs.microsoft.com/en-us/azure/azure-stack/azure-stack-create-service-principals
QUESTION 12
You are designing an AI solution in Azure that will perform image classification.
You need to identify which processing platform will provide you with the ability to update the logic over time.
The solution must have the lowest latency for inferencing without having to batch.
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
FPGAs, such as those available on Azure, provide performance close to ASICs. They are also flexible and
reconfigurable over time, to implement new logic.
Incorrect Answers:
D: ASICs, such as Google's Tensor Processing Units (TPUs), are custom circuits that provide the highest efficiency. They can't be reconfigured as your needs change.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/concept-accelerate-with-fpgas
QUESTION 13
You have a solution that runs on a five-node Azure Kubernetes Service (AKS) cluster. The cluster uses an
N-series virtual machine.
You need to recommend a solution to maintain the cluster configuration when the cluster is not in use. The
solution must not incur any compute costs.
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
An AKS cluster has one or more nodes.
References:
https://docs.microsoft.com/en-us/azure/aks/concepts-clusters-workloads
QUESTION 14
HOTSPOT
You are designing an AI solution that will be used to find buildings in aerial pictures.
Users will upload the pictures to an Azure Storage account. A separate JSON document will contain metadata for the pictures.
Store metadata for the pictures in a data store.
Run a custom vision Azure Machine Learning module to identify the buildings in a picture and the
position of the buildings’ edges.
Run a custom mathematical module to calculate the dimensions of the buildings in a picture based on
the metadata and data from the vision module.
You need to identify which Azure infrastructure services are used for each component of the AI workflow.
The solution must execute as quickly as possible.
What should you identify? To answer, select the appropriate options in the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Box 2: NV
The NV-series enables powerful remote visualisation workloads and other graphics-intensive applications
backed by the NVIDIA Tesla M60 GPU.
Note: The N-series is a family of Azure Virtual Machines with GPU capabilities. GPUs are ideal for compute
and graphics-intensive workloads, helping customers to fuel innovation through scenarios like high-end
remote visualisation, deep learning and predictive analytics.
Box 3: F
F-series VMs feature a higher CPU-to-memory ratio. Example use cases include batch processing, web
servers, analytics and gaming.
Incorrect:
A-series VMs have CPU performance and memory configurations best suited for entry level workloads like
development and test.
References:
https://azure.microsoft.com/en-in/pricing/details/virtual-machines/series/
QUESTION 15
Your company has recently deployed 5,000 Internet-connected sensors for a planned AI solution.
You need to recommend a computing solution to perform a real-time analysis of the data generated by the
sensors.
B. Azure Notification Hubs
C. an Azure HDInsight Hadoop cluster
D. an Azure HDInsight R cluster
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Azure HDInsight makes it easy, fast, and cost-effective to process massive amounts of data.
You can use HDInsight to process streaming data that's received in real time from a variety of devices.
References:
https://docs.microsoft.com/en-us/azure/hdinsight/hadoop/apache-hadoop-introduction
QUESTION 16
HOTSPOT
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You plan to deploy an application that will perform image recognition. The application will store image data
in two Azure Blob storage stores named Blob1 and Blob2.
You need to recommend a security solution that meets the following requirements:
What should you recommend using to control access to each blob store? To answer, select the appropriate
options in the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/storage/common/storage-auth
QUESTION 17
You deploy an application that performs sentiment analysis on the data stored in Azure Cosmos DB.
Recently, you loaded a large amount of data to the database. The data was for a customer named Contoso,
Ltd.
You discover that queries for the Contoso data are slow to complete, and the queries slow the entire
application.
You need to reduce the amount of time it takes for the queries to complete. The solution must minimize
costs.
What is the best way to achieve the goal? More than one answer choice may achieve the goal. Select the
BEST answer.
D. Migrate the data to the Cosmos DB database.
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Throughput provisioned for a container is divided evenly among physical partitions.
Incorrect:
Not A: Increasing request units would also improve throughput, but at a cost.
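For illustration, a minimal sketch of provisioning container throughput and a partition key with the Azure Cosmos DB SDK for Python (azure-cosmos); the account endpoint, key, database, container, and partition key path are placeholder assumptions.

# Sketch only: endpoint, key, and names below are placeholders.
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<primary-key>")
database = client.create_database_if_not_exists("sentimentdb")

# Provisioned RU/s are divided evenly across the container's physical partitions,
# so a partition key that spreads load (for example, per customer) matters.
container = database.create_container_if_not_exists(
    id="feedback",
    partition_key=PartitionKey(path="/customerId"),
    offer_throughput=400,                         # RU/s provisioned for the container
)

# Queries scoped to a single partition key value avoid expensive cross-partition fan-out:
items = container.query_items(
    query="SELECT * FROM c WHERE c.customerId = @cid",
    parameters=[{"name": "@cid", "value": "contoso"}],
    partition_key="contoso",
)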
Reference:
https://docs.microsoft.com/en-us/azure/architecture/best-practices/data-partitioning
QUESTION 18
You have an AI application that uses keys in Azure Key Vault.
Recently, a key used by the application was deleted accidentally and was unrecoverable.
You need to ensure that if a key is deleted, it is retained in the key vault for 90 days.
Which two features should you configure? Each correct answer presents part of the solution.
Correct Answer: BC
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/architecture/best-practices/data-partitioning
QUESTION 19
DRAG DROP
You are designing an AI solution that will analyze media data. The data will be stored in Azure Blob storage.
You need to ensure that the storage account is encrypted by using a key generated by the hardware
security module (HSM) of your company.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the
list of actions to the answer area and arrange them in the correct order.
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/storage/common/storage-encryption-keys-portal
https://docs.microsoft.com/en-us/azure/key-vault/key-vault-hsm-protected-keys
QUESTION 20
You plan to implement a new data warehouse for a planned AI solution.
Which two solutions should you include in the recommendation? Each correct answer presents a complete solution.
A. Apache Hadoop
B. Apache Spark
C. A Microsoft Azure SQL database
D. An Azure virtual machine that runs Microsoft SQL Server
Correct Answer: AB
Section: (none)
Explanation
Explanation/Reference:
QUESTION 21
You need to build a solution to monitor Twitter. The solution must meet the following requirements:
Send an email message to the marketing department when negative Twitter messages are detected.
Run sentiment analysis on Twitter messages that mention specific tags.
Use the least amount of custom code possible.
Which two services should you include in the solution? Each correct answer presents part of the solution.
A. Azure Databricks
B. Azure Stream Analytics
C. Azure Functions
D. Azure Cognitive Services
E. Azure Logic Apps
Correct Answer: BE
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/streaming-technologies
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-twitter-sentiment-analysis-trends
QUESTION 22
HOTSPOT
You need to configure security for an Azure Machine Learning service used by groups of data scientists.
The groups must have access to only their own experiments and must be able to grant permissions to the
members of their team.
What should you do? To answer, select the appropriate options in the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/machine-learning-server/operationalize/configure-roles#how-are-roles-assigned
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-assign-roles
QUESTION 23
You plan to build an application that will perform predictive analytics. Users will be able to consume the
application data by using Microsoft Power BI or a custom website.
Which auditing solution should you use?
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/active-directory/reports-monitoring/concept-audit-logs
QUESTION 24
HOTSPOT
You need to build a sentiment analysis solution that will use input data from JSON documents and PDF
documents. The JSON documents must be processed in batches and aggregated.
Which storage type should you use for each file type? To answer, select the appropriate options in the
answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Data storage
Azure Storage Blob Containers. Many existing Azure business processes already use Azure blob
storage, making this a good choice for a big data store.
Azure Data Lake Store. Azure Data Lake Store offers virtually unlimited storage for any size of file, and
extensive security options, making it a good choice for extremely large-scale big data solutions that
require a centralized store for data in heterogeneous formats.
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/big-data/batch-processing
https://docs.microsoft.com/bs-latn-ba/azure/storage/blobs/storage-blobs-introduction
QUESTION 25
You are developing a mobile application that will perform optical character recognition (OCR) from photos.
The application will annotate the photos by using metadata, store the photos in Azure Blob storage, and
then score the photos by using an Azure Machine Learning model.
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
By using Azure services such as the Computer Vision API and Azure Functions, companies can eliminate
the need to manage individual servers, while reducing costs and leveraging the expertise that Microsoft has
already developed around processing images with Cognitive Services.
Incorrect:
Not E: The Azure Batch AI service was retired in 2019 and was replaced with Azure Machine Learning
Compute.
References:
https://docs.microsoft.com/en-us/azure/architecture/example-scenario/ai/intelligent-apps-image-processing
QUESTION 26
You create an Azure Cognitive Services resource.
A data scientist needs to call the resource from Azure Logic Apps by using the generic HTTP connector.
Which two values should you provide to the data scientist? Each correct answer presents part of the
solution.
A. Endpoint URL
B. Resource name
C. Access key
D. Resource group name
E. Subscription ID
Correct Answer: AC
Section: (none)
Explanation
Explanation/Reference:
References:
https://social.technet.microsoft.com/wiki/contents/articles/36074.logic-apps-with-azure-cognitive-service.aspx
QUESTION 27
You plan to deploy an AI solution that tracks the behavior of 10 custom mobile apps. Each mobile app has
several thousand users.
You need to recommend a solution for real-time data ingestion for the data originating from the mobile app
users.
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-in/azure/event-hubs/event-hubs-about
QUESTION 28
You plan to deploy Azure IoT Edge devices that will each store more than 10,000 images locally and
classify the images by using a Custom Vision Service classifier.
You need to ensure that the images persist on the devices for 14 days.
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/iot-edge/how-to-store-data-blob
QUESTION 29
Your company is building custom models that integrate into microservices architecture on Azure
Kubernetes Services (AKS).
You need to update the model and enable Azure Application Insights for the model.
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
Explanation:
You can set up Azure Application Insights for Azure Machine Learning. Application Insights gives you the opportunity to monitor:
Request rates, response times, and failure rates.
Dependency rates, response times, and failure rates.
Exceptions.
Requirements include an Azure Machine Learning workspace, a local directory that contains your scripts,
and the Azure Machine Learning SDK for Python installed.
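A minimal sketch of enabling Application Insights on an existing deployment with the Azure Machine Learning SDK for Python; the workspace configuration and service name are placeholder assumptions.

# Sketch only: assumes azureml-core is installed, a config.json exists for the
# workspace, and a deployed web service named "sentiment-svc" already exists.
from azureml.core import Workspace
from azureml.core.webservice import Webservice

ws = Workspace.from_config()
service = Webservice(workspace=ws, name="sentiment-svc")

# Turn on Application Insights telemetry (requests, latency, failures, custom logs)
# for the existing deployment without redeploying the model.
service.update(enable_app_insights=True)
print(service.state)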
References:
https://docs.microsoft.com/bs-latn-ba/azure/machine-learning/service/how-to-enable-app-insights
QUESTION 30
You are designing an AI solution that will analyze millions of pictures by using an Azure HDInsight Hadoop cluster.
You need to recommend a solution for storing the pictures. The solution must minimize costs.
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Azure Data Lake Storage Gen1 is adequate and less expensive compared to Gen2.
References:
https://visualbi.com/blogs/microsoft/introduction-azure-data-lake-gen2/
QUESTION 31
You deploy an application that performs sentiment analysis on the data stored in Azure Cosmos DB.
Recently, you loaded a large amount of data to the database. The data was for a customer named Contoso,
Ltd.
You discover that queries for the Contoso data are slow to complete, and the queries slow the entire
application.
You need to reduce the amount of time it takes for the queries to complete. The solution must minimize
costs.
What should you do? More than one answer choice may achieve the goal. (Choose two.)
Correct Answer: AB
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Increasing request units would improve throughput, but at a cost.
References:
https://docs.microsoft.com/en-us/azure/architecture/best-practices/data-partitioning
QUESTION 32
Your company has several AI solutions and bots.
You need to implement a solution to monitor the utilization of the bots. The solution must ensure that
analysts at the company can generate dashboards to review the utilization.
A. Azure Application Insights
B. Azure Data Explorer
C. Azure Logic Apps
D. Azure Monitor
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Bot Analytics.
Analytics is an extension of Application Insights. Application Insights provides service-level and
instrumentation data like traffic, latency, and integrations. Analytics provides conversation-level reporting on
user, message, and channel data.
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-service-manage-analytics
QUESTION 33
You plan to design a bot that will be hosted by using Azure Bot Service.
Your company identifies the following compliance requirements for the bot:
You need to identify which compliance requirements are met by hosting the bot in the bot service.
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Azure Bot service is compliant with ISO 27001:2013, ISO 27019:2014, SOC 1 and 2, Payment Card
Industry Data Security Standard (PCI DSS), and Health Insurance Portability and Accountability Act
Business Associate Agreement (HIPAA BAA).
Microsoft products and services, including Azure Bot Service, are available today to help you meet the
GDPR requirements.
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-service-compliance
https://blog.botframework.com/2018/04/23/general-data-protection-regulation-gdpr/
QUESTION 34
HOTSPOT
You plan to use Azure Cognitive Services to provide the development team at your company with the ability
to create intelligent apps without having direct AI or data science skills.
The company identifies the following requirements for the planned Cognitive Services deployment:
Provide support for the following languages: English, Portuguese, and German.
Perform text analytics to derive a sentiment score.
Which Cognitive Service service should you deploy for each requirement? To answer, select the
appropriate options in the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
The Language Detection feature of the Azure Text Analytics REST API evaluates text input for each
document and returns language identifiers with a score that indicates the strength of the analysis.
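For illustration, a minimal sketch of calling the language detection endpoint directly over REST; the resource endpoint, key, and API version path are placeholder assumptions.

# Sketch only: replace the endpoint and key with your Text Analytics resource values.
import requests

endpoint = "https://<your-resource>.cognitiveservices.azure.com"
key = "<your-text-analytics-key>"

documents = {"documents": [
    {"id": "1", "text": "Hello world"},
    {"id": "2", "text": "Olá, tudo bem?"},
    {"id": "3", "text": "Guten Morgen"},
]}

# The service returns a detected language and a confidence score per document.
response = requests.post(
    f"{endpoint}/text/analytics/v3.0/languages",
    headers={"Ocp-Apim-Subscription-Key": key},
    json=documents,
)
for doc in response.json()["documents"]:
    print(doc["id"], doc["detectedLanguage"]["name"], doc["detectedLanguage"]["confidenceScore"])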
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/how-tos/text-analytics-how-to-language-detection
https://docs.microsoft.com/en-us/azure/azure-databricks/databricks-sentiment-analysis-cognitive-services
QUESTION 35
HOTSPOT
You plan to deploy the Text Analytics and Computer Vision services. The Azure Cognitive Services will be
deployed to the West US and East Europe Azure regions.
You need to identify the minimum number of service endpoints and API keys required for the planned
deployment.
What should you identify? To answer, select the appropriate options in the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Box 1: 2
After creating a Cognitive Service resource in the Azure portal, you'll get an endpoint and a key for
authenticating your applications. You can access Azure Cognitive Services through two different resources:
A multi-service resource, or a single-service one.
Multi-service resource: Access multiple Azure Cognitive Services with a single key and endpoint.
Note: You need a key and endpoint for a Text Analytics resource. Azure Cognitive Services are represented
by Azure resources that you subscribe to.
Each request must include your access key and an HTTP endpoint. The endpoint specifies the region you chose during sign up, the service URL, and a resource used on the request.
Box 2: 2
You need at least one key per region.
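For illustration, a minimal sketch of how a single multi-service key and endpoint can serve both planned services; the endpoint, key, sample image URL, and API version paths are placeholder assumptions.

# Sketch only: assumes a multi-service Cognitive Services resource; one key and
# one regional endpoint serve both Text Analytics and Computer Vision requests.
import requests

endpoint = "https://<your-multi-service-resource>.cognitiveservices.azure.com"
key = "<your-cognitive-services-key>"
headers = {"Ocp-Apim-Subscription-Key": key}

# Text Analytics: sentiment for one document.
sentiment = requests.post(
    f"{endpoint}/text/analytics/v3.0/sentiment",
    headers=headers,
    json={"documents": [{"id": "1", "language": "en", "text": "The rooms were great."}]},
)
print(sentiment.json()["documents"][0]["sentiment"])

# Computer Vision: describe an image by URL, using the same key and endpoint.
vision = requests.post(
    f"{endpoint}/vision/v3.0/analyze",
    headers=headers,
    params={"visualFeatures": "Description"},
    json={"url": "https://example.com/sample.jpg"},
)
print(vision.json()["description"]["captions"])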
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/cognitive-services-apis-create-account
QUESTION 36
Your company plans to create a mobile app that will be used by employees to query the employee
handbook.
You need to ensure that the employees can query the handbook by typing or by using speech.
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Azure Cognitive Search (formerly known as "Azure Search") is a search-as-a-service cloud solution that
gives developers APIs and tools for adding a rich search experience over private, heterogeneous content in
web, mobile, and enterprise applications. Your code or a tool invokes data ingestion (indexing) to create
and load an index. Optionally, you can add cognitive skills to apply AI processes during indexing. Doing so
can add new information and structures useful for search and other scenarios.
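For illustration, a minimal sketch of querying such an index with the azure-search-documents package; the search endpoint, index name, query key, and field names are placeholder assumptions.

# Sketch only: assumes an index named "employee-handbook" already exists and was
# populated by an indexer; endpoint, key, and field names are placeholders.
from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient

client = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",
    index_name="employee-handbook",
    credential=AzureKeyCredential("<query-key>"),
)

# Full-text query over the indexed handbook; speech input would be converted to
# text first (for example, with the Speech service) and passed in the same way.
results = client.search(search_text="How many vacation days do new employees get?")
for result in results:
    print(result["title"], result["@search.score"])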
Incorrect Answers:
B: QnA Maker is a cloud-based API service that lets you create a conversational question-and-answer layer
over your existing data. Use it to build a knowledge base by extracting questions and answers from your
semi-structured content, including FAQs, manuals, and documents. Answer users’ questions with the best
answers from the QnAs in your knowledge base—automatically.
References:
https://docs.microsoft.com/en-us/azure/search/search-what-is-azure-search
QUESTION 37
You have an existing Language Understanding (LUIS) model for an internal bot.
You need to recommend a solution to add a meeting reminder functionality to the bot by using a prebuilt
model. The solution must minimize the size of the model.
A. domain
B. intents
C. entities
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
LUIS includes a set of prebuilt entities for recognizing common types of information, like dates, times,
numbers, measurements, and currency. Prebuilt entity support varies by the culture of your LUIS app.
Note: LUIS provides three types of prebuilt models: prebuilt domains, prebuilt intents, and prebuilt entities. Each model can be added to your app at any time.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-concept-prebuilt-model
QUESTION 38
You have an on-premises repository that contains 5,000 videos. The videos feature demonstrations of the
products sold by your company.
The company’s customers plan to search the videos by using the name of the product demonstrated in
each video.
D. Deploy a Custom Vision API service.
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Azure Media Services can be used to encode and package content, stream videos on demand, broadcast live, and analyze videos with Media Services v3.
You can analyze recorded video or audio content. For example, to achieve higher customer satisfaction, organizations can extract speech-to-text and build search indexes and dashboards. They can then extract intelligence around common complaints, sources of complaints, and other relevant data.
References:
https://docs.microsoft.com/en-us/azure/media-services/latest/media-services-overview
QUESTION 39
Your company manages a sports team.
The company sets up a video booth to record messages for the team.
Before replaying the messages on a video screen, you need to generate captions for the messages and
check the sentiment of the video to ensure that only positive messages are played.
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Video Indexer includes Audio transcription: Converts speech to text in 12 languages and allows extensions.
Supported languages include English, Spanish, French, German, Italian, Mandarin Chinese, Japanese,
Arabic, Russian, Portuguese, Hindi, and Korean.
When indexing by one channel, partial results for those models are available, such as sentiment analysis, which identifies positive, negative, and neutral sentiments from speech and visual text.
Reference:
https://docs.microsoft.com/en-us/azure/media-services/video-indexer/video-indexer-overview
QUESTION 40
HOTSPOT
You plan to build an app that will provide users with the ability to dictate messages and convert the
messages into text.
You need to recommend a solution to meet the following requirements for the app:
Must be able to transcribe streaming dictated messages that are longer than 15 seconds.
Must be able to upload existing recordings to Azure Blob storage to be transcribed later.
Which solution should you recommend for each requirement? To answer, select the appropriate options in
the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Reference:
https://github.com/Azure-Samples/cognitive-services-speech-sdk/issues/13
https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/batch-transcription
QUESTION 41
Your company plans to monitor Twitter hashtags, and then to build a graph of connected people and places that contains the associated sentiment.
The monitored hashtags use several languages, but the graph will be displayed in English.
You need to recommend the required Azure Cognitive Services endpoints for the planned graph.
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Sentiment analysis, which is also called opinion mining, uses social media analytics tools to determine
attitudes toward a product or idea.
Translator Text: Translate text in real time across more than 60 languages, powered by the latest
innovations in machine translation.
The Key Phrase Extraction skill evaluates unstructured text, and for each record, returns a list of key
phrases. This skill uses the machine learning models provided by Text Analytics in Cognitive Services.
This capability is useful if you need to quickly identify the main talking points in the record. For example,
given input text "The food was delicious and there were wonderful staff", the service returns "food" and
"wonderful staff".
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/how-tos/text-analytics-how-to-entity-linking
https://docs.microsoft.com/en-us/azure/search/cognitive-search-skill-keyphrases
QUESTION 42
HOTSPOT
You plan to create an intelligent bot to handle internal user chats to the help desk of your company. The bot
has the following requirements:
Which solution should you recommend for each requirement? To answer, drag the appropriate solutions to
the correct requirements. Each solution may be used once, more than once, or not at all. You may need to
drag the split bar between panes or scroll to view content.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Incorrect Answers:
Dispatch tool library:
If a bot uses multiple LUIS models and QnA Maker knowledge bases, you can use the Dispatch tool to determine which LUIS model or QnA Maker knowledge base best matches the user input. The Dispatch tool does this by creating a single LUIS app to route user input to the correct model.
Reference:
https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-tutorial-dispatch
https://docs.microsoft.com/en-us/azure/cognitive-services/qnamaker/overview/overview
Analyze solution requirements
Testlet 2
Overview
Contoso, Ltd. has an office in New York to serve its North American customers and an office in Paris to
serve its European customers.
Existing Environment
Infrastructure
Each office has a small data center that hosts Active Directory services and a few off-the-shelf software
solutions used by internal users.
The network contains a single Active Directory forest that contains a single domain named contoso.com.
Azure Active Directory (Azure AD) Connect is used to extend identity management to Azure.
The company has an Azure subscription. Each office has an Azure ExpressRoute connection to the
subscription. The New York office connects to a virtual network hosted in the US East 2 Azure region. The
Paris office connects to a virtual network hosted in the West Europe Azure region.
The New York office has an Azure Stack Development Kit (ASDK) deployment that is used for development
and testing.
Contoso has a web app named Bookings hosted in an App Service Environment (ASE). The ASE is in the
virtual network in the East US 2 region. Contoso employees and customers use Bookings to reserve hotel
rooms.
Data Environment
Bookings connects to a Microsoft SQL Server database named hotelDB in the New York office.
The database has a view named vwAvailability that consolidates columns from the tables named Hotels,
Rooms, and RoomAvailability. The database contains data that was collected during the last 20 years.
Problem Statements
Contoso identifies the following issues with its current business model:
European users report that access to Bookings is slow, and they lose customers who must wait on the
phone while they search for available rooms.
Users report that Bookings was unavailable during an outage in the New York data center for more than
24 hours.
Requirements
Business Goals
Contoso wants to provide a new version of the Bookings app that will provide a highly available, reliable
service for booking travel packages by interacting with a chatbot named Butler.
Technical requirements
Contoso identifies the following technical requirements:
QUESTION 1
You need to recommend a data storage solution that meets the technical requirements.
What is the best data storage solution to recommend? More than one answer choice may achieve the goal.
Select the BEST answer.
A. Azure Databricks
B. Azure SQL Database
C. Azure Table storage
D. Azure Cosmos DB
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/architecture/example-scenario/ai/commerce-chatbot
Design solutions
Question Set 1
QUESTION 1
You plan to deploy two AI applications named AI1 and AI2. The data for the applications will be stored in a
relational database.
You need to ensure that the users of AI1 and AI2 can see only data in each user’s respective geographic
region. The solution must be enforced at the database level by using row-level security.
Which database solution should you use to store the application data?
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Row-level security is supported by SQL Server, Azure SQL Database, and Azure SQL Data Warehouse.
References:
https://docs.microsoft.com/en-us/sql/relational-databases/security/row-level-security?view=sql-server-2017
QUESTION 2
You are designing an AI workflow that will aggregate data stored in Azure as JSON documents.
You need to choose the data storage service for the data. The solution must minimize costs.
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Generally, Data Lake will be a bit more expensive although they are in close range of each other. Blob
storage has more options for pricing depending upon things like how frequently you need to access your
data (cold vs hot storage). Data Lake is priced on volume, so it will go up as you reach certain tiers of
volume.
References:
http://blog.pragmaticworks.com/azure-data-lake-vs-azure-blob-storage-in-data-warehousing
QUESTION 3
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You are developing an application that uses an Azure Kubernetes Service (AKS) cluster.
Solution: You run the kubectl command, and then you create an SSH connection.
A. Yes
B. No
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
QUESTION 4
Your company has a data team of Scala and R experts.
You need to recommend a processing technology to broker messages at scale from Kafka streams to
Azure Storage.
A. Azure Databricks
B. Azure Functions
C. Azure HDInsight with Apache Storm
D. Azure HDInsight with Microsoft Machine Learning Server
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/hdinsight/hdinsight-streaming-at-scale-overview?toc=https%3A%2F%2Fdocs.microsoft.com%2Fen-us%2Fazure%2Fhdinsight%2Fhadoop%2FTOC.json&bc=https%3A%2F%2Fdocs.microsoft.com%2Fen-us%2Fazure%2Fbread%2Ftoc.json
QUESTION 5
You are designing an AI application that will use an Azure Machine Learning Studio experiment.
The source data contains more than 200 TB of relational tables. The experiment will run once a month.
You need to identify a data storage solution for the application. The solution must minimize compute costs.
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
References:
https://azure.microsoft.com/en-us/pricing/details/sql-database/single/
QUESTION 6
HOTSPOT
You are designing a solution that will analyze bank transactions in real time. The transactions will be
evaluated by using an algorithm and classified into one of five groups. The transaction data will be enriched
with information taken from Azure SQL Database before the transactions are sent to the classification
process. The enrichment process will require custom code. Data from different banks will require different
stored procedures.
Which components should you use for data ingestion and data preparation? To answer, select the
appropriate options in the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/bs-latn-ba/azure/architecture/example-scenario/data/fraud-detection
QUESTION 7
DRAG DROP
You are designing an Azure Batch AI solution that will be used to train many different Azure Machine
Learning models. The solution will perform the following:
Image recognition
Deep learning that uses convolutional neural networks.
You need to select a compute infrastructure for each model. The solution must minimize the processing
time.
What should you use for each model? To answer, drag the appropriate compute infrastructures to the
correct models. Each compute infrastructure may be used once, more than once, or not at all. You may
need to drag the split bar between panes or scroll to view content.
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sizes-gpu
QUESTION 8
You design an AI workflow that combines data from multiple data sources for analysis. The data sources
are composed of:
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/data-factory/introduction
QUESTION 9
HOTSPOT
You are designing a solution that will ingest temperature data from IoT devices, calculate the average
temperature, and then take action based on the aggregated data. The solution must meet the following
requirements:
What should you include in the solution? To answer, select the appropriate options in the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Box 1: Azure Functions
Azure Functions is a serverless service that hosts functions (small pieces of code), which can be used for event-driven applications, for example.
A general rule is always difficult because everything depends on your requirements, but if you have to analyze a data stream, you should look at Azure Stream Analytics, and if you want to implement a serverless, event-driven, or timer-based application, you should look at Azure Functions or Logic Apps.
Note: Azure IoT Edge allows you to deploy complex event processing, machine learning, image recognition,
and other high value AI without writing it in-house. Azure services like Azure Functions, Azure Stream
Analytics, and Azure Machine Learning can all be run on-premises via Azure IoT Edge.
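For illustration, a minimal sketch of an Event Hub-triggered Azure Function (Python v1 programming model) that computes an average temperature per batch; the binding configuration in function.json is omitted, and the telemetry field name is a placeholder assumption.

# Sketch only: an Event Hub-triggered function (the cardinality/connection settings
# live in function.json, not shown here). Field names are placeholders.
import json
import logging
from typing import List

import azure.functions as func


def main(events: List[func.EventHubEvent]) -> None:
    # Each event body is assumed to be a JSON document with a "temperature" field.
    temperatures = []
    for event in events:
        payload = json.loads(event.get_body().decode("utf-8"))
        temperatures.append(float(payload["temperature"]))

    if temperatures:
        average = sum(temperatures) / len(temperatures)
        logging.info("Average temperature over %d readings: %.2f", len(temperatures), average)
        # Take action on the aggregate here, e.g. write an alert to a queue output binding.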
References:
https://docs.microsoft.com/en-us/azure/iot-edge/about-iot-edge
QUESTION 10
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You have an app named App1 that uses the Face API.
You discover that a PersonGroup object for an individual named Ben Smith cannot accept additional
entries. The PersonGroup object for Ben Smith contains 10,000 entries.
You need to ensure that additional entries can be added to the PersonGroup object for Ben Smith. The
solution must ensure that Ben Smith can be identified by all the entries.
Solution: You modify the custom time interval for the training phase of App1.
A. Yes
B. No
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Instead, use a LargePersonGroup. LargePersonGroup and LargeFaceList are collectively referred to as
large-scale operations. LargePersonGroup can contain up to 1 million persons, each with a maximum of
248 faces. LargeFaceList can contain up to 1 million faces. The large-scale operations are similar to the
conventional PersonGroup and FaceList but have some differences because of the new architecture.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/face/face-api-how-to-topics/how-to-use-large-scale
QUESTION 11
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You have an app named App1 that uses the Face API.
You discover that a PersonGroup object for an individual named Ben Smith cannot accept additional
entries. The PersonGroup object for Ben Smith contains 10,000 entries.
You need to ensure that additional entries can be added to the PersonGroup object for Ben Smith. The
solution must ensure that Ben Smith can be identified by all the entries.
A. Yes
B. No
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Instead, use a LargePersonGroup. LargePersonGroup and LargeFaceList are collectively referred to as
large-scale operations. LargePersonGroup can contain up to 1 million persons, each with a maximum of
248 faces. LargeFaceList can contain up to 1 million faces. The large-scale operations are similar to the
conventional PersonGroup and FaceList but have some differences because of the new architecture.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/face/face-api-how-to-topics/how-to-use-large-scale
QUESTION 12
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You have an app named App1 that uses the Face API.
You discover that a PersonGroup object for an individual named Ben Smith cannot accept additional
entries. The PersonGroup object for Ben Smith contains 10,000 entries.
You need to ensure that additional entries can be added to the PersonGroup object for Ben Smith. The
solution must ensure that Ben Smith can be identified by all the entries.
Solution: You migrate all the entries to the LargePersonGroup object for Ben Smith.
A. Yes
B. No
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
LargePersonGroup and LargeFaceList are collectively referred to as large-scale operations.
LargePersonGroup can contain up to 1 million persons, each with a maximum of 248 faces. LargeFaceList
can contain up to 1 million faces. The large-scale operations are similar to the conventional PersonGroup
and FaceList but have some differences because of the new architecture.
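For illustration, a minimal sketch of creating a LargePersonGroup and a person through the Face REST API; the endpoint, key, group ID, and person name are placeholder assumptions.

# Sketch only: replace the endpoint and key with your Face resource values.
import requests

endpoint = "https://<your-face-resource>.cognitiveservices.azure.com"
key = "<your-face-key>"
headers = {"Ocp-Apim-Subscription-Key": key}
group_id = "employees"

# Create the large person group (holds up to 1 million persons).
requests.put(
    f"{endpoint}/face/v1.0/largepersongroups/{group_id}",
    headers=headers,
    json={"name": "Employees"},
)

# Add a person; face images would then be added to this person and the group trained.
person = requests.post(
    f"{endpoint}/face/v1.0/largepersongroups/{group_id}/persons",
    headers=headers,
    json={"name": "Ben Smith"},
)
print(person.json()["personId"])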
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/face/face-api-how-to-topics/how-to-use-large-scale
QUESTION 13
Your company plans to develop a mobile app to provide meeting transcripts by using speech-to-text. Audio
from the meetings will be streamed to provide real-time transcription.
You need to recommend which task each meeting participant must perform to ensure that the transcripts of
the meetings can identify all participants.
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
The first step is to create voice signatures for the conversation participants. Creating voice signatures is
required for efficient speaker identification.
Note: In addition to the standard baseline model used by the Speech Services, you can customize models
to your needs with available data, to overcome speech recognition barriers such as speaking style,
vocabulary and background noise.
References:
https://docs.microsoft.com/bs-latn-ba/azure/cognitive-services/speech-service/how-to-use-conversation-transcription-service
QUESTION 14
You need to create a prototype of a bot to demonstrate a user performing a task. The demonstration will
use the Bot Framework Emulator.
Which botbuilder CLI tool should you use to create the prototype?
A. Chatdown
B. QnAMaker
C. Dispatch
D. LuDown
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Use Chatdown to produce prototype mock conversations in markdown and convert the markdown to
transcripts you can load and view in the new V4 Bot Framework Emulator.
Incorrect Answers:
B: QnA Maker is a cloud-based API service that lets you create a conversational question-and-answer layer
over your existing data. Use it to build a knowledge base by extracting questions and answers from your
semi-structured content, including FAQs, manuals, and documents. Answer users’ questions with the best
answers from the QnAs in your knowledge base—automatically. Your knowledge base gets smarter, too, as
it continually learns from user behavior.
C: Dispatch lets you build language models that allow you to dispatch between disparate components (such
as QnA, LUIS and custom code).
References:
https://github.com/microsoft/botframework/blob/master/README.md
QUESTION 15
DRAG DROP
The bot must support multiple bot channels including Direct Line.
Users must be able to sign in to the bot by using a Gmail user account and save activities and
preferences.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list
of actions to the answer area and arrange them in the correct order.
NOTE: More than one order of answer choices is correct. You will receive credit for any of the correct
orders you select.
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Step 1: From the Azure portal, configure an identity provider.
The Azure Bot Service and the v4 SDK include new bot authentication capabilities, providing features to
make it easier to develop a bot that authenticates users to various identity providers, such as Azure AD
(Azure Active Directory), GitHub, Uber, and so on.
Step 2: From the Azure portal, create an Azure Active Directory (Azure AD) B2C service.
Azure Active Directory B2C provides business-to-customer identity as a service. Your customers use their
preferred social, enterprise, or local account identities to get single sign-on access to your applications and
APIs.
Step 4: From the bot code, add the connection settings and OAuthPrompt
Use an OAuth prompt to sign the user in and get a token.
Azure AD B2C uses standards-based authentication protocols including OpenID Connect, OAuth 2.0, and
SAML.
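For illustration, a minimal sketch of the bot-code side using the Bot Framework SDK for Python (botbuilder-dialogs); the connection name must match the OAuth connection setting configured on the Azure Bot resource, and all names are placeholder assumptions.

# Sketch only: assumes botbuilder-dialogs is installed and an OAuth connection
# named "GmailConnection" is configured on the Azure Bot resource.
from botbuilder.dialogs import ComponentDialog, WaterfallDialog, WaterfallStepContext
from botbuilder.dialogs.prompts import OAuthPrompt, OAuthPromptSettings


class SignInDialog(ComponentDialog):
    def __init__(self):
        super().__init__(SignInDialog.__name__)

        # The OAuth prompt sends a sign-in card and returns a token on success.
        self.add_dialog(
            OAuthPrompt(
                OAuthPrompt.__name__,
                OAuthPromptSettings(
                    connection_name="GmailConnection",   # name of the identity provider connection
                    text="Please sign in",
                    title="Sign In",
                    timeout=300000,                      # milliseconds to wait for sign-in
                ),
            )
        )
        self.add_dialog(WaterfallDialog("main", [self.prompt_step, self.token_step]))
        self.initial_dialog_id = "main"

    async def prompt_step(self, step: WaterfallStepContext):
        return await step.begin_dialog(OAuthPrompt.__name__)

    async def token_step(self, step: WaterfallStepContext):
        token_response = step.result                     # contains the token if sign-in succeeded
        await step.context.send_activity(
            "You are signed in." if token_response else "Sign-in failed."
        )
        return await step.end_dialog()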
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-authentication?view=azure-bot-service-4.0
QUESTION 16
HOTSPOT
You have an app that uses the Language Understanding (LUIS) API as shown in the following exhibit.
Use the drop-down menus to select the answer choice that completes each statement based on the
information presented in the graphic.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Box 1: train
Utterances are input from the user that your app needs to interpret. To train LUIS to extract intents and
entities from them, it's important to capture a variety of different example utterances for each intent. Active
learning, or the process of continuing to train on new utterances, is essential to machine-learned
intelligence that LUIS provides.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-concept-utterance
QUESTION 17
You are designing an AI solution that will provide feedback to teachers who train students over the Internet.
The students will be in classrooms located in remote areas. The solution will capture video and audio data
of the students in the classrooms.
You need to recommend Azure Cognitive Services for the AI solution to meet the following requirements:
Alert teachers if a student facial expression indicates the student is angry or scared.
Identify each student in the classrooms for attendance purposes.
Allow the teachers to log voice conversations as text.
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Speech-to-text from Azure Speech Services, also known as speech recognition, enables real-time transcription of audio streams into text that your applications, tools, or devices can consume, display, and take action on as command input.
Face detection: Detect one or more human faces in an image and get back face rectangles for where in the
image the faces are, along with face attributes which contain machine learning-based predictions of facial
features. The face attribute features available are: Age, Emotion, Gender, Pose, Smile, and Facial Hair
along with 27 landmarks for each face in the image.
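For illustration, a minimal sketch of detecting emotion attributes in a frame through the Face REST API; the endpoint, key, image URL, and alert threshold are placeholder assumptions.

# Sketch only: replace the endpoint, key, and image URL with real values.
import requests

endpoint = "https://<your-face-resource>.cognitiveservices.azure.com"
key = "<your-face-key>"

response = requests.post(
    f"{endpoint}/face/v1.0/detect",
    headers={"Ocp-Apim-Subscription-Key": key},
    params={"returnFaceAttributes": "emotion"},   # ask for the emotion attribute set
    json={"url": "https://example.com/classroom-frame.jpg"},
)

for face in response.json():
    emotions = face["faceAttributes"]["emotion"]
    # Flag faces where anger or fear dominates so the teacher can be alerted.
    if emotions["anger"] > 0.5 or emotions["fear"] > 0.5:
        print("Alert: student appears angry or scared", face["faceRectangle"])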
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/speech-to-text
https://azure.microsoft.com/en-us/services/cognitive-services/face/
QUESTION 18
HOTSPOT
Your company plans to build an app that will perform the following tasks:
You need to recommend which Azure Cognitive Services APIs must be used to perform the tasks.
Which Cognitive Services API should you recommend for each task? To answer, select the appropriate
options in the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Box 1: Computer Vision
Azure's Computer Vision service provides developers with access to advanced algorithms that process
images and return information.
Computer Vision Detect Faces: Detect faces in an image and provide information about each detected
face. Computer Vision returns the coordinates, rectangle, gender, and age for each detected face.
Computer Vision provides a subset of the Face service functionality. You can use the Face service for more
detailed analysis, such as facial identification and pose detection.
Incorrect Answers:
Video Indexer:
Automatically extract metadata—such as spoken words, written text, faces, speakers, celebrities, emotions,
topics, brands, and scenes—from video and audio files.
Custom Vision:
Easily customize your own state-of-the-art computer vision models for your unique use case. Just upload a
few labeled images and let Custom Vision Service do the hard work. With just one click, you can export
trained models to be run on device or as Docker containers.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/computer-vision/home
https://azure.microsoft.com/en-us/services/cognitive-services/bing-video-search-api/
QUESTION 19
You need to evaluate trends in fuel prices during a period of 10 years. The solution must identify unusual
fluctuations in prices and produce visual representations.
A. Anomaly Detector
B. Computer Vision
C. Text Analytics
D. Bing Autosuggest
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
The Anomaly Detector API enables you to monitor and detect abnormalities in your time series data with
machine learning. The Anomaly Detector API adapts by automatically identifying and applying the best-
fitting models to your data, regardless of industry, scenario, or data volume. Using your time series data,
the API determines boundaries for anomaly detection, expected values, and which data points are
anomalies.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/anomaly-detector/overview
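As an illustration of the explanation above (not part of the original question), the following Python sketch submits a small monthly price series to the Anomaly Detector batch detection endpoint. The endpoint, key, and data values are placeholders, and the service requires at least 12 data points per request.

import requests

# Placeholder resource endpoint and key (assumptions, not real values).
endpoint = "https://<your-anomaly-detector>.cognitiveservices.azure.com"
key = "<anomaly-detector-key>"

# Twelve months of fuel prices with one simulated spike.
series = [{"timestamp": f"2018-{m:02d}-01T00:00:00Z", "value": 1.30} for m in range(1, 13)]
series[6]["value"] = 3.10  # unusual fluctuation in July

response = requests.post(
    f"{endpoint}/anomalydetector/v1.0/timeseries/entire/detect",
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"granularity": "monthly", "series": series},
)
response.raise_for_status()

# The response contains one isAnomaly flag per submitted data point.
for point, is_anomaly in zip(series, response.json()["isAnomaly"]):
    print(point["timestamp"], "ANOMALY" if is_anomaly else "ok")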
QUESTION 20
HOTSPOT
Your company plans to deploy several apps that will use Azure Cognitive Services APIs.
You need to recommend which Cognitive Services APIs must be used to meet the following requirements:
Which API should you recommend for each requirement? To answer, select the appropriate options in the
answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Box 1: Content Moderator
The Azure Content Moderator API is a cognitive service that checks text, image, and video content for
material that is potentially offensive, risky, or otherwise undesirable. When such material is found, the
service applies appropriate labels (flags) to the content. Your app can then handle flagged content in order
to comply with regulations or maintain the intended environment for users.
References:
https://docs.microsoft.com/bs-latn-ba/azure/cognitive-services/content-moderator/overview
https://www.luis.ai/home
https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/
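To make the answer concrete (this code is not part of the exam question), the sketch below calls the Content Moderator Text - Screen operation on a short string; the endpoint and key are placeholders.

import requests

# Placeholder Content Moderator endpoint and key (assumptions).
endpoint = "https://<your-content-moderator>.cognitiveservices.azure.com"
key = "<content-moderator-key>"

text = "Sample user comment that may contain offensive terms."

response = requests.post(
    f"{endpoint}/contentmoderator/moderate/v1.0/ProcessText/Screen",
    params={"language": "eng", "classify": "True"},
    headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "text/plain"},
    data=text.encode("utf-8"),
)
response.raise_for_status()
result = response.json()

# Terms lists any matched profanity; Classification carries the category scores
# your app can use to flag or block the content.
print(result.get("Terms"))
print(result.get("Classification"))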
QUESTION 21
You plan to perform analytics of the medical records of patients located around the world.
You need to recommend a solution that avoids storing and processing data in the cloud.
C. Azure Machine Learning services
D. an Apache Spark cluster that uses MMLSpark
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
Explanation:
The Microsoft Machine Learning Library for Apache Spark (MMLSpark) assists in provisioning scalable
machine learning models for large datasets, especially for building deep learning problems. MMLSpark
works with SparkML pipelines, including Microsoft CNTK and the OpenCV library, which provide end-to-end
support for the ingress and processing of image input data, categorization of images, and text analytics
using pre-trained deep learning algorithms.
References:
https://subscription.packtpub.com/book/big_data_and_business_intelligence/9781789131956/10/ch10lvl1sec61/an-overview-of-the-microsoft-machine-learning-library-for-apache-spark-mmlspark
QUESTION 22
Your company has an on-premises datacenter.
You plan to publish an app that will recognize a set of individuals by using the Face API. The model is
trained.
You need to ensure that all images are processed in the on-premises datacenter.
A. a Docker container
B. Azure File Sync
C. Azure Application Gateway
D. Azure Data Box Edge
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
A container is a standard unit of software that packages up code and all its dependencies so the application
runs quickly and reliably from one computing environment to another. A Docker container image is a
lightweight, standalone, executable package of software that includes everything needed to run an
application: code, runtime, system tools, system libraries and settings.
Incorrect Answers:
D: Azure Data Box Edge is an AI-enabled edge computing device with network data transfer capabilities.
Data Box Edge is a Hardware-as-a-Service solution. Microsoft ships you a cloud-managed device with a
built-in Field Programmable Gate Array (FPGA) that enables accelerated AI inferencing and has all the
capabilities of a storage gateway.
References:
https://www.docker.com/resources/what-container
QUESTION 23
You have a Bing Search service that is used to query a product catalog.
The top 50 query strings
The number of calls to the service
The top geographical regions of the service
A. Bing Statistics
B. Azure API Management (APIM)
C. Azure Monitor
D. Azure Application Insights
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
The Bing Statistics add-in provides metrics such as call volume, top queries, API response code
distribution, and market distribution. The rich slicing-and-dicing capability lets you gather deeper
understanding of your users and their usage to inform your business strategy.
References:
https://www.bingapistatistics.com/
QUESTION 24
You have a Face API solution that updates in real time. A pilot of the solution runs successfully on a small
dataset.
When you attempt to use the solution on a larger dataset that continually changes, the performance
degrades, slowing how long it takes to recognize existing faces.
You need to recommend changes to reduce the time it takes to recognize existing faces without increasing
costs.
A. Change the solution to use the Computer Vision API instead of the Face API.
B. Separate training into an independent pipeline and schedule the pipeline to run daily.
C. Change the solution to use the Bing Image Search API instead of the Face API.
D. Distribute the face recognition inference process across many Azure Cognitive Services instances.
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Incorrect Answers:
A: The purpose of Computer Vision is to inspect each image associated with an incoming article to (1)
extract written words from the image and (2) determine what types of objects are present in the image.
C: The Bing API provides an experience similar to Bing.com/search by returning search results that Bing
determines are relevant to a user's query. The results include Web pages and may also include images,
videos, and more.
References:
https://github.com/Azure/cognitive-services
QUESTION 25
HOTSPOT
You plan to create a bot that will support five languages. The bot will be used by users located in three
different countries. The bot will answer common customer questions. The bot will use Language
Understanding (LUIS) to identify which skill to use and to detect the language of the customer.
You need to identify the minimum number of Azure resources that must be created for the planned bot.
How many QnA Maker, LUIS and Language Detection instances should you create? To answer, select the
appropriate options in the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
QnA Maker: 5
If the user plans to support multiple languages, they need to have a new QnA Maker resource for each
language.
LUIS: 5
If you need a multi-language LUIS client application such as a chatbot, you have a few options. If LUIS
supports all the languages, you develop a LUIS app for each language. Each LUIS app has a unique app
ID, and endpoint log. If you need to provide language understanding for a language LUIS does not support,
you can use Microsoft Translator API to translate the utterance into a supported language, submit the
utterance to the LUIS endpoint, and receive the resulting scores.
Language detection: 1
The Language Detection feature of the Azure Text Analytics REST API evaluates text input for each
document and returns language identifiers with a score that indicates the strength of the analysis.
This capability is useful for content stores that collect arbitrary text, where language is unknown. You can
parse the results of this analysis to determine which language is used in the input document. The response
also returns a score that reflects the confidence of the model. The score value is between 0 and 1.
The Language Detection feature can detect a wide range of languages, variants, dialects, and some
regional or cultural languages. The exact list of languages for this feature isn't published.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/qnamaker/overview/language-support
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-language-support
https://docs.microsoft.com/en-us/azure/cognitive-services/text-analytics/how-tos/text-analytics-how-to-language-detection
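As a concrete illustration of the Language Detection feature described above (not part of the original question), the sketch below posts two documents to the Text Analytics v3.0 languages endpoint; the endpoint and key are placeholders.

import requests

# Placeholder Text Analytics endpoint and key (assumptions).
endpoint = "https://<your-text-analytics>.cognitiveservices.azure.com"
key = "<text-analytics-key>"

documents = {
    "documents": [
        {"id": "1", "text": "Hello, how can I book a room?"},
        {"id": "2", "text": "Bonjour, je voudrais réserver une chambre."},
    ]
}

response = requests.post(
    f"{endpoint}/text/analytics/v3.0/languages",
    headers={"Ocp-Apim-Subscription-Key": key},
    json=documents,
)
response.raise_for_status()

# Each document comes back with a detected language and a confidence score (0 to 1).
for doc in response.json()["documents"]:
    lang = doc["detectedLanguage"]
    print(doc["id"], lang["name"], lang["confidenceScore"])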
QUESTION 26
You have a database that contains sales data.
You plan to process the sales data by using two data streams named Stream1 and Stream2. Stream1 will
be used for purchase order data. Stream2 will be used for reference data.
Which two solutions should you recommend? Each correct answer presents a complete solution.
A. an Azure event hub for Stream1 and Azure Blob storage for Stream2
B. Azure Blob storage for Stream1 and Stream2
C. an Azure event hub for Stream1 and Stream2
D. Azure Blob storage for Stream1 and Azure Cosmos DB for Stream2
E. Azure Cosmos DB for Stream1 and an Azure event hub for Stream2
Correct Answer: AB
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Stream1: an Azure event hub
Stream2: Azure Blob storage
Azure Event Hubs is a highly scalable data streaming platform and event ingestion service, capable of
receiving and processing millions of events per second. Event Hubs can process and store events, data, or
telemetry produced by distributed software and devices. Data sent to an event hub can be transformed and
stored using any real-time analytics provider or batching/storage adapters. Event Hubs provides publish-
subscribe capabilities with low latency at massive scale, which makes it appropriate for big data scenarios.
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/real-time-ingestion
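The following Python sketch (not part of the original question) shows how purchase order events for Stream1 could be published to an event hub with the azure-eventhub SDK; the connection string and hub name are placeholders.

from azure.eventhub import EventHubProducerClient, EventData

# Placeholder namespace connection string and event hub name (assumptions).
producer = EventHubProducerClient.from_connection_string(
    conn_str="<event-hubs-namespace-connection-string>",
    eventhub_name="stream1",
)

# Batch a few purchase order events and publish them in one call.
batch = producer.create_batch()
for order_id in (1001, 1002, 1003):
    batch.add(EventData(f'{{"orderId": {order_id}, "quantity": 5}}'))

producer.send_batch(batch)
producer.close()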
QUESTION 27
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You are developing an application that uses an Azure Kubernetes Service (AKS) cluster.
Solution: You create a managed identity for AKS, and then you create an SSH connection.
Does this meet the goal?
A. Yes
B. No
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Instead, add an SSH key to the node, and then create an SSH connection.
References:
https://docs.microsoft.com/en-us/azure/aks/ssh
QUESTION 28
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You are developing an application that uses an Azure Kubernetes Service (AKS) cluster.
Solution: You change the permissions of the AKS resource group, and then you create an SSH connection.
A. Yes
B. No
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Instead, add an SSH key to the node, and then create an SSH connection.
References:
https://docs.microsoft.com/en-us/azure/aks/ssh
QUESTION 29
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You are developing an application that uses an Azure Kubernetes Service (AKS) cluster.
Solution: You add an SSH key to the node, and then you create an SSH connection.
Does this meet the goal?
A. Yes
B. No
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
By default, SSH keys are generated when you create an AKS cluster. If you did not specify your own SSH
keys when you created your AKS cluster, add your public SSH keys to the AKS nodes.
You also need to create an SSH connection to the AKS node.
References:
https://docs.microsoft.com/en-us/azure/aks/ssh
QUESTION 30
You are developing a Computer Vision application.
You plan to use a workflow that will load data from an on-premises database to Azure Blob storage, and
then connect to an Azure Machine Learning service.
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
Explanation:
With Azure Data Factory you can use workflows to orchestrate data integration and data transformation
processes at scale.
Build data integration, and easily transform and integrate big data processing and machine learning with the
visual interface.
References:
https://azure.microsoft.com/en-us/services/data-factory/
QUESTION 31
DRAG DROP
You are designing an AI solution that will use IoT devices to gather data from conference attendees, and
then later analyze the data. The IoT devices will connect to an Azure IoT hub.
You need to design a solution to anonymize the data before the data is sent to the IoT hub.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the
list of actions to the answer area and arrange them in the correct order.
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Scenario overview:
Step 3: Add the job to the IoT Edge devices in IoT Hub.
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-edge
QUESTION 32
HOTSPOT
You are designing a solution that will ingest data from an Azure IoT Edge device, preprocess the data in
Azure Machine Learning, and then move the data to Azure HDInsight for further processing.
What should you include in the solution? To answer, select the appropriate options in the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
References:
https://docs.microsoft.com/en-us/azure/machine-learning/studio-module-reference/export-to-hive-query
https://docs.microsoft.com/en-us/azure/hdinsight/hadoop/hdinsight-use-hive
QUESTION 33
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You need to create an IoT solution that performs the following tasks:
Identifies hazards
Provides a real-time online dashboard
Takes images of an area every minute
Counts the number of people in an area every minute
Solution: You implement Azure Cognitive Services containers on the IoT devices, and then you configure
results to be sent to an Azure IoT hub. You configure Microsoft Power BI to connect to the IoT hub by using
Azure Stream Analytics.
A. Yes
B. No
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
There is support for running Azure Cognitive Services containers for Text Analytics and Language
Understanding on edge devices with Azure IoT Edge. This means that all your workloads can be
run locally where your data is being generated while keeping the simplicity of the cloud to manage them
remotely, securely and at scale.
You would have to set up an IoT Edge device and its IoT Hub.
Note: Azure Stream Analytics enables you to take advantage of one of the leading business intelligence
tools, Microsoft Power BI.
Get your IoT hub ready for data access by adding a consumer group.
Create, configure, and run a Stream Analytics job for data transfer from your IoT hub to your Power BI
account.
Create and publish a Power BI report to visualize the data.
References:
https://azure.microsoft.com/es-es/blog/running-cognitive-services-on-iot-edge/
https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-live-data-visualization-in-power-bi
QUESTION 34
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You need to create an IoT solution that performs the following tasks:
Identifies hazards
Provides a real-time online dashboard
Takes images of an area every minute
Counts the number of people in an area every minute
Solution: You configure the IoT devices to send the images to an Azure IoT hub, and then you configure an
Azure Functions call to Azure Cognitive Services that sends the results to an Azure event hub. You
configure Microsoft Power BI to connect to the event hub by using Azure Stream Analytics.
A. Yes
B. No
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Instead use Cognitive Services containers on the IoT devices.
References:
https://azure.microsoft.com/es-es/blog/running-cognitive-services-on-iot-edge/
https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-live-data-visualization-in-power-bi
QUESTION 35
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You need to create an IoT solution that performs the following tasks:
Identifies hazards
Provides a real-time online dashboard
Takes images of an area every minute
Counts the number of people in an area every minute
Solution: You configure the IoT devices to send the images to an Azure IoT hub, and then you configure an
Azure Automation call to Azure Cognitive Services that sends the results to an Azure event hub. You
configure Microsoft Power BI to connect to the event hub by using Azure Stream Analytics.
A. Yes
B. No
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Instead use Cognitive Services containers on the IoT devices.
References:
https://azure.microsoft.com/es-es/blog/running-cognitive-services-on-iot-edge/
https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-live-data-visualization-in-power-bi
QUESTION 36
You plan to deploy Azure IoT Edge devices. Each device will store more than 10,000 images locally. Each
image is approximately 5 MB.
You need to ensure that the images persist on the devices for 14 days.
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Azure Blob Storage on IoT Edge provides a block blob and append blob storage solution at the edge. A
blob storage module on your IoT Edge device behaves like an Azure blob service, except the blobs are
stored locally on your IoT Edge device.
This is useful where data needs to be stored locally until it can be processed or transferred to the cloud.
This data can be videos, images, finance data, hospital data, or any other unstructured data.
References:
https://docs.microsoft.com/en-us/azure/iot-edge/how-to-store-data-blob
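As an illustration (not part of the original question), the sketch below writes a captured image to the local blob storage module from code running on the IoT Edge device; the module name, port, account credentials, container, and file names are assumptions.

from azure.core.exceptions import ResourceExistsError
from azure.storage.blob import BlobServiceClient

# Connection string pointing at the local blob storage module on the Edge device
# (the module name, port, account name, and key below are placeholders).
local_conn_str = (
    "DefaultEndpointsProtocol=http;"
    "BlobEndpoint=http://azureblobstorageoniotedge:11002/localaccount;"
    "AccountName=localaccount;AccountKey=<local-account-key>;"
)

service = BlobServiceClient.from_connection_string(local_conn_str)
container = service.get_container_client("camera-images")

try:
    container.create_container()
except ResourceExistsError:
    pass  # container already exists on the device

# Persist a 5 MB image locally; the 14-day retention would be configured on the
# blob storage module itself, not in this code.
with open("image-0001.jpg", "rb") as image:
    container.upload_blob(name="image-0001.jpg", data=image, overwrite=True)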
QUESTION 37
You have an Azure Machine Learning experiment.
You need to validate that the experiment meets GDPR regulation requirements and stores documentation
about the experiment.
A. Compliance Manager
B. an Azure Log Analytics workspace
C. Azure Table storage
D. Azure Security Center
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Compliance Manager for Azure helps you assess and manage GDPR compliance. Compliance Manager is
a free, Microsoft cloud services solution designed to help organizations meet complex compliance
obligations, including the GDPR, ISO 27001, ISO 27018, and NIST 800-53. Generally available today for
Azure customers, the Compliance Manager GDPR dashboard enables you to assign, track, and record your
GDPR compliance activities so you can collaborate across teams and manage your documents for creating
audit reports more easily.
References:
https://azure.microsoft.com/en-us/blog/new-capabilities-to-enable-robust-gdpr-compliance/
QUESTION 38
You are designing a solution that will integrate the Bing Web Search API and will return a JSON response.
The development team at your company uses C# as its primary development language.
Which additional component do the developers need to prepare and to retrieve data by using an API call?
A. the subscription ID
B. the API key
C. a query
D. the resource group ID
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
The Bing Web Search SDK makes it easy to integrate Bing Web Search into your C# application. You
instantiate a client, send a request, and receive a response.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/bing-web-search/web-search-sdk-quickstart
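As the explanation notes, the developers only need the API key in addition to their query; the hedged Python sketch below (not part of the original question) calls the Bing Web Search REST endpoint directly, with the key and query as placeholders.

import requests

# Placeholder subscription key; the endpoint shown is the public v7 search endpoint.
key = "<bing-web-search-key>"
endpoint = "https://api.bing.microsoft.com/v7.0/search"

response = requests.get(
    endpoint,
    headers={"Ocp-Apim-Subscription-Key": key},
    params={"q": "site:contoso.com product catalog", "count": 10},
)
response.raise_for_status()

# Web results are returned as JSON under webPages.value.
for page in response.json().get("webPages", {}).get("value", []):
    print(page["name"], page["url"])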
QUESTION 39
DRAG DROP
You need to build an AI solution that will be shared between several developers and customers.
You plan to write code, host code, and document the runtime all within a single user experience.
Which three actions should you perform in sequence next? To answer, move the appropriate actions from
the list of actions to the answer area and arrange them in the correct order.
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
You can manage notebooks using the UI, the CLI, and by invoking the Workspace API.
To create a notebook
1. Click the Workspace button Workspace Icon or the Home button Home Icon in the sidebar. Do one of
the following:
Next to any folder, click the Menu Dropdown on the right side of the text and select Create > Notebook.
Create Notebook
In the Workspace or a user folder, click Down Caret and select Create > Notebook.
2. In the Create Notebook dialog, enter a name and select the notebook’s primary language.
3. If there are running clusters, the Cluster drop-down displays. Select the cluster to attach the notebook to.
4. Click Create.
References:
https://docs.azuredatabricks.net/user-guide/notebooks/notebook-manage.html
https://docs.microsoft.com/en-us/azure/machine-learning/service/quickstart-run-cloud-notebook
QUESTION 40
Your company has a data team of Transact-SQL experts.
You plan to ingest data from multiple sources into Azure Event Hubs.
You need to recommend which technology the data team should use to move and query data from Event
Hubs to Azure Storage. The solution must leverage the data team’s existing skills.
What is the best recommendation to achieve the goal? More than one answer choice may achieve the goal.
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Event Hubs Capture is the easiest way to automatically deliver streamed data in Event Hubs to an Azure
Blob storage or Azure Data Lake store. You can subsequently process and deliver the data to any other
storage destinations of your choice, such as SQL Data Warehouse or Cosmos DB.
You can capture data from your event hub into a SQL data warehouse by using an Azure function triggered
by Event Grid.
Example:
First, you create an event hub with the Capture feature enabled and set an Azure blob storage as the
destination. Data generated by WindTurbineGenerator is streamed into the event hub and is automatically
captured into Azure Storage as Avro files.
Next, you create an Azure Event Grid subscription with the Event Hubs namespace as its source and the
Azure Function endpoint as its destination.
Whenever a new Avro file is delivered to the Azure Storage blob by the Event Hubs Capture feature, Event
Grid notifies the Azure Function with the blob URI. The Function then migrates data from the blob to a SQL
data warehouse.
References:
https://docs.microsoft.com/en-us/azure/event-hubs/store-captured-data-data-warehouse
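A minimal sketch of the Azure Function step in that example, written in Python and heavily simplified (the exam scenario does not include this code): it assumes the Event Grid event for a captured file carries the blob URL in a fileUrl property, and leaves the Avro parsing and SQL load as comments.

import logging
import azure.functions as func

def main(event: func.EventGridEvent):
    # The capture-file-created event payload is assumed to include the URL of the
    # Avro blob written by Event Hubs Capture (property name may vary).
    data = event.get_json()
    blob_url = data.get("fileUrl")
    logging.info("New capture file created: %s", blob_url)

    # Next steps (omitted): download the Avro blob, parse the records
    # (for example with fastavro), and bulk-insert them into the
    # SQL data warehouse using T-SQL, which matches the data team's skills.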
QUESTION 41
You are designing a Computer Vision AI application.
You need to recommend a deployment solution for the application. The solution must ensure that costs
scale linearly without any upfront costs.
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Containers enable you to run the Computer Vision APIs in your own environment.
Note: The host is a x64-based computer that runs the Docker container. It can be a computer on your
premises or a Docker hosting service in Azure, such as:
Azure Container Instances.
Azure Kubernetes Service.
A Kubernetes cluster deployed to Azure Stack.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/computer-vision/computer-vision-how-to-install-containers
QUESTION 42
You are implementing the Language Understanding (LUIS) API and are building a GDPR-compliant bot by
using the Bot Framework.
You need to recommend a solution to ensure that the implementation of LUIS is GDPR-compliant.
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
Explanation:
You can delete personal data from the device or service, which can be used to support your obligations
under the GDPR.
References:
https://docs.microsoft.com/bs-latn-ba/azure/cognitive-services/luis/luis-user-privacy
QUESTION 43
You need to build a reputation monitoring solution that reviews Twitter activity about your company. The
solution must identify negative tweets and tweets that contain inappropriate images.
Which two additional Azure services should you include in the solution? Each correct answer presents part
of the solution.
A. Computer Vision
B. Azure Blueprint
C. Content Moderator
D. Text Analytics
E. Azure Machine Learning Service
F. Form Recognizer
Correct Answer: CD
Section: (none)
Explanation
Explanation/Reference:
Explanation:
C: You can filter your tweets using Azure Logic Apps & Content Moderation. Azure Content Moderator is a
cognitive service that checks text, image, and video content for material that is potentially offensive, risky,
or otherwise undesirable. When this material is found, the service applies appropriate labels (flags) to the
content. Your app can then handle flagged content in order to comply with regulations or maintain the
intended environment for users.
D: You can write an application so that when a user tweets with configured Twitter Hashtag, Logic App gets
triggered and passed to Cognitive Text Analytics Connector for detecting the sentiments of the tweet (text).
If the tweeted text is found to be harsh or with bad or abusive language, the tweet can be handled
appropriately.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/overview
https://www.c-sharpcorner.com/article/role-of-text-analytics-service-as-a-connector-in-azure-logic-apps/
QUESTION 44
Your company uses an internal blog to share news with employees.
You use the Translator Text API to translate the text in the blog from English to several other languages
used by the employees.
A. Text Analytics
B. Language Understanding (LUIS)
C. Azure Media Services
D. Custom Translator
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Custom Translator is a feature of the Microsoft Translator service. With Custom Translator, enterprises,
app developers, and language service providers can build neural translation systems that understand the
terminology used in their own business and industry. The customized translation system will then
seamlessly integrate into existing applications, workflows and websites.
Custom Translator allows users to customize Microsoft Translator’s advanced neural machine translation
for Translator’s supported neural translation languages. Custom Translator can be used for customizing
text when using the Microsoft Translator Text API, and speech translation using the Microsoft Speech
services.
References:
https://www.microsoft.com/en-us/translator/business/customization/
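To show how a trained Custom Translator model is consumed (illustration only, not part of the question), the sketch below calls the Translator Text API v3 with a category parameter; the key, region, and category ID are placeholders.

import requests

# Placeholder key, region, and Custom Translator category ID (assumptions).
key = "<translator-text-key>"
region = "westeurope"
category_id = "<custom-translator-category-id>"

response = requests.post(
    "https://api.cognitive.microsofttranslator.com/translate",
    params={"api-version": "3.0", "to": "de", "category": category_id},
    headers={
        "Ocp-Apim-Subscription-Key": key,
        "Ocp-Apim-Subscription-Region": region,
        "Content-Type": "application/json",
    },
    json=[{"Text": "Our quarterly blog post about the new booking engine."}],
)
response.raise_for_status()

# The customized model is applied because the category parameter was supplied.
print(response.json()[0]["translations"][0]["text"])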
QUESTION 45
You plan to develop a bot that tracks communications between the employees at your company.
You need to identify which channel the bot must use to monitor reactions to messages by employees.
A. Microsoft Cortana
B. Microsoft Outlook
C. Microsoft Teams
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Bots in Microsoft Teams can be part of a one-to-one conversation, a group chat, or a channel in a Team.
Note: In Microsoft Teams, teams are groups of people brought together for work, projects, or common
interests. Teams are made up of channels. Each channel is built around a topic, like “Team Events,” a
department name, or just for fun. Channels are where you hold meetings, have conversations, and work on
files together.
References:
https://docs.microsoft.com/en-us/microsoftteams/platform/bots/what-are-bots
QUESTION 46
You plan to implement a bot that will require user authentication.
You need to recommend a secure solution that provides encryption for the authentication of the bot.
Which two security solutions should you include in the recommendation? Each correct answer presents a
complete solution.
A. NTLM
B. JSON Web Token (JWT)
C. API keys
D. smart cards
E. SSL/TLS
Correct Answer: BE
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Your bot communicates with the Bot Connector service using HTTP over a secured channel (SSL/TLS).
JSON Web Tokens are used to encode tokens that are sent to and from the bot.
References:
https://docs.microsoft.com/en-us/azure/bot-service/rest-api/bot-framework-rest-connector-authentication
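As a simplified illustration of the JWT part of that answer (not part of the original question), the sketch below verifies a token's signature and audience with the PyJWT library; obtaining the signing key from the Bot Framework OpenID metadata, and the issuer check, are assumed to happen elsewhere.

import jwt  # PyJWT

def validate_bot_token(token: str, signing_key, microsoft_app_id: str) -> dict:
    """Verify the token signature, expiry, and audience.

    signing_key is assumed to have been resolved from the Bot Framework
    OpenID metadata (for example with jwt.PyJWKClient); an invalid token
    raises jwt.InvalidTokenError.
    """
    return jwt.decode(
        token,
        key=signing_key,
        algorithms=["RS256"],
        audience=microsoft_app_id,
    )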
QUESTION 47
HOTSPOT
You are developing an application that will perform clickstream analysis. The application will ingest and
analyze millions of messages in real time.
You need to ensure that communication between the application and devices is bidirectional.
What should you use for data ingestion and stream processing? To answer, select the appropriate options
in the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Note on why not Azure Event Hubs: An Azure IoT Hub contains an Event Hub and hence essentially is an
Event Hub plus additional features. An important additional feature is that an Event Hub can only receive
messages, whereas an IoT Hub additionally can also send messages to individual devices. Further, an
Event Hub has access security on hub level, whereas an IoT Hub is aware of the individual devices and can
grant and revoke access at the device level.
References:
https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-compare-event-hubs
https://docs.microsoft.com/en-us/azure/hdinsight/hdinsight-machine-learning-overview
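The sketch below (not part of the original question) illustrates the bidirectional point with the azure-iot-device SDK: the same device client sends telemetry and receives cloud-to-device messages, which Event Hubs alone cannot do. The connection string and payloads are placeholders.

from azure.iot.device import IoTHubDeviceClient, Message

# Placeholder device connection string (assumption).
client = IoTHubDeviceClient.create_from_connection_string("<device-connection-string>")
client.connect()

# Device-to-cloud: ingest a clickstream event.
client.send_message(Message('{"sessionId": "abc123", "clicks": 42}'))

# Cloud-to-device: handle messages pushed to this specific device.
def on_cloud_message(message):
    print("Command received from cloud:", message.data)

client.on_message_received = on_cloud_message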
QUESTION 48
HOTSPOT
You are designing an Azure infrastructure to support an Azure Machine Learning solution that will have
multiple phases. The solution must meet the following requirements:
What should you use? To answer, select the appropriate options in the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
With Hybrid Connections, enterprises can move specific front-end tiers to Azure with minimal configuration
changes, extending their enterprise apps for hybrid scenarios.
References:
https://azure.microsoft.com/is-is/blog/hybrid-connections-preview/
https://databricks.com/glossary/what-are-ml-pipelines
QUESTION 49
You plan to design a solution for an AI implementation that uses data from IoT devices.
You need to recommend a data storage solution for the IoT devices that meets the following requirements:
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
Explanation:
You can use HDInsight to process streaming data that's received in real time from a variety of devices.
By combining enterprise-scale R analytics software with the power of Apache Hadoop and Apache Spark,
Microsoft R Server for HDInsight gives you the scale and performance you need. Multi-threaded math
libraries and transparent parallelization in R Server handle up to 1000x more data and up to 50x faster
speeds than open-source R, which helps you to train more accurate models for better predictions.
References:
https://docs.microsoft.com/en-us/azure/hdinsight/hadoop/apache-hadoop-introduction
QUESTION 50
Your company has factories in 10 countries. Each factory contains several thousand IoT devices.
You need to ingest the data from the IoT devices into a data warehouse.
Which two Microsoft Azure technologies should you use? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
Correct Answer: CE
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Azure Data Lake Store (ADLS) serves as the hyper-scale storage layer, and HDInsight serves as the
Hadoop-based compute engine service. Together, they can be used for prepping large amounts of data for
insertion into a data warehouse.
References:
https://www.blue-granite.com/blog/azure-data-lake-analytics-holds-a-unique-spot-in-the-modern-data-architecture
QUESTION 51
You plan to deploy a bot that will use the following Azure Cognitive Services:
Your company’s compliance policy states that all data used by the bot must be stored in the on-premises
network.
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
You can deploy LUIS on-premises as a Docker image in a container.
Note: The Azure Cognitive Services LUIS service can be deployed on any hardware or on any host that
supports Docker containers (Linux or Windows). This feature allows enterprises to quickly train the LUIS
model in the cloud and deploy it anywhere, which makes Cognitive Services truly available to every person
and every organization - "Democratizing AI".
Reference:
https://www.linkedin.com/pulse/deploying-microsoft-azure-cognitive-luis-service-on-premise-s
QUESTION 52
HOTSPOT
Your company is building a cinema chatbot by using the Bot Framework and Language Understanding
(LUIS).
You are designing the intents and the entities for LUIS.
You need to identify which entity types to use. The solution must minimize development effort.
Which entity type should you use for each entity? To answer, select the appropriate options in the answer
area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Box 1: Prebuilt
Datetime is prebuilt.
Language Understanding (LUIS) provides prebuilt entities. When a prebuilt entity is included in your
application, LUIS includes the corresponding entity prediction in the endpoint response.
Box 2: Simple
Box 3: Composite
A composite entity is made up of other entities, such as prebuilt entities, simple, regular expression, and list
entities. The separate entities form a whole entity.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-reference-prebuilt-entities
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/reference-entity-composite
QUESTION 53
Your company uses several bots. The bots use Azure Bot Service.
Several users report that some of the bots fail to return the expected results.
You need to request the appropriate role to access the service health of the bot service. The solution must
use the principle of least privilege.
Which role should you request?
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Use the Reader role on the bot service to limit access and scope.
Note: Access management for cloud resources is a critical function for any organization that is using the
cloud. Azure role-based access control (Azure RBAC) helps you manage who has access to Azure
resources, what they can do with those resources, and what areas they have access to.
Azure includes several built-in roles that you can use. The Reader Role can view existing Azure resources.
Scope is the set of resources that the access applies to. When you assign a role, you can further limit the
actions allowed by defining a scope. In Azure, you can specify a scope at multiple levels: management
group, subscription, resource group, or resource.
Reference:
https://docs.microsoft.com/en-us/azure/role-based-access-control/overview
QUESTION 54
You build an internal application that uses the Computer Vision API.
You need to ensure that only specific employees can access the application.
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
Explanation:
The app requires an Azure service principal account to deploy services to your Azure subscription. A
service principal lets you delegate specific permissions to an app using role-based access control.
Reference:
https://docs.microsoft.com/en-us/azure/cognitive-services/custom-vision-service/logo-detector-mobile
Design solutions
Testlet 2
Overview
Contoso, Ltd. has an office in New York to serve its North American customers and an office in Paris to
serve its European customers.
Existing Environment
Infrastructure
Each office has a small data center that hosts Active Directory services and a few off-the-shelf software
solutions used by internal users.
The network contains a single Active Directory forest that contains a single domain named contoso.com.
Azure Active Directory (Azure AD) Connect is used to extend identity management to Azure.
The company has an Azure subscription. Each office has an Azure ExpressRoute connection to the
subscription. The New York office connects to a virtual network hosted in the US East 2 Azure region. The
Paris office connects to a virtual network hosted in the West Europe Azure region.
The New York office has an Azure Stack Development Kit (ASDK) deployment that is used for development
and testing.
Contoso has a web app named Bookings hosted in an App Service Environment (ASE). The ASE is in the
virtual network in the East US 2 region. Contoso employees and customers use Bookings to reserve hotel
rooms.
Data Environment
Bookings connects to a Microsoft SQL Server database named hotelDB in the New York office.
The database has a view named vwAvailability that consolidates columns from the tables named Hotels,
Rooms, and RoomAvailability. The database contains data that was collected during the last 20 years.
Problem Statements
Contoso identifies the following issues with its current business model:
European users report that access to Bookings is slow, and they lose customers who must wait on the
phone while they search for available rooms.
Users report that Bookings was unavailable during an outage in the New York data center for more than
24 hours.
Requirements
Business Goals
Contoso wants to provide a new version of the Bookings app that will provide a highly available, reliable
service for booking travel packages by interacting with a chatbot named Butler.
Technical requirements
Contoso identifies the following technical requirements:
QUESTION 1
You need to design the Butler chatbot solution to meet the technical requirements.
What is the best channel and pricing tier to use? More than one answer choice may achieve the goal.
Select the BEST answer.
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
References:
https://azure.microsoft.com/en-in/pricing/details/bot-service/
QUESTION 2
You need to meet the testing requirements for the data scientists.
Which three actions should you perform? Each correct answer presents part of the solution.
Explanation/Reference:
Explanation:
Scenario: Data scientists must test Butler by using ASDK.
Note: Contoso wants to provide a new version of the Bookings app that will provide a highly available,
reliable service for booking travel packages by interacting with a chatbot named Butler.
E: The ASDK (Azure Stack Development Kit) is meant to provide an environment in which you can evaluate
Azure Stack and develop modern applications using APIs and tooling consistent with Azure in a non-
production environment.
Microsoft Azure Stack integrated systems range in size from 4-16 nodes, and are jointly supported by a
hardware partner and Microsoft.
F: The Language Understanding (LUIS) container loads your trained or published Language Understanding
model, also known as a LUIS app, into a docker container and provides access to the query predictions
from the container's API endpoints.
Use the docker pull command to download a container image from the mcr.microsoft.com/azure-cognitive-
services/luis repository:
G: You can test using the endpoint with a maximum of two versions of your app. With your main or live
version of your app set as the production endpoint, add a second version to the staging endpoint.
Reference:
https://docs.microsoft.com/en-us/azure-stack/asdk/asdk-what-is
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-container-howto
https://docs.microsoft.com/en-us/azure/cognitive-services/luis/luis-concept-test
Integrate AI models into solutions
Question Set 1
QUESTION 1
You need to configure versioning and logging for Azure Machine Learning models.
A. Models
B. Activities
C. Experiments
D. Pipelines
E. Deployments
Correct Answer: E
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-enable-logging#logging-for-deployed-models
QUESTION 2
DRAG DROP
You need to design the workflow for an Azure Machine Learning solution. The solution must meet the
following requirements:
Retrieve data from file shares, Microsoft SQL Server databases, and Oracle databases that are in an on-
premises network.
Use an Apache Spark job to process data stored in an Azure SQL Data Warehouse database.
Which service should you use to meet each requirement? To answer, drag the appropriate services to the
correct requirements. Each service may be used once, more than once, or not at all. You may need to drag
the split bar between panes or scroll to view content.
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/machine-learning/studio/use-data-from-an-on-premises-sql-server
https://docs.microsoft.com/en-in/azure/azure-databricks/what-is-azure-databricks
QUESTION 3
DRAG DROP
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
References:
https://azure.microsoft.com/en-in/blog/experimentation-using-azure-machine-learning/
https://docs.microsoft.com/en-us/azure/machine-learning/studio-module-reference/machine-learning-modules
QUESTION 4
You have an app that records meetings by using speech-to-text capabilities from the Speech Services API.
You discover that when action items are listed at the end of each meeting, the app transcribes the text
inaccurately.
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
The Speech Services API, with a subscription to the Microsoft Text Translation API, enables you to use
Custom Translator with your own data for more accurate translations.
References:
https://www.microsoft.com/en-us/translator/business/customization/
QUESTION 5
DRAG DROP
You have a real-time scoring pattern that uses deep learning models in Azure.
What should you use? To answer, drag the appropriate Azure services to the correct locations in the
scoring pattern. Each service may be used once, more than once, or not at all. You may need to drag the
split bar between panes or scroll to view content.
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
This reference architecture shows how to deploy Python models as web services to make real-time
predictions using the Azure Machine Learning.
References:
https://docs.microsoft.com/sv-se/azure/architecture/reference-architectures/ai/realtime-scoring-python
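A minimal sketch of the entry (scoring) script used by an Azure Machine Learning real-time web service, not taken from the reference architecture itself; the registered model name and input format are assumptions, and joblib is used for brevity where a deep learning model would normally be loaded with its own framework.

# score.py - init()/run() entry script pattern for an Azure ML real-time service.
import json

import joblib
import numpy as np
from azureml.core.model import Model

model = None

def init():
    """Called once when the service container starts."""
    global model
    # Resolves the on-disk path of the registered model inside the container.
    model_path = Model.get_model_path("realtime-scoring-model")  # assumed name
    model = joblib.load(model_path)

def run(raw_data):
    """Called for every scoring request; raw_data is the request body as JSON."""
    data = np.array(json.loads(raw_data)["data"])
    predictions = model.predict(data)
    return predictions.tolist()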
QUESTION 6
DRAG DROP
You plan to use the Microsoft Bot Framework to develop bots that will be deployed by using the Azure Bot
Service.
You need to configure the Azure Bot Service to support the following types of bots:
Which template should you use for each bot type? To answer drag the appropriate templates to the correct
bot type. Each template may be used once, more than once or not at all. You may need to drag the split bar
between panes or scroll to view content.
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-service-concept-templates?view=azure-bot-service-3.0
QUESTION 7
You have Azure IoT Edge devices that collect measurements every 30 seconds.
What should you use?
A. Apache Kafka
B. Azure Stream Analytics record functions
C. Azure Stream Analytics windowing functions
D. Azure Machine Learning on the IoT Edge devices
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Use Azure Notebooks to develop a machine learning module and deploy it to a Linux device running Azure
IoT Edge.
You can use IoT Edge modules to deploy code that implements your business logic directly to your IoT
Edge devices.
References:
https://docs.microsoft.com/en-us/azure/iot-edge/tutorial-deploy-machine-learning
QUESTION 8
HOTSPOT
You need to build a real-time media bot for Microsoft Skype on an Azure virtual machine. The bot will use
the Azure Bot Service. The solution must minimize custom code.
Which environment and language should you use to develop the bot? To answer, select the appropriate
options in the answer area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/microsoftteams/platform/concepts/calls-and-meetings/requirements-considerations-application-hosted-media-bots
QUESTION 9
Your company recently purchased several hundred hardware devices that contain sensors.
You need to recommend a solution to process the sensor data. The solution must provide the ability to write
back configuration changes to the devices.
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
References:
https://azure.microsoft.com/en-us/resources/samples/functions-js-iot-hub-processing/
QUESTION 10
You have thousands of images that contain text.
You need to process the text from the images to a machine-readable character stream.
B. Text Analytics
C. Translator Text
D. Computer Vision
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
Explanation:
With Computer Vision you can detect text in an image using optical character recognition (OCR) and
extract the recognized words into a machine-readable character stream.
Incorrect Answers:
A: Use Content Moderator’s machine-assisted image moderation and human-in-the-loop Review tool to
moderate images for adult and racy content. Scan images for text content and extract that text, and detect
faces. You can match images against custom lists, and take further action.
Reference:
https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/
https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/image-moderation-api
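As an illustration of the OCR capability referenced above (not part of the original question), the sketch below sends an image URL to the Computer Vision OCR endpoint and prints the recognized words; the endpoint, key, and image URL are placeholders.

import requests

# Placeholder Computer Vision endpoint, key, and image URL (assumptions).
endpoint = "https://<your-computer-vision>.cognitiveservices.azure.com"
key = "<computer-vision-key>"
image_url = "https://example.com/scanned-form.jpg"

response = requests.post(
    f"{endpoint}/vision/v3.2/ocr",
    params={"language": "unk", "detectOrientation": "true"},
    headers={"Ocp-Apim-Subscription-Key": key},
    json={"url": image_url},
)
response.raise_for_status()

# Flatten regions -> lines -> words into a machine-readable character stream.
words = [
    word["text"]
    for region in response.json().get("regions", [])
    for line in region["lines"]
    for word in line["words"]
]
print(" ".join(words))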
QUESTION 11
You have Azure IoT Edge devices that collect measurements every 30 seconds.
You need to process events in the cloud and account for missing data.
A. Apache Kafka
B. Azure Stream Analytics record functions
C. Azure Stream Analytics windowing functions
D. Azure Machine Learning on the IoT Edge devices
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Use Azure Notebooks to develop a machine learning module and deploy it to a Linux device running Azure
IoT Edge.
You can use IoT Edge modules to deploy code that implements your business logic directly to your IoT
Edge devices.
Use the Clean Missing Data module in Azure Machine Learning to remove, replace, or infer missing values.
Reference:
https://docs.microsoft.com/en-us/azure/iot-edge/tutorial-deploy-machine-learning
QUESTION 12
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You are deploying an Azure Machine Learning model to an Azure Kubernetes Service (AKS) container.
You need to monitor the scoring accuracy of each run of the model.
A. Yes
B. No
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Instead update the manifest file.
Reference:
https://azure.github.io/learnAnalytics-UsingAzureMachineLearningforAIWorkloads/lab07-deploying_a_scoring_service_to_aks/0_README.html
QUESTION 13
You need to build an API pipeline that analyzes streaming data. The pipeline will perform the following:
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Azure Video Indexer is a cloud application built on Azure Media Analytics, Azure Search, Cognitive Services
(such as the Face API, Microsoft Translator, the Computer Vision API, and Custom Speech Service). It
enables you to extract the insights from your videos using Video Indexer video and audio models described
below:
Visual text recognition (OCR): Extracts text that is visually displayed in the video.
Audio transcription: Converts speech to text in 12 languages and allows extensions.
Sentiment analysis: Identifies positive, negative, and neutral sentiments from speech and visual text.
Face detection: Detects and groups faces appearing in the video.
References:
https://docs.microsoft.com/en-us/azure/media-services/video-indexer/video-indexer-overview
QUESTION 14
You design an AI solution that uses an Azure Stream Analytics job to process data from an Azure IoT hub.
The IoT hub receives time series data from thousands of IoT devices at a factory.
The job outputs millions of messages per second. Different applications consume the messages as they
are available. The messages must be purged.
What is the best output type to achieve the goal? More than one answer choice may achieve the goal.
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Stream Analytics can target Azure Cosmos DB for JSON output, enabling data archiving and low-latency
queries on unstructured JSON data.
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-documentdb-output
QUESTION 15
HOTSPOT
You are designing an AI solution that must meet the following processing requirements:
Use a parallel processing framework that supports the in-memory processing of high volumes of data.
Use in-memory caching and a columnar storage engine for Apache Hive queries.
What should you use to meet each requirement? To answer, select the appropriate options in the answer
area.
Hot Area:
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
References:
https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-overview
https://docs.microsoft.com/bs-latn-ba/azure/hdinsight/interactive-query/apache-interactive-query-get-started
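A short PySpark sketch (not part of the original question) showing the in-memory side of that answer: the DataFrame is cached so repeated aggregations avoid re-reading the source. The storage path and column names are placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("InMemoryProcessing").getOrCreate()

# Placeholder path to telemetry data in the cluster's default storage account.
events = spark.read.csv(
    "wasbs://data@<storageaccount>.blob.core.windows.net/events/*.csv",
    header=True,
    inferSchema=True,
)

events.cache()  # keep the parsed data in memory for the queries below

events.groupBy("deviceId").count().show()
events.filter(events["severity"] == "high").count()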
QUESTION 16
You need to deploy cognitive search.
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
Explanation:
You create a data source, a skillset, and an index. These three components become part of an indexer that
pulls each piece together into a single multi-phased operation.
Note: At the start of the pipeline, you have unstructured text or non-text content (such as image and
scanned document JPEG files). Data must exist in an Azure data storage service that can be accessed by
an indexer. Indexers can "crack" source documents to extract text from source data.
References:
https://docs.microsoft.com/en-us/azure/search/cognitive-search-tutorial-blob
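To make the data source / skillset / index / indexer relationship concrete (illustration only), the sketch below creates the indexer that ties the other three components together via the Azure Cognitive Search REST API; the service name, admin key, component names, and API version are placeholders or assumptions.

import requests

# Placeholder search service endpoint, admin key, and component names (assumptions).
service = "https://<your-search-service>.search.windows.net"
headers = {"api-key": "<admin-api-key>", "Content-Type": "application/json"}
api_version = "2020-06-30"

indexer = {
    "name": "docs-indexer",
    "dataSourceName": "docs-datasource",  # created earlier, e.g. a blob container
    "skillsetName": "docs-skillset",      # created earlier, e.g. OCR + entity skills
    "targetIndexName": "docs-index",      # created earlier
}

response = requests.put(
    f"{service}/indexers/{indexer['name']}?api-version={api_version}",
    headers=headers,
    json=indexer,
)
response.raise_for_status()
print("Indexer created:", response.status_code)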
QUESTION 17
You need to design an application that will analyze real-time data from financial feeds.
The data will be ingested into Azure IoT Hub. The data must be processed as quickly as possible in the
order in which it is ingested.
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Stream processing can be handled by Azure Stream Analytics. Azure Stream Analytics can run perpetual
queries against an unbounded stream of data. These queries consume streams of data from storage or
message brokers, filter and aggregate the data based on temporal windows, and write the results to sinks
such as storage, databases, or directly to reports in Power BI. Stream Analytics uses a SQL-based query
language that supports temporal and geospatial constructs, and can be extended using JavaScript.
Incorrect Answers:
E: Apache Kafka is used for ingestion, not for stream processing.
F: Azure Event Hubs is used for ingestion, not for stream processing.
Reference:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/big-data/real-time-processing
QUESTION 18
You are designing an AI solution that will provide feedback to teachers who train students over the Internet.
The students will be in classrooms located in remote areas. The solution will capture video and audio data
of the students in the classrooms.
You need to recommend Azure Cognitive Services for the AI solution to meet the following requirements:
E. Video Indexer, Speech to Text, and Face API
Correct Answer: E
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Azure Video Indexer is a cloud application built on Azure Media Analytics, Azure Search, Cognitive Services
(such as the Face API, Microsoft Translator, the Computer Vision API, and Custom Speech Service). It
enables you to extract the insights from your videos using Video Indexer video and audio models.
Face API enables you to search, identify, and match faces in your private repository of up to 1 million
people.
The Face API now integrates emotion recognition, returning the confidence across a set of emotions for
each face in the image such as anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise.
These emotions are understood to be cross-culturally and universally communicated with particular facial
expressions.
Speech-to-text from Azure Speech Services enables real-time transcription
of audio streams into text that your applications, tools, or devices can consume, display, and take action on
as command input. This service is powered by the same recognition technology that Microsoft uses for
Cortana and Office products, and works seamlessly with the translation and text-to-speech.
Incorrect Answers:
Computer Vision and QnA Maker are not required.
References:
https://docs.microsoft.com/en-us/azure/media-services/video-indexer/video-indexer-overview
https://azure.microsoft.com/en-us/services/cognitive-services/face/
https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/speech-to-text
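For illustration, a minimal Python sketch of real-time transcription with the Speech SDK (azure-cognitiveservices-speech); the key and region values are placeholders:

import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(subscription="<speech-key>", region="westeurope")
recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config)

result = recognizer.recognize_once()     # captures a single utterance from the default microphone
if result.reason == speechsdk.ResultReason.RecognizedSpeech:
    print("Transcription:", result.text)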
QUESTION 19
You create an Azure Cognitive Services resource.
A developer needs to be able to retrieve the keys used by the resource. The solution must use the principle
of least privilege.
What is the best role to assign to the developer? More than one answer choice may achieve the goal.
A. Security Manager
B. Security Reader
C. Cognitive Services Contributor
D. Cognitive Services User
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
Explanation:
The Cognitive Services User role lets you read and list the keys of a Cognitive Services resource.
References:
https://docs.microsoft.com/en-us/azure/role-based-access-control/built-in-roles
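For illustration, a minimal Python sketch of what this role allows: listing the account keys through the management SDK (azure-mgmt-cognitiveservices). The subscription ID, resource group, and account name are placeholders, and the exact SDK call shown is an assumption to verify against the current library version.

from azure.identity import DefaultAzureCredential
from azure.mgmt.cognitiveservices import CognitiveServicesManagementClient

client = CognitiveServicesManagementClient(DefaultAzureCredential(), "<subscription-id>")
keys = client.accounts.list_keys("my-resource-group", "my-cognitive-account")
print(keys.key1, keys.key2)              # readable with the Cognitive Services User role; no write access is granted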
QUESTION 20
Your company plans to deploy an AI solution that processes IoT data in real-time.
You need to recommend a solution for the planned deployment that meets the following requirements:
Sustain up to 50 Mbps of events without throttling.
Retain data for 60 days.
A. Apache Kafka
B. Microsoft Azure IoT Hub
C. Microsoft Azure Data Factory
D. Microsoft Azure Machine Learning
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Apache Kafka is an open-source distributed streaming platform that can be used to build real-time
streaming data pipelines and applications.
References:
https://docs.microsoft.com/en-us/azure/hdinsight/kafka/apache-kafka-introduction
QUESTION 21
You are designing a solution that will use the Azure Content Moderator service to moderate user-generated
content.
You need to moderate custom predefined content without repeatedly scanning the collected content.
Which two APIs should you use? Each correct answer presents part of the solution.
Correct Answer: AC
Section: (none)
Explanation
Explanation/Reference:
Explanation:
The default global list of terms in Azure Content Moderator is sufficient for most content moderation needs.
However, you might need to screen for terms that are specific to your organization. For example, you might
want to tag competitor names for further review.
Use the List Management API to create custom lists of terms to use with the Text Moderation API. The Text
- Screen operation scans your text for profanity, and also compares text against custom and shared
blacklists.
C: Use Content Moderator’s machine-assisted image moderation and human-in-the-loop Review tool to
moderate images for adult and racy content.
Instead of moderating the same image multiple times, you add the offensive images to your custom list of
blocked content. That way, your content moderation system compares incoming images against your
custom lists and stops any further processing.
Incorrect Answers:
B: Use the Text Moderation API in Azure Content Moderator to scan your text content. The operation scans
your content for profanity, and compares the content against custom and shared blacklists.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/try-terms-list-api
https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/image-moderation-api
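For illustration, a rough Python sketch (requests) of screening text against a custom term list with the Text - Screen operation. The endpoint region, key, list ID, and sample text are placeholders, and the URL path and query parameters follow the Content Moderator REST pattern in the referenced documentation, so verify them for your API version.

import requests

endpoint = "https://westeurope.api.cognitive.microsoft.com"   # hypothetical region
key = "<content-moderator-key>"
list_id = "123"                           # custom term list built with the List Management API

resp = requests.post(
    f"{endpoint}/contentmoderator/moderate/v1.0/ProcessText/Screen",
    params={"language": "eng", "listId": list_id},
    headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "text/plain"},
    data="Sample user-generated text mentioning a competitor name.",
)
print(resp.json())                        # matched custom terms are returned in the response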
Integrate AI models into solutions
Testlet 2
Overview
Contoso, Ltd. has an office in New York to serve its North American customers and an office in Paris to
serve its European customers.
Existing Environment
Infrastructure
Each office has a small data center that hosts Active Directory services and a few off-the-shelf software
solutions used by internal users.
The network contains a single Active Directory forest that contains a single domain named contoso.com.
Azure Active Directory (Azure AD) Connect is used to extend identity management to Azure.
The company has an Azure subscription. Each office has an Azure ExpressRoute connection to the
subscription. The New York office connects to a virtual network hosted in the US East 2 Azure region. The
Paris office connects to a virtual network hosted in the West Europe Azure region.
The New York office has an Azure Stack Development Kit (ASDK) deployment that is used for development
and testing.
Contoso has a web app named Bookings hosted in an App Service Environment (ASE). The ASE is in the
virtual network in the East US 2 region. Contoso employees and customers use Bookings to reserve hotel
rooms.
Data Environment
Bookings connects to a Microsoft SQL Server database named hotelDB in the New York office.
The database has a view named vwAvailability that consolidates columns from the tables named Hotels,
Rooms, and RoomAvailability. The database contains data that was collected during the last 20 years.
Problem Statements
Contoso identifies the following issues with its current business model:
European users report that access to Bookings is slow, and they lose customers who must wait on the
phone while they search for available rooms.
Users report that Bookings was unavailable during an outage in the New York data center for more than
24 hours.
Requirements
Business Goals
Contoso wants to provide a new version of the Bookings app that will provide a highly available, reliable
service for booking travel packages by interacting with a chatbot named Butler.
Technical requirements
Contoso identifies the following technical requirements:
QUESTION 1
Which two services should be implemented so that Butler can find available rooms based on the technical
requirements? Each correct answer presents part of the solution.
A. QnA Maker
B. Bing Entity Search
C. Language Understanding (LUIS)
D. Azure Search
E. Content Moderator
Correct Answer: AC
Section: (none)
Explanation
Explanation/Reference:
References:
https://azure.microsoft.com/en-in/services/cognitive-services/language-understanding-intelligent-service/
QUESTION 2
DRAG DROP
You need to integrate the new Bookings app and the Butler chatbot.
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list
of actions to the answer area and arrange them in the correct order.
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-service-channel-connect-webchat?view=azure-bot-service-4.0
QUESTION 3
You need to meet the greeting requirements for Butler.
A. AdaptiveCard
B. SigninCard
C. CardCarousel
D. HeroCard
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Scenario: Butler must greet users by name when they first connect.
HeroCard defines a card with a large image, title, text, and action buttons.
Incorrect Answers:
B: SigninCard defines a card that lets a user sign in to a service.
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-send-welcome-message
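For illustration, a minimal Python sketch (botbuilder-core and botbuilder-schema) of greeting a newly connected user by name with a HeroCard; the image URL and button are placeholders:

from botbuilder.core import CardFactory, MessageFactory, TurnContext
from botbuilder.schema import ActionTypes, CardAction, CardImage, HeroCard

async def greet_member(turn_context: TurnContext, member_name: str):
    card = HeroCard(
        title=f"Welcome to Bookings, {member_name}!",
        text="Butler can help you book travel packages.",
        images=[CardImage(url="https://example.com/butler.png")],               # hypothetical image
        buttons=[CardAction(type=ActionTypes.im_back, title="Get started", value="help")],
    )
    await turn_context.send_activity(MessageFactory.attachment(CardFactory.hero_card(card)))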
Deploy and manage solutions
Question Set 1
QUESTION 1
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
A. Yes
B. No
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
You need to enable Model data collection.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-enable-data-collection
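For illustration, a minimal Python sketch of what enabling model data collection looks like inside the scoring script (azureml-monitoring). The model name, feature names, and the parse/model helpers are placeholders, and collect_model_data=True must also be set on the web service deployment configuration.

from azureml.monitoring import ModelDataCollector

inputs_dc = ModelDataCollector("my-model", designation="inputs",
                               feature_names=["feature1", "feature2"])
predictions_dc = ModelDataCollector("my-model", designation="predictions",
                                    feature_names=["prediction"])

def run(raw_data):
    data = parse(raw_data)                # hypothetical parsing helper
    result = model.predict(data)          # model object loaded in init()
    inputs_dc.collect(data)               # written to the workspace storage account
    predictions_dc.collect(result)
    return result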
QUESTION 2
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
A. Yes
B. No
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
You need to enable Model data collection.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-enable-data-collection
QUESTION 3
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
A. Yes
B. No
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
You need to enable Model data collection.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-enable-data-collection
QUESTION 4
Your company has recently purchased and deployed 25,000 IoT devices.
You need to recommend a data analysis solution for the devices that meets the following requirements:
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
Explanation:
An IoT hub has a default built-in endpoint. You can create custom endpoints to route messages to by
linking other services in your subscription to the hub.
Individual devices connect using credentials stored in the IoT hub's identity registry.
References:
https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-security
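For illustration, a minimal Python sketch (azure-iot-device) of a device authenticating with credentials from the identity registry and sending telemetry to the hub; the connection string and payload are placeholders:

from azure.iot.device import IoTHubDeviceClient, Message

conn_str = "HostName=<hub-name>.azure-devices.net;DeviceId=<device-id>;SharedAccessKey=<key>"
client = IoTHubDeviceClient.create_from_connection_string(conn_str)

client.connect()
client.send_message(Message('{"temperature": 21.5}'))   # delivered to the built-in endpoint or a custom route
client.disconnect()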
QUESTION 5
You create an Azure Machine Learning Studio experiment.
You need to ensure that you can consume the web service from Microsoft Excel spreadsheets.
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Steps to add a new web service:
1. Deploy a web service or use an existing web service.
2. Click Consume.
3. Look for the Basic consumption info section. Copy and save the Primary Key and the Request-
Response URL.
4. In Excel, go to the Web Services section (if you are in the Predict section, click the back arrow to go to
the list of web services).
5. Click Add Web Service.
6. Paste the URL into the Excel add-in text box labeled URL.
7. Paste the API/Primary key into the text box labeled API key.
8. Click Add.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/studio/excel-add-in-for-web-services
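For illustration, a minimal Python sketch (requests) that calls the same Request-Response endpoint the Excel add-in consumes. The URL, key, and the input name and columns are placeholders taken from the web service's Consume page, so adjust them to your experiment's schema.

import requests

url = "<request-response-url>"            # from the Consume tab
api_key = "<primary-key>"

payload = {
    "Inputs": {"input1": [{"feature1": 1.0, "feature2": 2.0}]},
    "GlobalParameters": {},
}
resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {api_key}", "Content-Type": "application/json"},
    json=payload,
)
print(resp.json())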
QUESTION 6
DRAG DROP
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list
of actions to the answer area and arrange them in the correct order.
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
The deployment workflow includes the following steps:
1. Register the model in a registry hosted in your Azure Machine Learning Service workspace
2. Register an image that pairs a model with a scoring script and dependencies in a portable container
3. Deploy the image as a web service in the cloud or to edge devices
4. Monitor and collect data
5. Update a deployment to use a new image.
References:
https://docs.microsoft.com/bs-latn-ba/azure/machine-learning/service/concept-model-management-and-deployment#step-3-deploy-image
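For illustration, a minimal Python sketch (azureml-core) of the register-and-deploy workflow above; the workspace configuration, model file, scoring script, and environment file are placeholders:

from azureml.core import Environment, Workspace
from azureml.core.model import InferenceConfig, Model
from azureml.core.webservice import AciWebservice

ws = Workspace.from_config()

# Step 1: register the trained model in the workspace registry
model = Model.register(workspace=ws, model_path="outputs/model.pkl", model_name="demand-model")

# Step 2: pair the model with a scoring script and its dependencies
env = Environment.from_conda_specification("demand-env", "environment.yml")
inference_config = InferenceConfig(entry_script="score.py", environment=env)

# Step 3: deploy as a web service (ACI here; AKS for production workloads)
service = Model.deploy(ws, "demand-service", [model], inference_config,
                       AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1))
service.wait_for_deployment(show_output=True)

# Steps 4-5: monitor the service, then call service.update(...) to roll out a new image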
QUESTION 7
You are building an Azure Analysis Services cube for your AI deployment.
The source data for the cube is located in an on-premises network in a Microsoft SQL Server database.
You need to ensure that the Azure Analysis Services service can access the source data.
A. a site-to-site VPN
B. a data gateway
C. Azure Data Factory
D. a network gateway
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
From April 2017 onward, you can use the On-premises Data Gateway for Azure Analysis Services. This
means you can connect tabular models hosted in Azure Analysis Services to your on-premises data
sources through the On-premises Data Gateway.
References:
https://biinsight.com/on-premises-data-gateway-for-azure-analysis-services/
QUESTION 8
DRAG DROP
You develop a custom application that uses a token to connect to Azure Cognitive Services resources.
A new security policy requires that all access keys are changed every 30 days.
Which three actions should you recommend be performed every 30 days? To answer, move the
appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/authentication
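For illustration, a minimal Python sketch (requests) of the token flow described in the referenced documentation: the application exchanges the current subscription key for a short-lived access token at the regional issueToken endpoint, which is why the stored key must be replaced whenever the keys are regenerated. The region and key values are placeholders.

import requests

region = "westeurope"                                    # hypothetical resource region
subscription_key = "<current-cognitive-services-key>"    # must be updated after each key rotation

resp = requests.post(
    f"https://{region}.api.cognitive.microsoft.com/sts/v1.0/issueToken",
    headers={"Ocp-Apim-Subscription-Key": subscription_key},
)
token = resp.text                                        # short-lived bearer token used by the application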
QUESTION 9
DRAG DROP
You use an Azure key vault to store credentials for several Azure Machine Learning applications.
You need to configure the key vault to meet the following requirements:
Ensure that the IT security team can add new passwords and periodically change the passwords.
Ensure that the applications can securely retrieve the passwords for the applications.
Use the principle of least privilege.
Which permissions should you grant? To answer, drag the appropriate permissions to the correct targets.
Each permission may be used once, more than once, or not at all. You may need to drag the split bar
between panes or scroll to view content.
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Incorrect Answers:
Not the Keys permissions, as keys are used for encryption only.
References:
https://docs.microsoft.com/en-us/azure/key-vault/key-vault-secure-your-key-vault
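For illustration, a minimal Python sketch (azure-keyvault-secrets) of the application side, which needs only the Get permission on secrets to retrieve its password; the vault URL and secret name are placeholders:

from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

client = SecretClient(vault_url="https://my-vault.vault.azure.net",
                      credential=DefaultAzureCredential())

password = client.get_secret("ml-app-password").value   # fails unless Get is granted on secrets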
QUESTION 10
A data scientist deploys a deep learning model on an Fsv2 virtual machine.
You need to recommend which virtual machine series the data scientist must use to ensure that data
analysis occurs as quickly as possible.
A. ND
B. B
C. DC
D. Ev3
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
The N-series is a family of Azure Virtual Machines with GPU capabilities. GPUs are ideal for compute and
graphics-intensive workloads, helping customers to fuel innovation through scenarios like high-end remote
visualisation, deep learning and predictive analytics.
The ND-series is focused on training and inference scenarios for deep learning. It uses the NVIDIA Tesla
P40 GPUs. The latest version - NDv2 - features the NVIDIA Tesla V100 GPUs.
References:
https://azure.microsoft.com/en-in/pricing/details/virtual-machines/series/
QUESTION 11
DRAG DROP
You are designing a solution that uses drones to monitor remote locations for anomalies. The drones have
Azure IoT Edge devices. The solution must meet the following requirements:
Email a user the picture and location of an anomaly when an anomaly is detected.
Use a video stream to detect anomalies at the location.
Send the pictures and location information to Azure.
Use the least amount of code possible.
You develop a custom vision Azure Machine Learning module to detect the anomalies.
Which service should you use for each requirement? To answer, drag the appropriate services to the
correct requirements. Each service may be used once, more than once, or not at all. You may need to drag
the split bar between panes or scroll to view content.
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Explanation:
You configure the Remote Monitoring solution to respond to anomalies detected by an IoT Edge device. IoT
Edge devices let you process telemetry at the edge to reduce the volume of telemetry sent to the solution
and to enable faster responses to events on devices.
References:
https://docs.microsoft.com/en-us/azure/iot-accelerators/iot-accelerators-remote-monitoring-edge
QUESTION 12
You have Azure IoT Edge devices that generate measurement data from temperature sensors. The data
changes very slowly.
You need to analyze the data in a temporal two-minute window. If the temperature rises five degrees above
a limit, an alert must be raised. The solution must minimize the development of custom code.
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/iot-edge/tutorial-deploy-stream-analytics
QUESTION 13
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You are deploying an Azure Machine Learning model to an Azure Kubernetes Service (AKS) container.
You need to monitor the scoring accuracy of each run of the model.
A. Yes
B. No
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
QUESTION 14
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You are deploying an Azure Machine Learning model to an Azure Kubernetes Service (AKS) container.
You need to monitor the scoring accuracy of each run of the model.
A. Yes
B. No
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
QUESTION 15
You need to create a new app that will consume resources from the following Azure Cognitive Services
APIs:
Face API
Bing Search
Text Analytics
Translator Text
Language Understanding (LUIS)
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
After creating a Cognitive Service resource in the Azure portal, you'll get an endpoint and a key for
authenticating your applications.
References:
https://docs.microsoft.com/en-us/azure/cognitive-services/cognitive-services-apis-create-account
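For illustration, a minimal Python sketch (requests) showing that the single endpoint and key from the multi-service resource authenticate calls to the individual APIs. The resource name and key are placeholders, and the Text Analytics path and version shown are assumptions to check against the current documentation.

import requests

endpoint = "https://<resource-name>.cognitiveservices.azure.com"
key = "<cognitive-services-key>"

resp = requests.post(
    f"{endpoint}/text/analytics/v2.1/sentiment",         # assumed path; the other listed APIs accept the same key
    headers={"Ocp-Apim-Subscription-Key": key, "Content-Type": "application/json"},
    json={"documents": [{"id": "1", "language": "en", "text": "The hotel was wonderful."}]},
)
print(resp.json())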
QUESTION 16
The development team at your company builds a bot by using C# and .NET.
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
The deployment process documented here uses one of the ARM templates to provision required resources
for the bot in Azure by using the Azure CLI.
Note: When you create a bot by using the Visual Studio template, the Yeoman template, or the
Cookiecutter template, the generated source code includes a deploymentTemplates folder that contains
ARM templates.
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-builder-deploy-az-cli
QUESTION 17
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You are deploying an Azure Machine Learning model to an Azure Kubernetes Service (AKS) container.
You need to monitor the scoring accuracy of each run of the model.
A. Yes
B. No
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-enable-data-collection
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-enable-app-insights
QUESTION 18
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
A. Yes
B. No
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
You need to enable Model data collection.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-enable-data-collection
QUESTION 19
You deploy an Azure bot.
You need to collect Key Performance Indicator (KPI) data from the bot. The type of data includes:
A. Bot analytics
B. Azure Monitor
C. Azure Analysis Services
D. Azure Application Insights
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-service-manage-analytics?view=azure-bot-service-4.0
QUESTION 20
Your company develops an API application that is orchestrated by using Kubernetes.
You need to deploy the application.
Which three actions should you perform? Each correct answer presents part of the solution.
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/aks/tutorial-kubernetes-prepare-app
QUESTION 21
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You have Azure IoT Edge devices that generate streaming data.
On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once
an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.
Solution: You deploy an Azure Machine Learning model as an IoT Edge module.
Does this meet the goal?
A. Yes
B. No
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
You can use IoT Edge modules to deploy code that implements your business logic directly to your IoT
Edge devices. For example, you can deploy an Azure Machine Learning module that predicts when a
device fails based on simulated machine temperature data.
References:
https://docs.microsoft.com/bs-latn-ba/azure/iot-edge/tutorial-deploy-machine-learning
QUESTION 22
You plan to deploy a global healthcare app named App1 to Azure.
App1 will use Azure Cognitive Services APIs. Users in Germany, Canada, and the United States will
connect to App1.
You need to recommend an app deployment solution to ensure that all the personal data of the users
remains in their country of origin only.
Which three Azure services should you recommend deploying to each Azure region? Each correct answer
presents part of the solution.
Explanation/Reference:
References:
https://github.com/microsoft/computerscience/blob/master/Labs/Azure%20Services/Azure%20Storage/Azure%20Storage%20and%20Cognitive%20Services%20(MVC).md
QUESTION 23
DRAG DROP
You are designing an AI solution that will use IoT devices to gather data from conference attendees and
then analyze the data. The IoT device will connect to an Azure IoT hub.
You need to ensure that data contains no personally identifiable information before it is sent to the IoT hub.
Which three actions should you perform in sequence? To answer, move the appropriate actions from the
list of actions to the answer area and arrange them in the correct order.
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
Note:
ASA Edge jobs run in containers deployed to Azure IoT Edge devices. They are composed of two parts:
1. A cloud part that is responsible for job definition: users define inputs, output, query, and other settings
(out of order events, etc.) in the cloud.
2. A module running on your IoT devices. It contains the ASA engine and receives the job definition from
the cloud.
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-edge
QUESTION 24
You have an Azure Machine Learning experiment that must comply with GDPR regulations.
You need to track compliance of the experiment and store documentation about the experiment.
D. Compliance Manager
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
References:
https://azure.microsoft.com/en-us/blog/new-capabilities-to-enable-robust-gdpr-compliance/
QUESTION 25
You are developing an application that will perform optical character recognition of photos of medical
logbooks.
You need to recommend a solution to validate the data against a validated set of records.
Correct Answer: D
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/sql/master-data-services/validation-master-data-services?view=sql-server-2017
QUESTION 26
You are designing an AI application that will perform real-time processing by using Microsoft Azure Stream
Analytics.
What are three possible outputs? Each correct answer presents a complete solution.
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-define-outputs
QUESTION 27
Your company has an Azure subscription that contains an Azure Active Directory (Azure AD) tenant.
Azure AD contains 500 user accounts for your company’s employees. Some temporary employees do NOT
have user accounts in Azure AD.
You are designing a storage solution for video files and metadata files.
You need to recommend an authentication solution to provide links to the video files. The solution must
provide access to each file for only five minutes.
Correct Answer: C
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1
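For illustration, a minimal Python sketch (azure-storage-blob) that generates a read-only SAS link for a video file that expires after five minutes; the account, container, blob, and key values are placeholders:

from datetime import datetime, timedelta
from azure.storage.blob import BlobSasPermissions, generate_blob_sas

sas = generate_blob_sas(
    account_name="mediastore",
    container_name="videos",
    blob_name="training-session.mp4",
    account_key="<storage-account-key>",
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(minutes=5),     # link stops working after 5 minutes
)
url = f"https://mediastore.blob.core.windows.net/videos/training-session.mp4?{sas}"
print(url)                                               # shareable with users who have no Azure AD account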
QUESTION 28
DRAG DROP
You have a container image that contains an AI solution. The solution will be used on demand and will only
be needed a few hours each month.
You need to recommend the deployment process. The solution must minimize costs.
Which four actions should you recommend Azure Functions perform in sequence? To answer, move the
appropriate actions from the list of actions to the answer area and arrange them in the correct order.
Correct Answer:
Section: (none)
Explanation
Explanation/Reference:
QUESTION 29
Your company plans to implement an AI solution that will analyze data from IoT devices.
Data from the devices will be analyzed in real time. The results of the analysis will be stored in a SQL
database.
You need to recommend a data processing solution that uses the Transact-SQL language.
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
References:
https://www.linkedin.com/pulse/getting-started-azure-iot-services-stream-analytics-rob-tiffany
QUESTION 30
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You have Azure IoT Edge devices that generate streaming data.
On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once
an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.
A. Yes
B. No
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Instead, use Azure Stream Analytics and the REST API.
Note: Available in both the cloud and Azure IoT Edge, Azure Stream Analytics offers built-in machine
learning based anomaly detection capabilities that can be used to monitor the two most commonly
occurring anomalies: temporary and persistent.
Stream Analytics supports user-defined functions, via REST API, that call out to Azure Machine Learning
endpoints.
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-machine-learning-anomaly-detection
QUESTION 31
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You have Azure IoT Edge devices that generate streaming data.
On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once
an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.
A. Yes
B. No
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Available in both the cloud and Azure IoT Edge, Azure Stream Analytics offers built-in machine learning
based anomaly detection capabilities that can be used to monitor the two most commonly occurring
anomalies: temporary and persistent.
Stream Analytics supports user-defined functions, via REST API, that call out to Azure Machine Learning
endpoints.
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-machine-learning-anomaly-detection
QUESTION 32
Note: This question is part of a series of questions that present the same scenario. Each question
in the series contains a unique solution that might meet the stated goals. Some question sets might
have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will
not appear in the review screen.
You have Azure IoT Edge devices that generate streaming data.
On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once
an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.
A. Yes
B. No
Correct Answer: B
Section: (none)
Explanation
Explanation/Reference:
Explanation:
Instead, use Azure Stream Analytics and the REST API.
Note: Available in both the cloud and Azure IoT Edge, Azure Stream Analytics offers built-in machine
learning based anomaly detection capabilities that can be used to monitor the two most commonly
occurring anomalies: temporary and persistent.
Stream Analytics supports user-defined functions, via REST API, that call out to Azure Machine Learning
endpoints.
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-machine-learning-anomaly-detection
Deploy and manage solutions
Testlet 2
Overview
Contoso, Ltd. has an office in New York to serve its North American customers and an office in Paris to
serve its European customers.
Existing Environment
Infrastructure
Each office has a small data center that hosts Active Directory services and a few off-the-shelf software
solutions used by internal users.
The network contains a single Active Directory forest that contains a single domain named contoso.com.
Azure Active Directory (Azure AD) Connect is used to extend identity management to Azure.
The company has an Azure subscription. Each office has an Azure ExpressRoute connection to the
subscription. The New York office connects to a virtual network hosted in the US East 2 Azure region. The
Paris office connects to a virtual network hosted in the West Europe Azure region.
The New York office has an Azure Stack Development Kit (ASDK) deployment that is used for development
and testing.
Contoso has a web app named Bookings hosted in an App Service Environment (ASE). The ASE is in the
virtual network in the East US 2 region. Contoso employees and customers use Bookings to reserve hotel
rooms.
Data Environment
Bookings connects to a Microsoft SQL Server database named hotelDB in the New York office.
The database has a view named vwAvailability that consolidates columns from the tables named Hotels,
Rooms, and RoomAvailability. The database contains data that was collected during the last 20 years.
Problem Statements
Contoso identifies the following issues with its current business model:
European users report that access to Bookings is slow, and they lose customers who must wait on the
phone while they search for available rooms.
Users report that Bookings was unavailable during an outage in the New York data center for more than 24
hours.
Requirements
Business Goals
Contoso wants to provide a new version of the Bookings app that will provide a highly available, reliable
service for booking travel packages by interacting with a chatbot named Butler.
Technical requirements
Contoso identifies the following technical requirements:
QUESTION 1
Which RBAC role should you assign to the KeyManagers group?
Correct Answer: A
Section: (none)
Explanation
Explanation/Reference:
References:
https://docs.microsoft.com/en-us/azure/role-based-access-control/built-in-roles