DP-203 Exam Question 1

You have an enterprise data warehouse in Azure Synapse Analytics named DW1 on a server named Server1.
You need to verify whether the size of the transaction log file for each distribution of DW1 is smaller than 160 GB.
What should you do?
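For context, per-distribution transaction log size in a dedicated SQL pool is typically checked by querying the `sys.dm_pdw_nodes_os_performance_counters` DMV. The threshold check itself can be sketched in Python; the sizes and distribution IDs below are invented example values, not data from DW1:

```python
# Hypothetical sketch: flag any distribution whose transaction log is
# NOT smaller than 160 GB. In practice the per-distribution sizes would
# come from sys.dm_pdw_nodes_os_performance_counters on DW1.

GB = 1024 ** 3
THRESHOLD_BYTES = 160 * GB

def oversized_distributions(log_sizes_bytes):
    """Return distribution IDs whose log file is 160 GB or larger."""
    return [dist_id for dist_id, size in log_sizes_bytes.items()
            if size >= THRESHOLD_BYTES]

# Invented example: distribution 7 is at exactly 160 GB and is flagged.
sizes = {1: 20 * GB, 7: 160 * GB, 42: 158 * GB}
print(oversized_distributions(sizes))  # [7]
```

The `>=` comparison matters: the requirement is that each log is *smaller than* 160 GB, so a log at exactly 160 GB fails the check.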
DP-203 Exam Question 2

You have an Azure Stream Analytics query. The query returns a result set that contains 10,000 distinct values for a column named clusterID.
You monitor the Stream Analytics job and discover high latency.
You need to reduce the latency.
Which two actions should you perform? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
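Scenarios like this usually come down to parallelism: repartitioning the input on clusterID (Stream Analytics' PARTITION BY) and scaling out streaming units so partitions are processed concurrently. The partitioning idea can be sketched generically in Python; the partition count and event shape are invented for illustration:

```python
import zlib
from collections import defaultdict

def partition_id(cluster_id: str, partition_count: int = 16) -> int:
    """Deterministically map a clusterID to a partition by hashing."""
    return zlib.crc32(cluster_id.encode()) % partition_count

def group_by_partition(events, partition_count=16):
    """Bucket events (dicts with a 'clusterID' key) so each bucket can be
    processed independently -- the idea behind PARTITION BY clusterID."""
    buckets = defaultdict(list)
    for ev in events:
        buckets[partition_id(ev["clusterID"], partition_count)].append(ev)
    return buckets
```

Because the hash is deterministic, all events for a given clusterID land in the same partition, so per-key aggregation stays correct while the 10,000 keys are spread across parallel workers.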
DP-203 Exam Question 3

You plan to create a real-time monitoring app that alerts users when a device travels more than 200 meters away from a designated location.
You need to design an Azure Stream Analytics job to process the data for the planned app. The solution must minimize the amount of code developed and the number of technologies used.
What should you include in the Stream Analytics job? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
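The core of this scenario is a geofence check, which Stream Analytics exposes through built-in geospatial functions such as ST_DISTANCE. The underlying distance test can be illustrated in Python with the haversine formula; all coordinates below are invented for illustration:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two (lat, lon) points."""
    r = 6371000.0  # mean Earth radius in meters
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def should_alert(device, home, limit_m=200.0):
    """Alert when the device is more than limit_m meters from home."""
    return haversine_m(*device, *home) > limit_m

# Invented coordinates: a point roughly 750 m east of the designated spot.
home = (47.6062, -122.3321)
print(should_alert((47.6062, -122.3221), home))
```

In the actual job this comparison would be a single WHERE clause over the streaming input, which is what keeps the amount of custom code minimal.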

DP-203 Exam Question 4

You are designing an Azure Synapse solution that will provide a query interface for the data stored in an Azure Storage account. The storage account is only accessible from a virtual network.
You need to recommend an authentication mechanism to ensure that the solution can access the source data.
What should you recommend?
DP-203 Exam Question 5

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Storage account that contains 100 GB of files. The files contain rows of text and numerical values. 75% of the rows contain description data that has an average length of 1.1 MB.
You plan to copy the data from the storage account to an enterprise data warehouse in Azure Synapse Analytics.
You need to prepare the files to ensure that the data copies quickly.
Solution: You copy the files to a table that has a columnstore index.
Does this meet the goal?