DP-203 Exam Question 106

You are monitoring an Azure Stream Analytics job.
The Backlogged Input Events count has been 20 for the last hour.
You need to reduce the Backlogged Input Events count.
What should you do?
DP-203 Exam Question 107

You use Azure Data Lake Storage Gen2 to store data that data scientists and data engineers will query by using Azure Databricks interactive notebooks. Users will have access only to the Data Lake Storage folders that relate to the projects on which they work.
You need to recommend which authentication methods to use for Databricks and Data Lake Storage to provide the users with the appropriate access. The solution must minimize administrative effort and development effort.
Which authentication method should you recommend for each Azure service? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
DP-203 Exam Question 108

You are developing a solution by using a Lambda architecture on Microsoft Azure.
The data at rest layer must meet the following requirements:
Data storage:
* Serve as a repository for high volumes of large files in various formats.
* Implement optimized storage for big data analytics workloads.
* Ensure that data can be organized by using a hierarchical structure.
Batch processing:
* Use a managed solution for in-memory computation processing.
* Natively support Scala, Python, and R programming languages.
* Provide the ability to resize and terminate the cluster automatically.
Analytical data store:
* Support parallel processing.
* Use columnar storage.
* Support SQL-based languages.
You need to identify the correct technologies to build the Lambda architecture.
Which technologies should you use? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.

DP-203 Exam Question 109

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Synapse Analytics dedicated SQL pool that contains a table named Table1.
You have files that are ingested and loaded into an Azure Data Lake Storage Gen2 container named container1.
You plan to insert data from the files in container1 into Table1 and transform the data. Each row of data in the files will produce one row in the serving layer of Table1.
You need to ensure that when the source data files are loaded to container1, the DateTime is stored as an additional column in Table1.
Solution: You use an Azure Synapse Analytics serverless SQL pool to create an external table that has an additional DateTime column.
Does this meet the goal?
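For reference, the proposed solution involves a serverless SQL pool external table. A minimal sketch of what such a definition might look like is shown below; the table, column, data source, and file format names are all hypothetical, and this sketch does not imply that the proposed solution meets the stated goal:

```sql
-- Hypothetical sketch: an external table over the files in container1.
-- Note: an external table's column list must match the schema of the
-- underlying files; a load DateTime is not populated automatically.
CREATE EXTERNAL TABLE dbo.Table1_Ext
(
    Col1         VARCHAR(100),
    Col2         INT,
    LoadDateTime DATETIME2      -- additional DateTime column
)
WITH (
    LOCATION    = '/data/',
    DATA_SOURCE = Container1DataSource,  -- hypothetical external data source
    FILE_FORMAT = ParquetFileFormat      -- hypothetical file format
);
```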
DP-203 Exam Question 110

You need to integrate the on-premises data sources and Azure Synapse Analytics. The solution must meet the data integration requirements.
Which type of integration runtime should you use?