DP-200 Exam Question 166

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
A company uses Azure Data Lake Storage Gen1 to store big data related to consumer behavior.
You need to implement logging.
Solution: Create an Azure Automation runbook to copy events.
Does the solution meet the goal?
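
For context on what implementing logging for an Azure Data Lake Storage Gen1 account typically involves, the sketch below enables diagnostic logging through an Azure Monitor diagnostic setting using the Python SDK. All resource IDs and names are placeholders, the exact SDK surface may vary by package version, and the sketch does not indicate whether the proposed runbook solution meets the goal.

    # Minimal sketch: route an ADLS Gen1 account's Audit and Requests logs to a
    # Log Analytics workspace via an Azure Monitor diagnostic setting.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.monitor import MonitorManagementClient

    client = MonitorManagementClient(DefaultAzureCredential(), "<subscription-id>")

    # Full ARM resource ID of the Data Lake Storage Gen1 account (placeholder).
    adls_resource_id = (
        "/subscriptions/<subscription-id>/resourceGroups/<resource-group>"
        "/providers/Microsoft.DataLakeStore/accounts/<adls-account>"
    )

    client.diagnostic_settings.create_or_update(
        resource_uri=adls_resource_id,
        name="adls-logging",
        parameters={
            "workspace_id": "<log-analytics-workspace-resource-id>",
            "logs": [
                {"category": "Audit", "enabled": True},
                {"category": "Requests", "enabled": True},
            ],
        },
    )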
DP-200 Exam Question 167

Your company uses Microsoft Azure SQL Database configured with an elastic pool. You use Elastic Database jobs to run queries across all databases in the pool.
You need to analyze, troubleshoot, and report on the components responsible for running Elastic Database jobs.
You need to determine the component responsible for running job service tasks.
Which component should you use for each Elastic Database job service task? To answer, drag the appropriate component to the correct task. Each component may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
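
As background on how jobs of this kind are defined and run across all databases in a target group, the sketch below uses the Elastic Job agent's T-SQL interface (the jobs.* stored procedures in the job database) from Python. Connection details, the credential, the target group, and the job name are placeholders, and target-group setup is omitted.

    # Minimal sketch: define a job with one step and start it through the
    # Elastic Job agent's job database.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=<job-agent-server>.database.windows.net;"
        "DATABASE=<job-database>;UID=<user>;PWD=<password>",
        autocommit=True,
    )
    cur = conn.cursor()

    # Create the job, add a step that runs against every database in the
    # target group, then start it.
    cur.execute("EXEC jobs.sp_add_job @job_name = N'CollectStats';")
    cur.execute("""
        EXEC jobs.sp_add_jobstep
            @job_name = N'CollectStats',
            @command = N'UPDATE STATISTICS dbo.ConsumerBehavior;',
            @credential_name = N'job_credential',
            @target_group_name = N'PoolDatabases';
    """)
    cur.execute("EXEC jobs.sp_start_job @job_name = N'CollectStats';")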

DP-200 Exam Question 168

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You need to implement diagnostic logging for Data Warehouse monitoring.
Which log should you use?
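
For context on what Data Warehouse monitoring surfaces look like, the sketch below reads recent requests from the sys.dm_pdw_exec_requests dynamic management view from Python. Connection details are placeholders, and the sketch is not intended to indicate the answer to the question.

    # Minimal sketch: list the ten most recently submitted requests on an Azure
    # SQL Data Warehouse using the sys.dm_pdw_exec_requests DMV.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=<dw-server>.database.windows.net;DATABASE=<data-warehouse>;"
        "UID=<user>;PWD=<password>"
    )
    cursor = conn.cursor()
    cursor.execute("""
        SELECT TOP 10 request_id, status, command, total_elapsed_time
        FROM sys.dm_pdw_exec_requests
        ORDER BY submit_time DESC;
    """)
    for row in cursor.fetchall():
        print(row.request_id, row.status, row.total_elapsed_time)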
DP-200 Exam Question 169

You are a data architect. The data engineering team needs to configure synchronization of data from an on-premises Microsoft SQL Server database to Azure SQL Database.
Ad hoc and reporting queries are overutilizing the on-premises production instance. The synchronization process must:
Perform an initial data synchronization to Azure SQL Database with minimal downtime
Perform bi-directional data synchronization after initial synchronization
You need to implement this synchronization solution.
Which synchronization method should you use?
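
One service that supports bi-directional synchronization between on-premises SQL Server and Azure SQL Database is Azure SQL Data Sync. The sketch below creates a sync group on the Azure SQL hub database with the azure-mgmt-sql package; the method names reflect recent SDK versions and, along with all resource names, should be treated as assumptions. The on-premises member database is registered separately through a Data Sync agent.

    # Minimal sketch: create a Data Sync sync group on the Azure SQL hub
    # database. All names and IDs are placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.sql import SqlManagementClient

    client = SqlManagementClient(DefaultAzureCredential(), "<subscription-id>")

    poller = client.sync_groups.begin_create_or_update(
        resource_group_name="<resource-group>",
        server_name="<azure-sql-server>",
        database_name="<hub-database>",
        sync_group_name="consumer-sync",
        parameters={
            "interval": 300,                         # sync every 5 minutes
            "conflict_resolution_policy": "HubWin",  # or "MemberWin"
            "sync_database_id": "<sync-metadata-database-resource-id>",
            "hub_database_user_name": "<sql-login>",
            "hub_database_password": "<password>",
        },
    )
    sync_group = poller.result()
    print(sync_group.sync_state)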
DP-200 Exam Question 170

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution. Determine whether the solution meets the stated goals.
You develop a data ingestion process that will import data to a Microsoft Azure SQL Data Warehouse. The data to be ingested resides in Parquet files stored in an Azure Data Lake Storage Gen2 account.
You need to load the data from the Azure Data Lake Storage Gen2 account into the Azure SQL Data Warehouse.
Solution:
1. Use Azure Data Factory to convert the Parquet files to CSV files
2. Create an external data source pointing to the Azure storage account
3. Create an external file format and an external table using the external data source
4. Load the data using the INSERT...SELECT statement
Does the solution meet the goal?
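
For reference, steps 2-4 of the proposed solution correspond to the PolyBase external-table pattern in Azure SQL Data Warehouse. The sketch below runs that T-SQL from Python; object names, the storage path, the table schema, and the database-scoped credential are placeholders, and the target table dbo.ConsumerBehavior is assumed to already exist. It illustrates the steps only and does not indicate whether the solution meets the goal.

    # Minimal sketch: steps 2-4 of the proposed solution as T-SQL run from Python.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=<dw-server>.database.windows.net;DATABASE=<data-warehouse>;"
        "UID=<user>;PWD=<password>",
        autocommit=True,
    )
    cur = conn.cursor()

    # Step 2: external data source pointing at the ADLS Gen2 account
    # (a database-scoped credential named ADLSCredential is assumed to exist).
    cur.execute("""
        CREATE EXTERNAL DATA SOURCE AzureDataLakeStore
        WITH (
            TYPE = HADOOP,
            LOCATION = 'abfss://<container>@<account>.dfs.core.windows.net',
            CREDENTIAL = ADLSCredential
        );
    """)

    # Step 3: external file format for the converted CSV files, plus an
    # external table over them.
    cur.execute("""
        CREATE EXTERNAL FILE FORMAT CsvFormat
        WITH (
            FORMAT_TYPE = DELIMITEDTEXT,
            FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 2)
        );
    """)
    cur.execute("""
        CREATE EXTERNAL TABLE dbo.ConsumerBehaviorExternal (
            EventId INT,
            EventDate DATE,
            EventValue NVARCHAR(200)
        )
        WITH (
            LOCATION = '/converted-csv/',
            DATA_SOURCE = AzureDataLakeStore,
            FILE_FORMAT = CsvFormat
        );
    """)

    # Step 4: load the data into an internal table with INSERT...SELECT.
    cur.execute("""
        INSERT INTO dbo.ConsumerBehavior
        SELECT * FROM dbo.ConsumerBehaviorExternal;
    """)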