Online Access Free 70-775 Exam Questions

Exam Code: 70-775
Exam Name: Perform Data Engineering on Microsoft Azure HDInsight
Certification Provider: Microsoft
Free Question Number: 63
Posted: Sep 04, 2025

Question 1

You have an Apache Interactive Hive cluster in Azure HDInsight. The cluster has 12 processors and 96 GB of RAM. The YARN container size is set to 2 GB and the Tez container size is 3 GB.
You configure one Tez container per processor.
You are performing map joins between a 2-GB dimension table and a 96-GB fact table.
You experience slow performance due to inadequate utilization of the available resources.
You need to ensure that the map joins are used.
Which two settings should you configure? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
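For context, automatic conversion of a join to a map join in Hive is governed by settings such as the following. This is an illustrative sketch, not the exam's answer; the threshold value shown is an assumption and must be large enough to hold the dimension table in memory.

```sql
-- Enable automatic conversion of common joins to map joins (illustrative).
SET hive.auto.convert.join = true;
SET hive.auto.convert.join.noconditionaltask = true;
-- Combined size threshold (in bytes) for tables eligible to be map-joined;
-- 2147483648 (2 GB) is an illustrative value sized to the dimension table.
SET hive.auto.convert.join.noconditionaltask.size = 2147483648;
```

In practice the Tez container size and heap settings must also leave enough room for the hash table built from the small table, which is why container sizing appears alongside these options.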

Question 2

Note: This question is part of a series of questions that use the same or similar answer choices. An answer choice may be correct for more than one question in the series. Each question is independent of the other questions in this series. Information and details provided in a question apply only to that question.
You need to deploy a NoSQL database to an HDInsight cluster. You will manage the servers that host the database by using Remote Desktop. The database must use the key/value pair format in a columnar model.
What should you do?

Question 3

You have an Apache Hadoop cluster in Azure HDInsight that has a head node and three data nodes. You have a MapReduce job.
You receive a notification that a data node failed.
You need to identify which component caused the failure.
Which tool should you use?

Question 4

Note: This question is part of a series of questions that use the same scenario. For your convenience, the scenario is repeated in each question. Each question presents a different goal and answer choices, but the text of the scenario is exactly the same in each question in this series.
You are planning a big data infrastructure by using an Apache Spark cluster in Azure HDInsight. The cluster has 24 processor cores and 512 GB of memory.
The architecture of the infrastructure is shown in the exhibit. (Click the Exhibit button.)

The architecture will be used by the following users:
Support analysts who run applications that will use REST to submit Spark jobs.
Business analysts who use JDBC and ODBC client applications against a real-time view. The business analysts run monitoring queries that access aggregated results for 15 minutes. The results will be referenced by subsequent queries.
Data analysts who publish notebooks drawn from batch layer, serving layer, and speed layer queries. All of the notebooks must support native interpreters for data sources that are batch processed. The serving layer queries are written in Apache Hive and must support multiple sessions. Unique GUIDs are used across the data sources, which allow the data analysts to use Spark SQL.
The data sources in the batch layer share a common storage container. The following data sources are used:
Hive for sales data
Apache HBase for operations data
HBase for logistics data by using a single region server
You need to ensure that the support analysts can develop embedded analytics applications by using the least amount of development effort.
Which technology should you implement?
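For context, REST-based Spark job submission on HDInsight goes through Apache Livy, which the cluster exposes at `https://<cluster>.azurehdinsight.net/livy`. The sketch below shows the shape of a batch submission; the cluster name, credentials, jar path, and class name are placeholders, not values from the question.

```shell
# Sketch of submitting a Spark job over REST via Livy.
CLUSTER="mycluster"   # assumption: your HDInsight cluster name
PAYLOAD='{"file": "wasbs:///example/jars/spark-job.jar", "className": "com.example.SparkJob"}'

# Uncomment to actually submit (requires network access and cluster credentials):
# curl -k -u admin -H "Content-Type: application/json" \
#      -d "$PAYLOAD" "https://$CLUSTER.azurehdinsight.net/livy/batches"

echo "$PAYLOAD"
```

A successful submission returns a JSON batch descriptor whose `id` can be polled at `/livy/batches/<id>` to track job state.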

Question 5

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Apache Pig table named Sales in Apache HCatalog.
You need to make the data in the table accessible from Apache Pig.
Solution: You use the following script.

Does this meet the goal?
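For reference, reading an HCatalog-registered table from Pig is typically done with HCatLoader. The minimal sketch below is illustrative only; it is not the script from the question's exhibit, and the `DUMP` statement is just for inspection.

```pig
-- Start Pig with HCatalog support: pig -useHCatalog
-- Load the HCatalog-registered table into a Pig relation (illustrative).
Sales = LOAD 'Sales' USING org.apache.hive.hcatalog.pig.HCatLoader();
DUMP Sales;
```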
