2026 New 70-776 Exam Dumps with PDF and VCE Free: https://www.2passeasy.com/dumps/70-776/
We provide valid Microsoft 70-776 VCE files, which are the best for clearing the 70-776 test and getting certified in Microsoft Perform Big Data Engineering on Microsoft Cloud Services (beta). The 70-776 Questions & Answers cover all the knowledge points of the real 70-776 exam. Crack your Microsoft 70-776 exam with the latest dumps, guaranteed!
NEW QUESTION 1
You have a Microsoft Azure SQL data warehouse. You have an Azure Data Lake Store that contains data from ORC, RC, Parquet, and delimited text files.
You need to load the data to the data warehouse in the least amount of time possible. Which two actions should you perform? Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.
- A. Use Microsoft SQL Server Integration Services (SSIS) to enumerate from the Data Lake Store by using a for loop.
- B. Use AzCopy to export the files from the Data Lake Store to Azure Blob storage.
- C. For each file in the loop, export the data to Parallel Data Warehouse by using a Microsoft SQL Server Native Client destination.
- D. Load the data by executing the CREATE TABLE AS SELECT statement.
- E. Use bcp to import the files.
- F. In the data warehouse, configure external tables and external file formats that correspond to the Data Lake Store.
Answer: DF
Explanation:
References:
https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-load-from-azure-data-lake-store
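The two correct actions can be sketched in T-SQL. This is a minimal, hypothetical PolyBase sketch (all object names, the credential identity format, and the sample columns are assumptions, not part of the question): first configure an external data source, file format, and external table over the Data Lake Store (F), then load in parallel with CREATE TABLE AS SELECT (D).

```sql
-- Hypothetical names throughout; assumes a database master key already exists.
CREATE DATABASE SCOPED CREDENTIAL AdlsCredential
WITH IDENTITY = '<application-id>@https://login.microsoftonline.com/<tenant-id>/oauth2/token',
     SECRET = '<application-key>';

CREATE EXTERNAL DATA SOURCE AzureDataLakeStore
WITH (TYPE = HADOOP,
      LOCATION = 'adl://<account>.azuredatalakestore.net',
      CREDENTIAL = AdlsCredential);

CREATE EXTERNAL FILE FORMAT TextFileFormat
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = '|'));

-- F: external table maps to the files without moving them.
CREATE EXTERNAL TABLE dbo.SalesExternal
( SaleId INT, Amount DECIMAL(18, 2) )
WITH (LOCATION = '/sales/',
      DATA_SOURCE = AzureDataLakeStore,
      FILE_FORMAT = TextFileFormat);

-- D: CREATE TABLE AS SELECT loads the data in parallel across distributions.
CREATE TABLE dbo.Sales
WITH (DISTRIBUTION = HASH(SaleId))
AS SELECT * FROM dbo.SalesExternal;
```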
NEW QUESTION 2
DRAG DROP
You are designing a Microsoft Azure analytics solution. The solution requires that data be copied from Azure Blob storage to Azure Data Lake Store.
The data will be copied on a recurring schedule. Occasionally, the data will be copied manually. You need to recommend a solution to copy the data.
Which tools should you include in the recommendation? To answer, drag the appropriate tools to the correct requirements. Each tool may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
- A. Mastered
- B. Not Mastered
Answer: A
Explanation: 
NEW QUESTION 3
Note: This question is part of a series of questions that present the same scenario. Each question in
the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have a table named Table1 that contains 3 billion rows. Table1 contains data from the last 36 months.
At the end of every month, the oldest month of data is removed based on a column named DateTime.
You need to minimize how long it takes to remove the oldest month of data.
Solution: You specify DateTime as the hash distribution column.
Does this meet the goal?
- A. Yes
- B. No
Answer: B
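Hash-distributing on DateTime does not speed up the monthly delete (and a low-cardinality date column makes a poor hash key). The fast path is typically to partition on DateTime by month and switch the oldest partition out as a metadata-only operation. A hypothetical sketch (table definitions and boundary values are assumptions):

```sql
-- Partition Table1 by month so the oldest month can be aged out with a switch.
CREATE TABLE dbo.Table1
( Id BIGINT, [DateTime] DATE )
WITH (DISTRIBUTION = HASH(Id),
      PARTITION ([DateTime] RANGE RIGHT FOR VALUES
        ('2016-01-01', '2016-02-01', '2016-03-01' /* ...one boundary per month... */)));

-- An empty table with a matching definition receives the old partition.
CREATE TABLE dbo.Table1_Old
( Id BIGINT, [DateTime] DATE )
WITH (DISTRIBUTION = HASH(Id),
      PARTITION ([DateTime] RANGE RIGHT FOR VALUES
        ('2016-01-01', '2016-02-01', '2016-03-01')));

-- Metadata-only switch, then drop: far faster than DELETEing the rows.
ALTER TABLE dbo.Table1 SWITCH PARTITION 1 TO dbo.Table1_Old PARTITION 1;
DROP TABLE dbo.Table1_Old;
```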
NEW QUESTION 4
DRAG DROP
You have an Apache Hive database in a Microsoft Azure HDInsight cluster. You create an Azure Data Factory named DF1.
You need to transform the data in the Hive database and to output the data to Azure Blob storage. Which three cmdlets should you run in sequence? To answer, move the appropriate cmdlets from the list of cmdlets to the answer area and arrange them in the correct order.
- A. Mastered
- B. Not Mastered
Answer: A
Explanation:
References:
https://docs.microsoft.com/en-us/powershell/module/azurerm.datafactories/new-azurermdatafactorypipeline?view=azurermps-4.4.0
https://github.com/aelij/azure-content/blob/master/articles/data-factory/data-factory-build-your-first-pipeline-using-powershell.md
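The cmdlet list itself is not reproduced above, but the referenced articles suggest the typical AzureRM v1 Data Factory order: create the linked services, then the datasets, then the pipeline. A hedged sketch (resource group, factory, and JSON file names are all hypothetical):

```powershell
# Hypothetical names and definition files; sketch of the typical ADF v1 cmdlet order.
$df = Get-AzureRmDataFactory -ResourceGroupName "RG1" -Name "DF1"

New-AzureRmDataFactoryLinkedService -DataFactory $df -File ".\HDInsightHiveLinkedService.json"
New-AzureRmDataFactoryDataset       -DataFactory $df -File ".\BlobOutputDataset.json"
New-AzureRmDataFactoryPipeline      -DataFactory $df -File ".\HiveTransformPipeline.json"
```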
NEW QUESTION 5
You plan to use Microsoft Azure Event Hubs to ingest sensor data. You plan to use Azure Stream Analytics to analyze the data in real time and to send the output directly to Azure Data Lake Store.
You need to write events to the Data Lake Store in batches. What should you use?
- A. Apache Storm in Azure HDInsight
- B. Stream Analytics
- C. Microsoft SQL Server Integration Services (SSIS)
- D. the Azure CLI
Answer: B
Explanation:
References:
https://docs.microsoft.com/en-us/azure/data-lake-store/data-lake-store-data-scenarios
NEW QUESTION 6
DRAG DROP
You have an on-premises Microsoft SQL Server instance named Instance1 that contains a database named DB1.
You have a Data Management Gateway named Gateway1.
You plan to create a linked service in Azure Data Factory for DB1.
You need to connect to DB1 by using standard SQL Server Authentication. You must use a username of User1 and a password of P@$$w0rd89.
How should you complete the JSON code? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
- A. Mastered
- B. Not Mastered
Answer: A
Explanation:
References:
https://github.com/uglide/azure-content/blob/master/articles/data-factory/data-factory-move-data-between-onprem-and-cloud.md
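A plausible completed JSON, following the Data Factory v1 on-premises SQL Server linked service shape from the referenced article. The linked service name is an assumption; the instance, database, gateway, and credentials come from the question (standard SQL Server Authentication means Integrated Security is off):

```json
{
  "name": "SqlServerLinkedService",
  "properties": {
    "type": "OnPremisesSqlServer",
    "typeProperties": {
      "connectionString": "Data Source=Instance1;Initial Catalog=DB1;Integrated Security=False;User ID=User1;Password=P@$$w0rd89;",
      "gatewayName": "Gateway1"
    }
  }
}
```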
NEW QUESTION 7
You have an on-premises data warehouse that uses Microsoft SQL Server 2016. All the data in the data warehouse comes from text files stored in Azure Blob storage. The text files are imported into the data warehouse by using SQL Server Integration Services (SSIS). The text files are not transformed.
You need to migrate the data to an Azure SQL data warehouse in the least amount of time possible. Which two actions should you perform? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.
- A. Use SSIS to upload the files in Azure Blob storage to tables in the Azure SQL data warehouse.
- B. Execute the CREATE EXTERNAL TABLE AS SELECT statement to export the data.
- C. Use AzCopy to transfer the data from the on-premises data warehouse to Azure SQL data warehouse.
- D. Execute the CREATE TABLE AS SELECT statement to load the data.
- E. Define external tables in the Azure SQL data warehouse that map to the existing files in Azure Blob storage.
Answer: DE
Explanation:
References:
https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-load-from-azure-blob-storage-with-polybase
NEW QUESTION 8
You use Microsoft Azure Data Lake Store as the default storage for an Azure HDInsight cluster.
You establish an SSH connection to the HDInsight cluster.
You need to copy files from the HDInsight cluster to the Data Lake Store. Which command should you use?
- A. AzCopy
- B. hdfs dfs
- C. hadoop fs
- D. AdlCopy
Answer: D
NEW QUESTION 9
DRAG DROP
You are building a data pipeline that uses Microsoft Azure Stream Analytics.
Alerts are generated when the aggregate of data streaming in from devices during a minute-long window matches the values in a rule.
You need to retrieve the following information:
*The event ID
*The device ID
*The application ID that runs the service
Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.
- A. Mastered
- B. Not Mastered
Answer: A
Explanation: 
NEW QUESTION 10
DRAG DROP
You need to create a linked service in Microsoft Azure Data Factory. The linked service must use an Azure Database for MySQL table named Customers.
How should you complete the JSON snippet? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
- A. Mastered
- B. Not Mastered
Answer: A
Explanation: 
NEW QUESTION 11
You are using the Microsoft Azure Stream Analytics query language. You are outputting data from an input click stream.
You need to ensure that when you consecutively receive two rows from the same IP address within one minute, only the first row is output.
Which functions should you use in the WHERE statement?
- A. Last and HoppingWindow
- B. Last and SlidingWindow
- C. LAG and HoppingWindow
- D. LAG and Duration
Answer: B
NEW QUESTION 12
Note: This question is part of a series of questions that present the same scenario. For your convenience, the scenario is repeated in each question. Each question presents a different goal and answer choices, but the text of the scenario is exactly the same in each question in this series.
Start of repeated scenario
You are migrating an existing on-premises data warehouse named LocalDW to Microsoft Azure. You will use an Azure SQL data warehouse named AzureDW for data storage and an Azure Data Factory named AzureDF for extract, transformation, and load (ETL) functions.
For each table in LocalDW, you create a table in AzureDW.
On the on-premises network, you have a Data Management Gateway.
Some source data is stored in Azure Blob storage. Some source data is stored on an on-premises Microsoft SQL Server instance. The instance has a table named Table1.
After data is processed by using AzureDF, the data must be archived and accessible forever. The archived data must meet a Service Level Agreement (SLA) for availability of 99 percent. If an Azure region fails, the archived data must be available for reading always. The storage solution for the archived data must minimize costs.
End of repeated scenario.
You need to define the schema of Table1 in AzureDF. What should you create?
- A. a gateway
- B. a linked service
- C. a dataset
- D. a pipeline
Answer: C
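In Data Factory v1, a dataset is what carries the table's schema (its `structure`) over a linked service. A hedged sketch of what such a dataset for Table1 might look like; the dataset name, linked service name, columns, and availability window are all hypothetical:

```json
{
  "name": "Table1Dataset",
  "properties": {
    "type": "SqlServerTable",
    "linkedServiceName": "OnPremSqlLinkedService",
    "structure": [
      { "name": "OrderId", "type": "Int64" },
      { "name": "OrderDate", "type": "Datetime" }
    ],
    "typeProperties": { "tableName": "Table1" },
    "availability": { "frequency": "Day", "interval": 1 }
  }
}
```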
NEW QUESTION 13
HOTSPOT
Note: This question is part of a series of questions that present the same scenario. For your convenience, the scenario is repeated in each question. Each question presents a different goal and answer choices, but the text of the scenario is exactly the same in each question in this series.
Start of repeated scenario
You are migrating an existing on-premises data warehouse named LocalDW to Microsoft Azure. You will use an Azure SQL data warehouse named AzureDW for data storage and an Azure Data Factory named AzureDF for extract, transformation, and load (ETL) functions.
For each table in LocalDW, you create a table in AzureDW.
On the on-premises network, you have a Data Management Gateway.
Some source data is stored in Azure Blob storage. Some source data is stored on an on-premises Microsoft SQL Server instance. The instance has a table named Table1.
After data is processed by using AzureDF, the data must be archived and accessible forever. The archived data must meet a Service Level Agreement (SLA) for availability of 99 percent. If an Azure region fails, the archived data must be available for reading always. The storage solution for the archived data must minimize costs.
End of repeated scenario.
How should you configure the storage to archive the source data? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
- A. Mastered
- B. Not Mastered
Answer: A
Explanation:
References:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-storage-tiers
NEW QUESTION 14
You have a Microsoft Azure SQL data warehouse that has a fact table named FactOrder. FactOrder contains three columns named CustomerId, OrderId, and OrderDateKey. FactOrder is hash distributed on CustomerId. OrderId is the unique identifier for FactOrder. FactOrder contains 3 million rows.
Orders are distributed evenly among different customers from a table named dimCustomers that contains 2 million rows.
You often run queries that join FactOrder and dimCustomers by selecting and grouping by the OrderDateKey column.
You add 7 million rows to FactOrder. Most of the new records have a more recent OrderDateKey value than the previous records.
You need to reduce the execution time of queries that group on OrderDateKey and that join dimCustomers and FactOrder.
What should you do?
- A. Change the distribution for the FactOrder table to round robin.
- B. Update the statistics for the OrderDateKey column.
- C. Change the distribution for the FactOrder table to be based on OrderId.
- D. Change the distribution for the dimCustomers table to OrderDateKey.
Answer: B
Explanation:
References:
https://docs.microsoft.com/en-us/azure/sql-data-warehouse/sql-data-warehouse-tables-statistics
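Azure SQL Data Warehouse does not auto-update statistics after a load, so after adding 7 million mostly newer OrderDateKey values the optimizer is working from stale histograms. A minimal sketch (the statistics object name is an assumption):

```sql
-- Create the statistics object if one does not exist yet...
CREATE STATISTICS stat_OrderDateKey ON dbo.FactOrder (OrderDateKey);

-- ...and refresh after each large load so the optimizer sees the new date range.
UPDATE STATISTICS dbo.FactOrder;
```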
NEW QUESTION 15
You plan to capture the output from a group of 500 IoT devices that produce approximately 10 GB of
data per hour by using Microsoft Azure Stream Analytics. The data will be retained for one year.
Once the data is processed, it will be stored in Azure, and then analyzed by using an Azure HDInsight cluster.
You need to select where to store the output data from Stream Analytics. The solution must minimize costs.
What should you select?
- A. Azure Table Storage
- B. Azure SQL Database
- C. Azure Blob storage
- D. Azure SQL Data Warehouse
Answer: C
NEW QUESTION 16
You have a Microsoft Azure Data Lake Analytics service. You plan to configure diagnostic logging.
You need to use Microsoft Operations Management Suite (OMS) to monitor the IP addresses that are used to access the Data Lake Store.
What should you do?
- A. Stream the request logs to an event hub.
- B. Send the audit logs to Log Analytics.
- C. Send the request logs to Log Analytics.
- D. Stream the audit logs to an event hub.
Answer: B
Explanation:
References:
https://docs.microsoft.com/en-us/azure/data-lake-analytics/data-lake-analytics-diagnostic-logs
https://docs.microsoft.com/en-us/azure/security/azure-log-audit
NEW QUESTION 17
DRAG DROP
You have a Microsoft Azure SQL data warehouse.
Users discover that reports running in the data warehouse take longer than expected to complete.
You need to review the duration of the queries and which users are currently running the queries.
Which dynamic management view should you review for each requirement? To answer, drag the appropriate dynamic management views to the correct requirements. Each dynamic management view may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
- A. Mastered
- B. Not Mastered
Answer: A
Explanation:
References:
https://docs.microsoft.com/en-us/sql/relational-databases/system-dynamic-management-views/sys-dm-pdw-exec-requests-transact-sql
https://docs.microsoft.com/en-us/sql/relational-databases/system-dynamic-management-views/sys-dm-pdw-exec-sessions-transact-sql
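The two referenced DMVs are typically joined together: sys.dm_pdw_exec_requests for query durations and sys.dm_pdw_exec_sessions for who is running them. A sketch, with the column list trimmed to the essentials:

```sql
-- Longest-running requests first, joined to the session's login.
SELECT r.request_id,
       r.[status],
       r.total_elapsed_time,
       r.command,
       s.login_name
FROM sys.dm_pdw_exec_requests AS r
JOIN sys.dm_pdw_exec_sessions AS s
  ON r.session_id = s.session_id
ORDER BY r.total_elapsed_time DESC;
```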
NEW QUESTION 18
DRAG DROP
You have a Microsoft Azure SQL data warehouse.
You plan to reference data from Azure Blob storage. The data is stored in the GZIP compressed format. The blob storage requires authentication.
You create a master key for the data warehouse and a database schema.
You need to reference the data without importing the data to the data warehouse.
Which four statements should you execute in sequence? To answer, move the appropriate statements from the list of statements to the answer area and arrange them in the correct order.
- A. Mastered
- B. Not Mastered
Answer: A
Explanation: 
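The statement list is not reproduced above, but a plausible four-statement sequence follows this shape (object names, container, account, and sample columns are hypothetical; the master key and schema already exist per the question): credential, data source, GZIP file format, external table.

```sql
-- 1. Credential for the authenticated blob storage account.
CREATE DATABASE SCOPED CREDENTIAL BlobStorageCredential
WITH IDENTITY = 'blobuser', SECRET = '<storage-account-key>';

-- 2. External data source pointing at the container.
CREATE EXTERNAL DATA SOURCE AzureBlobStore
WITH (TYPE = HADOOP,
      LOCATION = 'wasbs://data@<account>.blob.core.windows.net',
      CREDENTIAL = BlobStorageCredential);

-- 3. File format declaring the GZIP compression codec.
CREATE EXTERNAL FILE FORMAT GzipDelimitedText
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ','),
      DATA_COMPRESSION = 'org.apache.hadoop.io.compress.GzipCodec');

-- 4. External table: the data is referenced in place, never imported.
CREATE EXTERNAL TABLE ext.Sales
( SaleId INT, Amount DECIMAL(18, 2) )
WITH (LOCATION = '/sales/',
      DATA_SOURCE = AzureBlobStore,
      FILE_FORMAT = GzipDelimitedText);
```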
NEW QUESTION 19
HOTSPOT
You need to create a Microsoft Azure SQL data warehouse named dw1 that supports up to 10 TB of data. How should you complete the statement? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.
- A. Mastered
- B. Not Mastered
Answer: A
Explanation: 
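The completed statement presumably follows the CREATE DATABASE syntax for SQL Data Warehouse; a hedged sketch in which the service objective (DW400) is an assumption, while MAXSIZE = 10240 GB covers the stated 10 TB requirement:

```sql
CREATE DATABASE dw1
( MAXSIZE = 10240 GB,
  EDITION = 'datawarehouse',
  SERVICE_OBJECTIVE = 'DW400' );
```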
NEW QUESTION 20
You plan to use Microsoft Azure Data Factory to copy data daily from an Azure SQL data warehouse to an Azure Data Lake Store.
You need to define a linked service for the Data Lake Store. The solution must prevent the access token from expiring.
Which type of authentication should you use?
- A. OAuth
- B. service-to-service
- C. Basic
- D. service principal
Answer: D
Explanation:
References:
https://docs.microsoft.com/en-gb/azure/data-factory/v1/data-factory-azure-datalake-connector#azure-data-lake-store-linked-service-properties
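Unlike OAuth user tokens, a service principal never prompts for re-authorization, which is why it suits a daily scheduled copy. A hedged sketch of the Data Factory v1 linked service shape from the referenced page (every placeholder value is hypothetical):

```json
{
  "name": "AzureDataLakeStoreLinkedService",
  "properties": {
    "type": "AzureDataLakeStore",
    "typeProperties": {
      "dataLakeStoreUri": "https://<accountname>.azuredatalakestore.net/webhdfs/v1",
      "servicePrincipalId": "<application-id>",
      "servicePrincipalKey": "<application-key>",
      "tenant": "<tenant-id>",
      "subscriptionId": "<subscription-id>",
      "resourceGroupName": "<resource-group>"
    }
  }
}
```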
NEW QUESTION 21
You have a file in a Microsoft Azure Data Lake Store that contains sales data. The file contains sales amounts by salesperson, by city, and by state.
You need to use U-SQL to calculate the percentage of sales that each city has for its respective state. Which code should you use?

- A. Option A
- B. Option B
- C. Option C
- D. Option D
Answer: A
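The four option images are not reproduced above, but the pattern being tested can be sketched in U-SQL: aggregate sales per city, then divide by a windowed SUM partitioned by state. The input path, schema, and output path below are all hypothetical:

```sql
// Hypothetical input path and schema; sketch of the per-state percentage pattern.
@sales =
    EXTRACT SalesPerson string, City string, State string, Amount decimal
    FROM "/input/sales.csv"
    USING Extractors.Csv(skipFirstNRows: 1);

@cityTotals =
    SELECT State, City, SUM(Amount) AS CitySales
    FROM @sales
    GROUP BY State, City;

@result =
    SELECT State, City,
           CitySales / SUM(CitySales) OVER (PARTITION BY State) * 100 AS PctOfStateSales
    FROM @cityTotals;

OUTPUT @result TO "/output/salespct.csv" USING Outputters.Csv(outputHeader: true);
```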
NEW QUESTION 22
......
P.S. Easily pass 70-776 Exam with 91 Q&As 2passeasy Dumps & pdf Version, Welcome to Download the Newest 2passeasy 70-776 Dumps: https://www.2passeasy.com/dumps/70-776/ (91 New Questions)