DP 203 Demo

Uploaded by sai

Questions & Answers PDF Page 1

Version: 21.0

Question: 1

DRAG DROP

You need to ensure that the Twitter feed data can be analyzed in the dedicated SQL pool.
The solution must meet the customer sentiment analytics requirements.
Which three Transact-SQL DDL commands should you run in sequence? To answer, move
the appropriate commands from the list of commands to the answer area and arrange them in
the correct order.
NOTE: More than one order of answer choices is correct. You will receive credit for any of the
correct orders you select.

Answer:
Explanation:

Scenario: Allow Contoso users to use PolyBase in an Azure Synapse Analytics dedicated SQL pool to
query the content of the data records that host the Twitter feeds. Data must be protected by using
row-level security (RLS). The users must be authenticated by using their own Azure AD credentials.

Box 1: CREATE EXTERNAL DATA SOURCE


External data sources are used to connect to storage accounts.

Box 2: CREATE EXTERNAL FILE FORMAT


CREATE EXTERNAL FILE FORMAT creates an external file format object that defines external data
stored in Azure Blob Storage or Azure Data Lake Storage. Creating an external file format is
a prerequisite for creating an external table.

Box 3: CREATE EXTERNAL TABLE AS SELECT

When used in conjunction with the CREATE TABLE AS SELECT statement, selecting from an
external table imports data into a table within the SQL pool. In addition to the COPY statement,
external tables are useful for loading data.
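A minimal T-SQL sketch of the sequence, assuming hypothetical object names, a hypothetical storage URI, and illustrative columns (none of these appear in the case study). The final step shows the load pattern described above, where a CREATE TABLE AS SELECT over the external table imports the data into the dedicated SQL pool:

```sql
-- 1. External data source: connects the pool to the storage account.
--    TYPE = HADOOP enables PolyBase access in a dedicated SQL pool;
--    with no CREDENTIAL, users connect with their own Azure AD identities.
CREATE EXTERNAL DATA SOURCE TwitterStorage
WITH (
    TYPE = HADOOP,
    LOCATION = 'abfss://[email protected]'
);

-- 2. External file format: describes the layout of the feed files
--    (delimited text with a header row is assumed here).
CREATE EXTERNAL FILE FORMAT TwitterCsvFormat
WITH (
    FORMAT_TYPE = DELIMITEDTEXT,
    FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 2)
);

-- 3. External table over the files, then CTAS to import the rows
--    into the pool, where row-level security can be applied.
CREATE EXTERNAL TABLE dbo.TwitterFeedsExt
(
    TweetId   BIGINT,
    PostedOn  DATETIME2,
    Content   NVARCHAR(4000)
)
WITH (
    LOCATION = '/feeds/',
    DATA_SOURCE = TwitterStorage,
    FILE_FORMAT = TwitterCsvFormat
);

CREATE TABLE dbo.TwitterFeeds
WITH (DISTRIBUTION = ROUND_ROBIN)
AS SELECT * FROM dbo.TwitterFeedsExt;
```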

Reference:
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/develop-tables-external-tables

Question: 2
HOTSPOT
You need to design a data storage structure for the product sales transactions. The solution
must meet the sales transaction dataset requirements.
What should you include in the solution? To answer, select the appropriate options in the
answer area.
NOTE: Each correct selection is worth one point.

Answer:
Explanation:

Box 1: Hash
Scenario:
Ensure that queries joining and filtering sales transaction records based on product ID complete
as quickly as possible.

A hash distributed table can deliver the highest query performance for joins and aggregations on
large tables.
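A sketch of a hash-distributed table; the table and column names are hypothetical. A hash-distributed table typically uses the column most often joined and filtered on as its distribution key, so that matching rows land in the same distribution and data movement is minimized:

```sql
-- Hypothetical fact table; DISTRIBUTION = HASH(ProductId) co-locates rows
-- with the same product ID, so joins and filters on product ID avoid
-- cross-distribution data movement.
CREATE TABLE dbo.FactSalesTransactions
(
    TransactionId BIGINT         NOT NULL,
    ProductId     INT            NOT NULL,
    SalesDate     DATE           NOT NULL,
    Amount        DECIMAL(18, 2) NOT NULL
)
WITH
(
    DISTRIBUTION = HASH (ProductId),
    CLUSTERED COLUMNSTORE INDEX   -- default storage for large fact tables
);
```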

Box 2: Set the distribution column to product ID.

Scenario: Ensure that queries joining and filtering sales transaction records based on product ID
complete as quickly as possible.

Hash-distributing the table on the product ID column co-locates rows that are joined and filtered
on product ID, which minimizes data movement between distributions. The sales date is used as the
partition column to support efficient monthly loads; it is not the distribution column.

Reference:
https://rajanieshkaushikk.com/2020/09/09/how-to-choose-right-data-distribution-strategy-for-azure-synapse/

Question: 3
HOTSPOT
You need to design the partitions for the product sales transactions. The solution must meet the
sales transaction dataset requirements.
What should you include in the solution? To answer, select the appropriate options in the
answer area.
NOTE: Each correct selection is worth one point.

Answer:
Explanation:


Box 1: Sales date


Scenario: Contoso requirements for data integration include:
Partition data that contains sales transaction records. Partitions must be designed to provide
efficient loads by month. Boundary values must belong to the partition on the right.

Box 2: An Azure Synapse Analytics Dedicated SQL pool


Scenario: Contoso requirements for data integration include:
Ensure that data storage costs and performance are predictable.

The size of a dedicated SQL pool (formerly SQL DW) is determined by Data Warehousing
Units (DWU).
Dedicated SQL pool (formerly SQL DW) stores data in relational tables with columnar storage.
This format significantly reduces data storage costs and improves query performance.

Reference:
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/sql-data-warehouse-overview-what-is

Question: 4
HOTSPOT
You need to implement an Azure Synapse Analytics database object for storing the sales
transaction data. The solution must meet the sales transaction dataset requirements.
What should you do? To answer, select the appropriate options in the answer area.
NOTE: Each correct selection is worth one point.


Answer:
Explanation:

Box 1: Create table


Scenario: Load the sales transaction dataset to Azure Synapse Analytics

Box 2: RANGE RIGHT FOR VALUES


Scenario: Partition data that contains sales transaction records. Partitions must be designed to
provide efficient loads by month. Boundary values must belong to the partition on the right.

RANGE RIGHT: Specifies the boundary value belongs to the partition on the right (higher values).
FOR VALUES ( boundary_value [,...n] ): Specifies the boundary values for the partition.
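A sketch combining the two selections; the table, columns, and boundary dates are hypothetical. With RANGE RIGHT, each boundary value belongs to the partition on its right, so a first-of-month boundary starts that month's partition, which matches the monthly-load requirement:

```sql
-- Hypothetical monthly boundaries; with RANGE RIGHT, '2024-02-01' belongs
-- to the February partition (the partition to the right of the boundary).
CREATE TABLE dbo.FactSalesTransactions
(
    TransactionId BIGINT NOT NULL,
    ProductId     INT    NOT NULL,
    SalesDate     DATE   NOT NULL
)
WITH
(
    DISTRIBUTION = HASH (ProductId),
    CLUSTERED COLUMNSTORE INDEX,
    PARTITION ( SalesDate RANGE RIGHT FOR VALUES
                ('2024-01-01', '2024-02-01', '2024-03-01') )
);
```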

Scenario: Load the sales transaction dataset to Azure Synapse Analytics.


Contoso identifies the following requirements for the sales transaction dataset:

Partition data that contains sales transaction records. Partitions must be designed to provide
efficient loads by month. Boundary values must belong to the partition on the right.
Ensure that queries joining and filtering sales transaction records based on product ID complete
as quickly as possible.
Implement a surrogate key to account for changes to the retail store addresses.
Ensure that data storage costs and performance are predictable.
Minimize how long it takes to remove old records.
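The last requirement, minimizing how long it takes to remove old records, is what the monthly partitioning enables: an old partition can be switched out as a metadata-only operation instead of deleting rows one by one. A sketch under the assumption of a hypothetical archive table with identical schema and partitioning, and a hypothetical partition number:

```sql
-- Hypothetical names and partition number. SWITCH moves the oldest month's
-- partition out of the fact table as a metadata-only operation; the target
-- table must have an identical schema and partition scheme.
ALTER TABLE dbo.FactSalesTransactions
SWITCH PARTITION 2 TO dbo.FactSalesTransactions_Old PARTITION 2;

-- The switched-out data can then be dropped in bulk.
DROP TABLE dbo.FactSalesTransactions_Old;
```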

Reference:
https://docs.microsoft.com/en-us/sql/t-sql/statements/create-table-azure-sql-data-warehouse

Question: 5


You need to integrate the on-premises data sources and Azure Synapse Analytics. The solution
must meet the data integration requirements.
Which type of integration runtime should you use?

A. Azure-SSIS integration runtime


B. self-hosted integration runtime
C. Azure integration runtime

Answer: B
Explanation:

A self-hosted integration runtime is required to reach data stores inside an on-premises
(private) network. The Azure integration runtime can only connect to data stores with publicly
accessible endpoints, and the Azure-SSIS integration runtime exists to lift and shift SSIS
packages, so neither can integrate the on-premises data sources.

www.dumpsplanet.com
