ETL Testing Multiple-Choice Questions (MCQs)
ETL tools extract data from all the different data sources, transform it, and load it into a data warehouse.
ETL Testing MCQs: This section contains multiple-choice questions and answers on various topics of ETL Testing. Practice these MCQs to test and enhance your skills in ETL Testing.
List of ETL Testing MCQs
1. Using an __ tool, data is extracted from multiple data sources, transformed, and
loaded into a data warehouse after joining fields, calculating, and removing incorrect
data fields.
A. ETL
B. TEL
C. LET
D. LTE
Answer: A) ETL
Explanation:
Using an ETL tool, data is extracted from multiple data sources, transformed, and loaded into a
data warehouse after joining fields, calculating, and removing incorrect data fields.
2. After business __, ETL testing ensures that the data has been loaded accurately from a
source to a destination.
A. Information
B. Transformation
C. Transfusion
D. Transfiction
Answer: B) Transformation
Explanation:
After business transformation, ETL testing ensures that the data has been loaded accurately
from a source to a destination.
3. During ETL, various stages of data are verified and used at__.
A. Source
B. Destination
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
During ETL, various stages of data are verified and used both at source and destination.
4. What is the full form of ETL?
A. Extract Transformation and Load
B. Extract Transformation and Lead
C. Extract Transfusion and Load
D. Extract Transfusion and Lead
Answer: A) Extract Transformation and Load
Explanation:
The full form of ETL is Extract Transformation and Load.
5. To fetch data from one database and place it in another, ETL combines all __ database
functions into one tool.
A. Two
B. Three
C. Four
D. Five
Answer: B) Three
Explanation:
To fetch data from one database and place it in another, ETL combines all three database
functions into one tool.
6. Getting information from a database is called __ (reading).
A. Extracting
B. Transforming
C. Loading
D. None
Answer: A) Extracting
Explanation:
Getting information from a database is called extracting (reading).
7. The process of ___ data involves converting it from one form to another.
A. Extracting
B. Transforming
C. Loading
D. None
Answer: B) Transforming
Explanation:
The process of transforming data involves converting it from one form to another.
8. In addition to using __, the data can be combined with other data to undergo
transformation.
A. Rules
B. Lookup tables
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
In addition to using rules or lookup tables, the data can be combined with other data to
undergo transformation.
9. Writing data into a database is called ___.
A. Extracting
B. Transforming
C. Loading
D. None
Answer: C) Loading
Explanation:
Writing data into a database is called loading.
10. Using ETL, you can ___ the data from multiple sources and blend them together
according to your needs.
A. Extract
B. Transform
C. Load
D. All of the above
Answer: D) All of the above
Explanation:
Using ETL, you can extract, transform, and load the data from multiple sources and blend them
together according to your needs.
11. ETL is often used to build a ___.
A. Data Center
B. Data Warehouse
C. Data Set
D. Data Care Center
Answer: B) Data Warehouse
Explanation:
ETL is often used to build a Data Warehouse.
12. As part of the ETL process, data from a source system is___ into a format suitable for
storing in a data warehouse or another system.
A. Extracted
B. Converted
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
As part of the ETL process, data from a source system is extracted and converted into a format
suitable for storing in a data warehouse or another system.
13. In today's environment, ETL is becoming more and more necessary for many reasons,
including:
A. In order to make critical business decisions, companies use ETL to analyze their business
data,
B. Data warehouses are repositories where data is shared.
C. In ETL, data is moved from a variety of sources into a data warehouse.
D. All of the above
Answer: D) All of the above
Explanation:
In today's environment, ETL is becoming more and more necessary for many reasons, including:
i. In order to make critical business decisions, companies use ETL to analyze their business
data.
ii. Data warehouses are repositories where data is shared.
iii. In ETL, data is moved from a variety of sources into a data warehouse.
14. What is TRUE about ETL?
A. The ETL process allows the source and target systems to compare sample data.
B. As part of the ETL process, complex transformations can be performed and additional
storage space is required.
C. Earlier, we defined ETL as a process of converting source data into target data and
manipulating it
D. All of the above
Answer: D) All of the above
Explanation:
The things that are TRUE about ETL are -
i. The ETL process allows the source and target systems to compare sample data
ii. As part of the ETL process, complex transformations can be performed and additional
storage space is required.
iii. Earlier, we defined ETL as a process of converting source data into target data and
manipulating it.
15. A ___ is loaded with data through the ETL process.
A. Data Mart
B. Data Warehouse
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
A data mart or data warehouse is loaded with data through the ETL process.
16. To facilitate business analysis, our data warehouse needs to be __ regularly
A. Extracted
B. Transformed
C. Loaded
D. None
Answer: C) Loaded
Explanation:
To facilitate business analysis, our data warehouse needs to be loaded regularly.
17. To avoid affecting the source system's performance, the__ occurs on the ETL server
or staging area.
A. Loading
B. Extraction
C. Transformation
D. None
Answer: C) Transformation
Explanation:
To avoid affecting the source system's performance, the transformation occurs on the ETL
server or staging area.
18. Before data is moved to the warehouse, the extracted data can be validated in the _
area.
A. Staging
B. Staggering
C. Studying
D. None
Answer: A) Staging
Explanation:
Before data is moved to the warehouse, the extracted data can be validated in the staging area.
19. How many methods are there to extract the data?
A. Two
B. Three
C. Four
D. Five
Answer: B) Three
Explanation:
There are three methods to extract the data.
20. Which of the following is/are the method(s) to extract the data?
A. FULL Extraction
B. Partial Extraction - Without Update Notification
C. Partial Extraction - With Update Notification
D. All of the above
Answer: D) All of the above
Explanation:
The following are the methods to extract the data -
i. FULL Extraction
ii. Partial Extraction - Without Update Notification
iii. Partial Extraction - With Update Notification
21. Whatever extraction method we use, the source system should not be affected in
terms of __ time.
A. Performance
B. Response
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
Whatever extraction method we use, the source system should not be affected in terms of
performance or response time.
22. Which of the following is/are the validation(s) using the extraction(s)?
A. Check the source data against the record
B. Ensure that the data type is correct
C. There will be a check to see if all the keys are there
D. All of the above
Answer: D) All of the above
Explanation:
The following are the validations using the extractions -
i. Check the source data against the record
ii. Ensure that the data type is correct
iii. There will be a check to see if all the keys are there
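The three extraction-time validations above can be scripted in any language. The snippet below is a minimal Python sketch of such checks; the record layout, expected source count, required keys, and type rules are all hypothetical and are not part of the question set.

# Minimal sketch of extraction-time validations (hypothetical data and rules).
extracted_rows = [
    {"customer_id": 1, "name": "Alice", "balance": 120.5},
    {"customer_id": 2, "name": "Bob", "balance": 80.0},
]
source_record_count = 2            # count reported by the source system (assumed)
required_keys = {"customer_id", "name", "balance"}
expected_types = {"customer_id": int, "name": str, "balance": float}

# 1. Check the extracted data against the source record count.
assert len(extracted_rows) == source_record_count, "record count mismatch"

for row in extracted_rows:
    # 2. Ensure that all the keys are there.
    assert required_keys <= row.keys(), f"missing keys in {row}"
    # 3. Ensure that the data type of each column is correct.
    for column, expected in expected_types.items():
        assert isinstance(row[column], expected), f"bad type for {column}"

print("extraction checks passed")

In a real project these assertions would typically be driven by the source schema or a mapping document rather than hard-coded values.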
23. It is not possible to use the extracted data as originally formatted from the __ server.
A. Source
B. Site
C. Storage
D. None
Answer: A) Source
Explanation:
It is not possible to use the extracted data as originally formatted from the source server.
24.____ refers to data that does not need to be transformed.
A. Direct Move
B. Pass-through data
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
Direct move or pass-through data refers to data that does not need to be transformed.
25. Which of the following is/are the validation point(s) during the transformation?
A. Filtering
B. Conversion of character sets and encodings
C. Checking the threshold and validity of data
D. All of the above
Answer: D) All of the above
Explanation:
The following are the validation points during the transformation -
i. Filtering
ii. Conversion of character sets and encodings
iii. Checking the threshold and validity of data
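As an illustration of these transformation-stage validation points, here is a minimal Python sketch. The sample rows, the UTF-8 encoding assumption, and the 0-10,000 amount threshold are hypothetical, not rules taken from the question.

# Illustrative sketch of transformation-stage validations (hypothetical rules).
raw_rows = [
    {"id": "1", "city": b"M\xc3\xbcnchen", "amount": "250"},
    {"id": "2", "city": b"Delhi", "amount": "-10"},
]

clean_rows = []
for row in raw_rows:
    # Threshold/validity check and filtering: keep only rows in the valid range.
    amount = int(row["amount"])
    if not (0 <= amount <= 10_000):
        continue  # row fails the threshold check and is filtered out
    # Character-set conversion: decode the source bytes (assumed UTF-8) into text.
    clean_rows.append({"id": int(row["id"]),
                       "city": row["city"].decode("utf-8"),
                       "amount": amount})

print(clean_rows)   # [{'id': 1, 'city': 'München', 'amount': 250}]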
26. Which of the following is the last step of the ETL process?
A. Extraction
B. Transformation
C. Loading
D. None
Answer: C) Loading
Explanation:
The last step of the ETL process is Loading.
27. Which of the following is/are the type(s) of loading?
A. Initial Load
B. Incremental Load
C. Full Refresh
D. All of the above
Answer: D) All of the above
Explanation:
The following are the types of loading -
i. Initial Load
ii. Incremental Load
iii. Full Refresh
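To make the three load types concrete, here is an illustrative Python sketch against an in-memory SQLite table. The sales table, its rows, and the use of SQLite are hypothetical stand-ins; real warehouse loaders differ.

import sqlite3

# Sketch of the three load types against a toy warehouse table (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, amount REAL)")

def initial_load(rows):
    # Initial load: the table starts empty and is populated for the first time.
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

def incremental_load(rows):
    # Incremental load: only new or changed rows are applied on top of existing data.
    conn.executemany("INSERT OR REPLACE INTO sales VALUES (?, ?)", rows)

def full_refresh(rows):
    # Full refresh: existing content is erased and reloaded with new information.
    conn.execute("DELETE FROM sales")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", rows)

initial_load([(1, 100.0), (2, 200.0)])
incremental_load([(2, 250.0), (3, 300.0)])   # updates id 2, adds id 3
full_refresh([(1, 999.0)])
print(conn.execute("SELECT * FROM sales").fetchall())   # [(1, 999.0)]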
28. Loads need to be __ according to server performance by the admin of the data
warehouse.
A. Monitored
B. Resumed
C. Canceled
D. All of the above
Answer: D) All of the above
Explanation:
Loads need to be monitored, resumed, and canceled according to server performance by the
admin of the data warehouse.
29. With a ___, all tables are erased and reloaded with new information.
A. Initial Load
B. Incremental Load
C. Full Refresh
D. None of the above
Answer: C) Full Refresh
Explanation:
With a Full Refresh, all tables are erased and reloaded with new information.
30. The term ETL is now extended to ____ or Extract, Monitor, Profile, Analyze, Cleanse,
Transform, and Load.
A. E-MPAC-TL
B. E-PAC-TL
C. E-MAP-TL
D. E-MPAA-TL.
Answer: A) E-MPAC-TL
Explanation:
The term is now extended to E-MPAC-TL or Extract, Monitor, Profile, Analyze, Cleanse,
Transform, and Load.
31. During ___, the main goal is to capture data as quickly as possible from a system while
minimizing the inconvenience to the system.
A. Extraction
B. Transformation
C. Loading
D. None
Answer: A) Extraction
Explanation:
During extraction, the main goal is to capture data as quickly as possible from a system while
minimizing the inconvenience to the system.
32.___ the data requires integrating the data and finally presenting the combined data
to the end-user community via the front-end tools.
A. Transforming
B. Loading
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
Transforming and loading the data requires integrating the data and finally presenting the
combined data to the end-user community via the front-end tools.
33. Compared to traditional ETL, ETL ___ the time it takes for sources and targets to
develop.
A. Extends
B. Reduces
C. Exceeds
D. Manipulates
Answer: B) Reduces
Explanation:
Compared to traditional ETL, ETL reduces the time it takes for sources and targets to develop.
34. ___ consists of a process between stages that is defined according to the needs, and it
can verify the quality of the product.
A. Quantity Assurance
B. Quality Assurance
C. Quantity Attribution
D. Quality Attribution
Answer: B) Quality Assurance
Explanation:
Quality Assurance consists of a process between stages that is defined according to the needs,
and it can verify the quality of the product.
35. In__, analysis and validation of the data pattern and formats will be performed, as
well as identification and validation of redundant data across data sources to determine
the actual content, structure, and quality of the data.
A. Data Profiling
B. Data Analysis
C. Source Analysis
D. Cleansing
Answer: A) Data Profiling
Explanation:
In data profiling, analysis and validation of the data pattern and formats will be performed, as
well as identification and validation of redundant data across data sources to determine the
actual content, structure, and quality of the data.
36. It is important to focus on the surroundings of the sources in__ analysis, so that the
documentation of the sources can be obtained.
A. Data
B. Source
C. Profile
D. None
Answer: B) Source
Explanation:
It is important to focus on the surroundings of the sources in source analysis, so that the
documentation of the sources can be obtained.
37. Based on the Metadata of a set of predefined rules, errors found can be fixed in ___.
A. Data Analysis
B. Source Analysis
C. Cleansing
D. Data Profiling
Answer: C) Cleansing
Explanation:
Based on the Metadata of a set of predefined rules, errors found can be fixed in cleansing.
38. An extended ETL concept, E-MPAC-TL is designed to meet the requirements while
taking into account the realities of the systems, __, constraints, and most importantly,
the data itself.
A. Tools
B. Metadata
C. Technical Issues
D. All of the above
Answer: D) All of the above
Explanation:
An extended ETL concept, E-MPAC-TL is designed to meet the requirements while taking into
account the realities of the systems, tools, metadata, technical issues, constraints, and most
importantly, the data itself.
39. ETL testing is also known as ___.
A. Table balancing
B. Product Reconciliation
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
ETL testing is also known as Table balancing or Product Reconciliation.
40. An ETL test ensures the data loaded after transformation is accurate after it has been
___ from a source to a destination.
A. Added
B. Loaded
C. Deleted
D. Altered
Answer: B) Loaded
Explanation:
An ETL test ensures the data loaded after transformation is accurate after it has been loaded
from a source to a destination.
41. ETL testing is performed in ___ stages.
A. Two
B. Three
C. Four
D. Five
Answer: D) Five
Explanation:
ETL testing is performed in five stages.
42. Which of the following is/are the stage(s) of ETL testing?
A. Data recovery
B. Build and populate data
C. Build reports
D. All of the above
Answer: D) All of the above
Explanation:
The following are the stages of the ETL testing -
i. Data recovery
ii. Build and populate data
iii. Build reports
43. Which of the following is/are the type(s) of ETL testing?
A. New Data Warehouse Testing
B. Production Validation Testing
C. Application Upgrade
D. All of the above
Answer: D) All of the above
Explanation:
The following are the types of ETL testing -
i. New Data Warehouse Testing
ii. Production Validation Testing
iii. Application Upgrade
44. Customer requirements and different sources of data are taken into account in ___.
A. New Data Warehouse Testing
B. Production Validation Testing
C. Application Upgrade
D. Metadata Testing
Answer: A) New Data Warehouse Testing
Explanation:
Customer requirements and different sources of data are taken into account in New Data
Warehouse Testing.
45. Which of the following group(s) is/are responsible for testing New Data Warehouses?
A. Business Analyst
B. Infrastructure People
C. QA Testers
D. All of the above
Answer: D) All of the above
Explanation:
The following groups are responsible for testing New Data Warehouses -
i. Business Analyst
ii. Infrastructure People
iii. QA Testers
46. What is the responsibility of a Business Analyst?
A. Requirements are gathered and documented by the business analyst.
B. The test environment is set up by Business Analyst people.
C. These plans and scripts are developed by Business Analysts and then executed by them
D. Each module is tested by a Business Analyst.
Answer: A) Requirements are gathered and documented by the business analyst.
Explanation:
Requirements are gathered and documented by the business analyst.
47. The __ develops test plans and scripts and executes these plans and scripts.
A. Infrastructure People
B. QA Testers
C. Developers
D. Users
Answer: B) QA Testers
Explanation:
The QA tester develops test plans and scripts and executes these plans and scripts.
48. Performance and stress tests are conducted by__.
A. Infrastructure People
B. Developers
C. Users
D. Database Administrators
Answer: D) Database Administrators
Explanation:
Performance and stress tests are conducted by Database Administrators.
49. What is the full form of UAT?
A. User Analyst Testing
B. User Acceptance Testing
C. User Attribute Testing
D. None
Answer: B) User Acceptance Testing
Explanation:
The full form of UAT is User Acceptance Testing.
50. Whenever data is moved into production systems, ___ tests are performed.
A. Production Validation
B. Source to Target
C. Metadata
D. Data Accuracy
Answer: A) Production Validation
Explanation:
Whenever data is moved into production systems, production validation tests are performed.
51. To ensure that the data don't compromise production systems, __ automates ETL
testing and management.
A. Informatica Data Validation
B. Irrelevant Data Validation
C. Informatica Duration Validation
D. Irrelevant Duration Validation
Answer: A) Informatica Data Validation
Explanation:
To ensure that the data don't compromise production systems, Informatica Data Validation
automates ETL testing and management.
52. Validating the data values transformed to the expected data values is done through
___ testing.
A. Source to target
B. Metadata
C. Data Accuracy
D. Data Transformation
Answer: A) Source to target
Explanation:
Validating the data values transformed to the expected data values is done through source-to-
target testing.
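A source-to-target check of this kind is usually expressed as a query over the joined source and target tables. Below is a minimal Python/SQLite sketch; the src and tgt tables and the "uppercase the name" rule are hypothetical stand-ins for a real mapping rule.

import sqlite3

# Minimal source-to-target test: verify that transformed values in the target
# match the values expected from the source (hypothetical tables and rule).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, name TEXT)")
conn.execute("CREATE TABLE tgt (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO src VALUES (?, ?)", [(1, "alice"), (2, "bob")])
conn.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, "ALICE"), (2, "BOB")])

# Transformation rule under test: target.name should equal UPPER(source.name).
mismatches = conn.execute("""
    SELECT s.id, s.name, t.name
    FROM src s JOIN tgt t ON s.id = t.id
    WHERE t.name <> UPPER(s.name)
""").fetchall()

assert not mismatches, f"transformation rule violated for rows: {mismatches}"
print("source-to-target check passed")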
53. Tests are automatically generated for ___, which saves test developers' time.
A. Data Accuracy
B. Data Transformation
C. Application Upgrades
D. Data Quality
Answer: C) Application Upgrades
Explanation:
Tests are automatically generated for Application Upgrades, which saves test developers’ time.
54. When an application is upgraded, the extracted data from the old application is
checked against the new application's data to ensure that they are ___.
A. Different
B. Identical
C. Similar
D. Varied
Answer: B) Identical
Explanation:
When an application is upgraded, the extracted data from the old application is checked
against the new application's data to ensure that they are identical.
55. As part of ___ testing, types of data, lengths of data, and indexes and constraints are
measured.
A. Metadata
B. Data Accuracy
C. Data Transformation
D. Data Quality
Answer: A) Metadata
Explanation:
As part of metadata testing, types of data, lengths of data, and indexes and constraints are
measured.
56. We test the data__ to ensure that data loading and transformation is accurate.
A. Accuracy
B. Transformation
C. Quality
D. None
Answer: A) Accuracy
Explanation:
We test the data accuracy to ensure that data loading and transformation is accurate.
57. Which of the following testing(s) is/are included in Data Quality Testing?
A. Syntax
B. Reference
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
The following testings are included in Data Quality Testing -
i. Syntax
ii. Reference
58. Invalid characters, invalid character patterns, or improper upper- or lower-case order
will result in dirty data being reported by __ tests.
A. Syntax
B. Reference
C. Accuracy
D. Transformation
Answer: A) Syntax
Explanation:
Invalid characters, invalid character patterns, or improper upper- or lower-case order will result
in dirty data being reported by syntax tests.
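As an illustration only, here is a tiny Python syntax test of this kind. The name pattern (one capital letter followed by lower-case letters) is a hypothetical rule, not a general standard.

import re

# Sketch of a syntax test: flag values with invalid characters, invalid character
# patterns, or improper upper/lower-case order (hypothetical column and rule).
VALID_NAME = re.compile(r"^[A-Z][a-z]+$")

names = ["Alice", "bOb", "Ca#rol", "Dave"]
dirty = [n for n in names if not VALID_NAME.fullmatch(n)]
print("dirty data reported by the syntax test:", dirty)   # ['bOb', 'Ca#rol']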
59. A data integrity test is conducted for __ testing when new data is added to old data.
A. Incremental ETL
B. GUI/Navigation
C. Migration
D. Report
Answer: A) Incremental ETL
Explanation:
A data integrity test is conducted for incremental ETL testing when new data is added to old
data.
60. After data has been inserted and updated during an incremental ETL process,
incremental testing verifies the system is ___ properly.
A. Deadlocked
B. Still Working
C. Crashed
D. Initiated
Answer: B) Still Working
Explanation:
After data has been inserted and updated during an incremental ETL process, incremental
testing verifies the system is still working properly.
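A minimal Python/SQLite sketch of such an incremental-load integrity check follows; the orders table and the delta batch are hypothetical.

import sqlite3

# Sketch: after new data is added to old data, verify that nothing was lost
# and that the keys stayed unique (hypothetical table).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])

count_before = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
delta = [(3, 30.0), (4, 40.0)]                       # the incremental batch
conn.executemany("INSERT INTO orders VALUES (?, ?)", delta)

count_after = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
distinct_keys = conn.execute("SELECT COUNT(DISTINCT order_id) FROM orders").fetchone()[0]

assert count_after == count_before + len(delta), "rows lost or duplicated by the load"
assert distinct_keys == count_after, "primary key uniqueness broken"
print("incremental load kept the system working properly")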
61. __ reports are tested for navigation and GUI aspects by GUI/Navigation Testing.
A. Front-end
B. Back-end
C. Both A and B
D. None of the above
Answer: A) Front-end
Explanation:
Front-end reports are tested for navigation and GUI aspects by GUI/Navigation Testing.
62. An existing data warehouse is used in ___ Testing, and ETL is used to process the data.
A. Migration
B. Report
C. Incremental ETL
D. GUI
Answer: A) Migration
Explanation:
An existing data warehouse is used in Migration Testing, and ETL is used to process the data.
63. Which of the following steps is/are included in Migration Testing?
A. Design and validation tests
B. Setting up the test environment
C. Executing the validation test
D. All of the above
Answer: D) All of the above
Explanation:
The following steps are included in Migration Testing -
i. Design and validation tests
ii. Setting up the test environment
iii. Executing the validation test
64.___ validation should be done for reports.
A. Data
B. Layout
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
Data validation and layout validation should be done for reports.
65. Which of the following task(s) is/are performed in ETL testing?
A. The ability to understand and report on data
B. Source-to-target mapping
C. Analyzes the source data for errors
D. All of the above
Answer: D) All of the above
Explanation:
The following tasks are performed in ETL testing -
i. The ability to understand and report on data
ii. Source-to-target mapping
iii. Analyzes the source data for errors
66. __ testing is typically performed on transactional systems, whereas __ testing is
typically performed on data in a data warehouse.
A. Database, ETL
B. ETL, Database
C. Database, ELT
D. ELT, Database
Answer: A) Database, ETL
Explanation:
Database testing is typically performed on transactional systems, whereas ETL testing is
typically performed on data in a data warehouse.
67. The following operations are involved in ETL testing:
A. Data movement validation between the source and target systems,
B. The source and target systems should be verified for data counts.
C. During ETL testing, the transformations and extractions are verified according to
requirements.
D. All of the above
Answer: D) All of the above
Explanation:
The following operations are involved in ETL testing:
i. Data movement validation between the source and target systems.
ii. The source and target systems should be verified for data counts.
iii. During ETL testing, the transformations and extractions are verified according to
requirements.
68. During ETL testing, __are verified to ensure that they are preserved.
A. Joins
B. Keys
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
During ETL testing, joins and keys are verified to ensure that they are preserved.
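One common way to verify that joins and keys survive the load is to look for rows whose foreign keys no longer join to their reference table. The sketch below uses Python with SQLite and a hypothetical fact_sales/dim_customer pair.

import sqlite3

# Sketch of a key/join preservation check after an ETL load (hypothetical star schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_customer (customer_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("CREATE TABLE fact_sales (sale_id INTEGER, customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO dim_customer VALUES (?, ?)", [(1, "Alice"), (2, "Bob")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(10, 1, 99.0), (11, 2, 45.0), (12, 3, 12.0)])   # customer 3 is missing

orphans = conn.execute("""
    SELECT f.sale_id, f.customer_id
    FROM fact_sales f LEFT JOIN dim_customer d ON f.customer_id = d.customer_id
    WHERE d.customer_id IS NULL
""").fetchall()
print("rows whose keys no longer join:", orphans)   # [(12, 3)]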
69. In database testing, the focus is on ensuring that data is__.
A. Accurate
B. Correct
C. Valid
D. All of the above
Answer: D) All of the above
Explanation:
In database testing, the focus is on ensuring that data is accurate, correct, and valid.
70. The following operations are performed during database testing:
A. During database testing, data values in columns are verified to ensure they are valid.
B. Tests are conducted on databases to determine whether primary or foreign keys are
maintained.
C. Testing the database verifies if the column has any missing data.
D. All of the above
Answer: D) All of the above
Explanation:
The following operations are performed during database testing:
i. During database testing, data values in columns are verified to ensure they are valid.
ii. Tests are conducted on databases to determine whether primary or foreign keys are
maintained.
iii. Testing the database verifies if the column has any missing data.
71. A performance test is conducted to determine if ETL systems can handle ___ at the
same time.
A. Multiple Users
B. Transactions
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
A performance test is conducted to determine if ETL systems can handle multiple users and
transactions at the same time.
72. A__ compares the data between a source system and a target system without
transforming the data in either system.
A. Value Comparison
B. Value Compression
C. Value Compromise
D. Value Contraction
Answer: A) Value Comparison
Explanation:
A value comparison compares the data between a source system and a target system without
transforming the data in either system.
73. The data accuracy of both the source and the target can be checked with a set of __
operators.
A. Relational
B. Rational
C. SQL
D. Database
Answer: C) SQL
Explanation:
The data accuracy of both the source and the target can be checked with a set of SQL
operators.
74. ___ the distinct values of critical data columns in the source and target systems is a
good way to verify the integrity of critical data columns.
A. Examining
B. Comparing
C. Differentiating
D. None of the above
Answer: B) Comparing
Explanation:
Comparing the distinct values of critical data columns in the source and target systems is a
good way to verify the integrity of critical data columns.
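The distinct-value comparison described above is typically written with SQL set operators. Here is a minimal Python/SQLite sketch; the src and tgt tables and the country column are hypothetical.

import sqlite3

# Sketch: compare the distinct values of a critical column between source and target.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (country TEXT)")
conn.execute("CREATE TABLE tgt (country TEXT)")
conn.executemany("INSERT INTO src VALUES (?)", [("IN",), ("US",), ("DE",)])
conn.executemany("INSERT INTO tgt VALUES (?)", [("IN",), ("US",)])

missing_in_target = conn.execute(
    "SELECT DISTINCT country FROM src EXCEPT SELECT DISTINCT country FROM tgt").fetchall()
extra_in_target = conn.execute(
    "SELECT DISTINCT country FROM tgt EXCEPT SELECT DISTINCT country FROM src").fetchall()

print("missing in target:", missing_in_target)   # [('DE',)]
print("extra in target:", extra_in_target)       # []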
75. A single SQL query cannot convert data because __.
A. It can compare the output with the target.
B. It can't compare the output with the target.
C. It can’t compare the input with the target.
D. It can compare the input with the target.
Answer: B) It can't compare the output with the target.
Explanation:
A single SQL query cannot convert data because it can't compare the output with the target.
76. Data Transformation ETL testing involves writing ___ SQL queries for each row in
order to confirm the rules of the transformation.
A. Two
B. Three
C. Four
D. Multiple
Answer: D) Multiple
Explanation:
Data Transformation ETL testing involves writing multiple SQL queries for each row in order to
confirm the rules of the transformation.
77. It is imperative that we select sufficient and representative data from the source
system to perform successful ___ testing.
A. Database
B. ETL
C. Both A and B
D. None of the above
Answer: B) ETL
Explanation:
It is imperative that we select sufficient and representative data from the source system to
perform successful ETL testing.
78. The document(s) that the ETL tester always uses during the testing process is/are:
A. ETL Mapping Sheets
B. DB Schema of Source (Target)
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
The document(s) that the ETL tester always uses during the testing process is/are
i. ETL Mapping Sheets
ii. DB Schema of Source (Target)
79. A___ contains all the columns and their lookups in reference tables for both source
and destination tables.
A. Mapping Sheet
B. DB Schema of Source
C. DB Schema of Target
D. None of the above
Answer: A) Mapping Sheet
Explanation:
A mapping sheet contains all the columns and their lookups in reference tables for both source
and destination tables.
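A mapping sheet is usually maintained as a spreadsheet, but it can also be represented as plain data so that checks can be driven from it. The sketch below is one hypothetical Python representation; all column, table, and lookup names are invented for illustration.

# Sketch of a mapping sheet as plain data: each entry maps a source column to a
# destination column plus an optional lookup/reference table (hypothetical names).
mapping_sheet = [
    {"source": "cust_name",  "target": "customer_name", "lookup": None},
    {"source": "country_cd", "target": "country_name",  "lookup": "ref_country"},
    {"source": "order_amt",  "target": "order_amount",  "lookup": None},
]

# A tester can derive simple structural checks straight from the sheet, e.g. the
# lists of columns that must exist in the source and destination schemas.
source_columns = [m["source"] for m in mapping_sheet]
target_columns = [m["target"] for m in mapping_sheet]
lookup_tables  = [m["lookup"] for m in mapping_sheet if m["lookup"]]
print(source_columns, target_columns, lookup_tables)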
80. Which of the following is/are the type(s) of ETL Bugs?
A. Calculation
B. User Interface
C. Load Condition
D. All of the above
Answer: D) All of the above
Explanation:
The following are the types of ETL Bugs -
i. Calculation
ii. User Interface
iii. Load Condition
81. __, spelling check, and other issues related to the Graphical User Interface of an
application are examples of User Interface bugs.
A. Color
B. Font Style
C. Navigation
D. All of the above
Answer: D) All of the above
Explanation:
Color, font style, navigation, spelling check, and other issues related to the Graphical User
Interface of an application are examples of User Interface bugs.
82. As a result of the ___ bug, invalid values are being taken by the application and valid
values are being rejected.
A. Input-output
B. Boundary value analysis
C. Calculation
D. Load Condition
Answer: A) input-output
Explanation:
As a result of the input-output bug, invalid values are being taken by the application and valid
values are being rejected.
83. Bugs that check for minimums and maximums are called __ bugs.
A. Calculation
B. Load Condition
C. Boundary value analysis
D. Race Condition
Answer: C) Boundary value analysis
Explanation:
Bugs that check for minimums and maximums are called boundary value analysis bugs.
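An illustrative Python sketch of boundary value analysis follows; the 1-100 quantity limits are hypothetical.

# Boundary value analysis for a field with hypothetical limits 1..100: test values
# sit exactly at, just inside, and just outside the minimum and maximum.
MIN_QTY, MAX_QTY = 1, 100

def quantity_is_valid(qty: int) -> bool:
    return MIN_QTY <= qty <= MAX_QTY

boundary_cases = [MIN_QTY - 1, MIN_QTY, MIN_QTY + 1, MAX_QTY - 1, MAX_QTY, MAX_QTY + 1]
for qty in boundary_cases:
    print(qty, "->", "accepted" if quantity_is_valid(qty) else "rejected")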
84. Mathematical errors show up in ___ bugs, and the results are usually inaccurate.
A. Load Condition
B. Race Condition
C. Hardware
D. Calculation
Answer: D) Calculation
Explanation:
Mathematical errors show up in calculation bugs, and the results are usually inaccurate.
85. Invalid or invalid types are produced by __ bugs.
A. Load Condition
B. Race Condition
C. Equivalence Class Partitioning
D. Version Control
Answer: C) Equivalence Class Partitioning
Explanation:
Invalid or invalid types are produced by Equivalence Class Partitioning bugs.
86. Regression Testing bugs do not indicate what version they came from, as they are
usually caused by___ bugs.
A. Help Source
B. Hardware
C. Version Control
D. Load Condition
Answer: C) Version Control
Explanation:
Regression Testing bugs do not indicate what version they came from, as they are usually
caused by Version Control bugs.
87. It is the ETL tester's responsibility to validate the __ and extract the data from the
target table.
A. Data Sources
B. Apply Transformation
C. Load the data into the target table
D. All of the above
Answer: D) All of the above
Explanation:
It is the ETL tester's responsibility to validate the data sources, apply transformation logic, load
the data into the target table, and extract the data from the target table.
88. ETL testers have the following responsibilities:
A. In the source system, verify the table.
B. Apply Transformation Logic
C. Data Loading
D. All of the above
Answer: D) All of the above
Explanation:
ETL testers have the following responsibilities:
i. In the source system, verify the table
ii. Apply Transformation Logic
iii. Data Loading
89. Following is/are the type(s) of operations involved in verifying the table in the source
system:
A. Count Check
B. Data Type Check
C. Remove Duplicate Data
D. All of the above
Answer: D) All of the above
Explanation:
Following are the types of operations involved in verifying the table in the source system:
i. Count Check
ii. Data Type Check
iii. Remove Duplicate Data
90. What is/are the advantage(s) of ETL Testing?
A. During ETL testing, data can be extracted from or received from any data source
simultaneously.
B. In ETL, heterogeneous data sources can be loaded into a single generalized
(frequent)/different target simultaneously.
C. It is possible to load different types of goals simultaneously using ETL.
D. All of the above
Answer: D) All of the above
Explanation:
The advantages of ETL Testing are -
i. During ETL testing, data can be extracted from or received from any data source
simultaneously.
ii. In ETL, heterogeneous data sources can be loaded into a single generalized
(frequent)/different target simultaneously.
iii. It is possible to load different types of goals simultaneously using ETL.
91. An __ procedure is capable of extracting business data from a variety of sources and
loading it into a different target as desired.
A. ETL
B. Database
C. Both A and B
D. None of the above
Answer: A) ETL
Explanation:
An ETL procedure is capable of extracting business data from a variety of sources and loading it
into a different target as desired.
92. What is/are the disadvantages of ETL Testing?
A. ETL testing has the disadvantage of requiring us to be database analysts or developers
with data-oriented experience.
B. On-demand or real-time access is not ideal when we need a fast response.
C. There will be a delay of months before any ETL testing can be done.
D. All of the above
Answer: D) All of the above
Explanation:
The disadvantages of ETL Testing are -
i. ETL testing has the disadvantage of requiring us to be database analysts or developers
with data-oriented experience.
ii. On-demand or real-time access is not ideal when we need a fast response.
iii. There will be a delay of months before any ETL testing can be done.
93. What is/are the requisites provided by ETL tools?
A. Multiple data structures and different platforms, such as mainframes, servers, and
databases, can be collected, read, and migrated using ETL tools.
B. Using an ETL tool is as easy as sorting, filtering, reformatting, merging, and joining data.
C.A few ETL tools support BI tools and functionality such as transformation scheduling,
monitoring, and version control.
D. All of the above
Answer: D) All of the above
Explanation:
The requisites provided by ETL tools are -
i. Multiple data structures and different platforms, such as mainframes, servers, and
databases, can be collected, read, and migrated using ETL tools.
ii. Using an ETL tool is as easy as sorting, filtering, reformatting, merging, and joining data.
iii. A few ETL tools support BI tools and functionality such as transformation scheduling,
monitoring, and version control.
94. What is/are the benefit(s) of ETL tools?
A. Ease of Use
B. Operational Resilience
C. Visual Flow
D. All of the above
Answer: D) All of the above
Explanation:
The benefits of ETL tools are -
i. Ease of Use
ii. Operational Resilience
iii. Visual Flow
95. Data engineers can develop a successful and well-instrumented system with ETL tools
that have ___ error handling.
A. Artificial
B. Built-in
C. Natural
D. None
Answer: B) Built-in
Explanation:
Data engineers can develop a successful and well-instrumented system with ETL tools that have
built-in error handling.
96. An ETL tool simplifies the task of ___and integrating multiple data sets when dealing
with complex rules and transformations.
A. Calculating
B. String Manipulation
C. Changing Data
D. All of the above
Answer: D) All of the above
Explanation:
AETL tool simplifies the task of calculating, string manipulating, changing data, and integrating
multiple data sets when dealing with complex rules and transformations.
97. ____ can be simplified using ETL tools.
A. Extraction
B. Transformation
C. Loading
D. All of the above
Answer: D) All of the above
Explanation:
Extraction, transformation, and loading can be simplified using ETL tools.
98. Which of the following is/are the type(s) of ETL tools?
A. Talend Data Integration
B. Informatica
C. Kettle
D. All of the above
Answer: D) All of the above
Explanation:
The following are the types of ETL tools -
i. Talend Data Integration
ii. Informatica
iii. Kettle
99. Which of the following is a cloud-based tool?
A. Clover ETL
B. AWS Glue
C. Jasper ETL
D. Kettle
Answer: B) AWS Glue
Explanation:
Among the given tools, AWS Glue is the cloud-based tool.
100. The ETL tool function is a ___-layered structure.
A. One
B. Two
C. Three
D. Four
Answer: C) Three
Explanation:
The ETL tool function is a three-layered structure.
101. Which of the following is/are the layer(s) in the ETL tool function?
A. Staging Layer
B. Data Integration Layer
C. Access Layer
D. All of the above
Answer: D) All of the above
Explanation:
The following are the layers in the ETL tool function -
i. Staging Layer
ii. Data Integration Layer
iii. Access Layer
102. Data extracted from different sources is storedina___.
A. Staging database
B. Staging layer
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
Data extracted from different sources is stored in a staging database or staging layer.
103. A database is created from the data transformed by the ___ Layer.
A. Data
B. Staging
C. Integration
D. Access
Answer: C) Integration
Explanation:
A database is created from the data transformed by the Integration Layer.
104. Facts and aggregate facts are grouped into hierarchical groups in the database
referred to as ___.
A. Dimensions
B. Data
C. Dataset
D. Deadlock
Answer: A) Dimensions
Explanation:
Facts and aggregate facts are grouped into hierarchical groups in the database referred to as
dimensions.
105. Which of the following is TRUE about RightData?
A. An online tool for testing ETL/Data integration, RightData is available as a self-service
program.
B. Data can be validated and coordinated between datasets despite differences in data
models or types of sources with RightData's interface.
C. Data platforms with high complexity and large volumes require RightData to work
efficiently.
D. All of the above
Answer: D) All of the above
Explanation:
The following are TRUE about RightData -
i. An online tool for testing ETL/Data integration, RightData is available as a self-service
program.
ii. Data can be validated and coordinated between datasets despite differences in data
models or types of sources with RightData's interface.
iii. Data platforms with high complexity and large volumes require RightData to work
efficiently.
106. __ testing can be done with the QuerySurge tool.
A. Data Warehouse
B. Big Data
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
Data Warehouse and Big Data testing can be done with the QuerySurge tool.
107. Which of the following is/are the feature(s) of QuerySurge?
A. A big data testing and ETL testing tool, QuerySurge automates the testing process.
B. It automates the manual process and schedules tests for a specific date and time.
C. Using this tool, you can create test scenarios and test suites along with configurable
reports without knowing SQL.
D. All of the aboveAnswer: D) All of the above
Explanation:
The following are the features of QuerySurge -
i. A big data testing and ETL testing tool, QuerySurge automates the testing process.
ii. It automates the manual process and schedules tests for a specific date and time.
iii. Using this tool, you can create test scenarios and test suites along with configurable
reports without knowing SQL.
108. Data-centric projects, such as __, etc., require automated ETL testing tools such as
iCEDQ.
A. Warehouses
B. Data migrations
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
Data-centric projects, such as warehouses, data migrations, etc., require automated ETL testing
tools such as iCEDQ.
109. Sources and systems are__ by iCEDQ.
A. Verified
B. Validated
C. Coordinated
D. All of the above
Answer: D) All of the above
Explanation:
Sources and systems are verified, validated, and coordinated by iCEDQ.
110. What is/are the feature(s) of iCEDQ?
A. We use iCEDQ to compare millions of files and rows of data when we do ETL testing.
B. As a result, it is possible to identify exactly which columns and rows contain data errors.
C. iCEDQ compares the data in memory based on the unique columns in the database.
D. All of the above
Answer: D) All of the above
Explanation:
The features of iCEDQ are -
i. We use iCEDQ to compare millions of files and rows of data when we do ETL testing.
ii. As a result, it is possible to identify exactly which columns and rows contain data errors.
iii. iCEDQ compares the data in memory based on the unique columns in the database.
111. ETL and end-to-end testing are offered by QualiDI's__ testing platform.
A. Non-automated
B. Semi-automated
C. Automated
D. None of the above
Answer: C) Automated
Explanation:
ETL and end-to-end testing are offered by QualiDI's automated testing platform.
112. What is/are the feature(s) of QualiDI?
A. Test cases can be created automatically with QualiDI, and the automated data can be
compared with the manual data
B. Continuous integration is supported
C. Featuring a complex testing cycle, eliminating human error, and managing data quality,
QualiDI manages complex BI testing cycles.
D. All of the above
Answer: D) All of the above
Explanation:
The features of QualiDI are -
i. Test cases can be created automatically with QualiDI, and the automated data can be
compared with the manual data.
ii. Continuous integration is supported.
iii. Featuring a complex testing cycle, eliminating human error, and managing data quality,
QualiDI manages complex BI testing cycles.
113. A data migration pipeline involves __ data from an input source, transforming it,
and loading it into an output destination for analysis, reporting, and synchronization
(such as a datamart, database, and data warehouse).
A. Adding
B. Deleting
C. Extracting
D. Modifying
Answer: C) Extracting
Explanation:
A data migration pipeline involves extracting data from an input source, transforming it, and
loading it into an output destination for analysis, reporting, and synchronization (such as a
datamart, database, and data warehouse).
114. What is/are TRUE about ETL Pipelines?
A. In addition to enterprise data warehouses, subject-specific data marts are also built using
ETL pipelines.
B. As a data migration solution, ETL pipelines are also used when replacing traditional
applications with new ones.
C. Industry-standard ETL tools are usually used to construct ETL pipelines that transform
structured data.
D. All of the above
Answer: D) All of the above
Explanation:
The things TRUE about ETL Pipelines are -
i. In addition to enterprise data warehouses, subject-specific data marts are also built using
ETL pipelines.
ii. As a data migration solution, ETL pipelines are also used when replacing traditional
applications with new ones.
iii. Industry-standard ETL tools are usually used to construct ETL pipelines that transform
structured data.
115. Using data pipelines, one can __, create real-time data streaming applications,
conduct data mining, and build data-driven digital products.
A. Integrate data across applications
B. Build data-driven web products
C. Build predictive models
D. All of the above
Answer: D) All of the above
Explanation:
Using data pipelines, one can integrate data across applications, build data-driven web
products, build predictive models, create real-time data streaming applications, conduct data
mining, and build data-driven digital products.
116. Logs of ETL contain information about disk access, ___, and the Microsoft Operating
System's performance. They also record high-frequency events.
A. Page Initials
B, Page faults
C. Pagination
D. Page rows
Answer: B) Page faults
Explanation:
Logs of ETL contain information about disk access, page faults, and the Microsoft Operating
System's performance. They also record high-frequency events.
117. ___ files are also used by the Eclipse Open Development Platform.
A. .psd
B. .etl
C. .pdt
D. .png
Answer: B) .etl
Explanation:
.etl files are also used by the Eclipse Open Development Platform.
118. What is/are TRUE about Trace Logs?
A. By default, trace providers generate trace logs in their trace session buffers, which are
stored by the operating system.
B. A compressed binary format is then used to store trace logs in a log.
C. Both A and B
D. None of the above
Answer: C) Both A and B
Explanation:
The things TRUE about Trace Logs are -
i. By default, trace providers generate trace logs in their trace session buffers, which are
stored by the operating system.
ii. A compressed binary format is then used to store trace logs in a log.
119. A product with an ETL___ Mark has been independently tested to meet the
applicable standard.
A. Data
B. Listed
C. Stamp
D. None
Answer: B) Listed
Explanation:
A product with an ETL Listed Mark has been independently tested to meet the applicable
standard.
120. In order to maintain the certification for products with ETL listed marks, regular
product and site inspections are conducted to ensure that the product is manufactured
and matches the __ product.
A. Corrupted
B. Original
C. Copy
D. Artificial
Answer: B) Original
Explanation:
In order to maintain the certification for products with ETL-listed marks, regular product and
site inspections are conducted to ensure that the product is manufactured and matches the
original product.
121. Operation(s) that is/are performed in Database Testing?
A. Validating the values of columns in a table is the focus of database testing.
B. Database testing is used to ensure the foreign key or primary key is maintained.
C. During database testing, it is verified whether there is any missing data in a column.
D. All of the above
Answer: D) All of the above
Explanation:
Operations that are performed in Database Testing -
i. Validating the values of columns in a table is the focus of database testing.
ii. Database testing is used to ensure the foreign key or primary key is maintained.
iii. During database testing, it is verified whether there is any missing data in a column.
122. Which of the following tasks is NOT involved in ETL Transformation Process?
A. Filtering
B. Cleaning
C. Joining
D. Addressing
Answer: D) Addressing
Explanation:
Addressing is NOT a task involved in the ETL Transformation Process.
123. The following point(s) is/are involved in building streaming ETL based on Kafka:
A. Extracting data into Kafka
B. Pulling data from Kafka
C. Load data to other systems
D. All of the above
Answer: D) All of the above
Explanation:
The following points are involved in building streaming ETL based on Kafka:
i. Extracting data into Kafka
ii. Pulling data from Kafka
iii. Load data to other systems
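A minimal sketch of those three steps is shown below. It assumes the third-party kafka-python package and a broker reachable at localhost:9092; the topic name and record shape are hypothetical, and none of these details come from the question itself.

import json
from kafka import KafkaProducer, KafkaConsumer   # third-party kafka-python package (assumed installed)

BROKER, TOPIC = "localhost:9092", "raw_orders"    # hypothetical broker address and topic

# 1. Extract: push source records into Kafka.
producer = KafkaProducer(bootstrap_servers=BROKER,
                         value_serializer=lambda v: json.dumps(v).encode("utf-8"))
producer.send(TOPIC, {"order_id": 1, "amount": "120.50"})
producer.flush()

# 2. Pull data from Kafka, transform it, and 3. load it into another system.
consumer = KafkaConsumer(TOPIC, bootstrap_servers=BROKER,
                         auto_offset_reset="earliest",
                         value_deserializer=lambda v: json.loads(v.decode("utf-8")))
for message in consumer:
    record = message.value
    record["amount"] = float(record["amount"])       # simple transformation
    print("would load into target system:", record)  # stand-in for the load step
    break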
124. The following operation(s) is/are performed in Applying Transformation Logic -
A. Before and after checking the count record, transformation logic is applied.
B. The intermediate table must be validated as the data flows from the staging area
C. Make sure the thresholds for the data are valid
D. All of the above
Answer: D) All of the above
Explanation:
Following operations are performed in Applying Transformation Logic-
i. Before and after checking the count record, transformation logic is applied.
ii. The intermediate table must be validated as the data flows from the staging area.
iii. Make sure the thresholds for the data are valid.
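A minimal Python sketch of the count-before/after and threshold checks listed above follows; the staging rows, the price cast, and the 0-1000 threshold are hypothetical.

# Sketch of the checks around a toy transformation step (hypothetical data and rule).
staging_rows = [{"id": 1, "price": "10"}, {"id": 2, "price": "25"}, {"id": 3, "price": "-5"}]

count_before = len(staging_rows)

# Transformation logic: cast price to a number and drop rows outside the valid threshold.
transformed = [{"id": r["id"], "price": float(r["price"])} for r in staging_rows]
valid = [r for r in transformed if 0 <= r["price"] <= 1000]

count_after = len(valid)

# Check the record counts before and after the transformation is applied.
print(f"records before: {count_before}, after threshold filter: {count_after}")
# Validate the intermediate data: every surviving price must be inside the threshold.
assert all(0 <= r["price"] <= 1000 for r in valid)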
125. ____ the data is loaded, transformation logic is applied.
A. Before
B. After
C. While
D. None
Answer: A) Before
Explanation:
Before the data is loaded, transformation logic is applied.