OperationalExcellence Vfinal V11
Table of Contents
Document Control
1. Glossary
2. Introduction
1. Glossary
Term Description
DMP Data Marketplace
GSB Government Service Bus
NDB National Data Bank
NDC National Data Catalog
NDI National Data Index
NDL National Data Lake
NDMO National Data Management Office
NIC National Information Center
ODP Open Data Platform
OE Operational Excellence
RDP Reference Data Platform
SDAIA Saudi Data and Artificial Intelligence Authority
2. Introduction
The Saudi Data and Artificial Intelligence Authority (SDAIA) is on a mission to develop a
data-driven economy and raise data literacy across government agencies to improve
operational efficiency and decision-making in the Kingdom of Saudi Arabia (KSA). To this
end, SDAIA has launched the National Data Index (NDI) framework, which aims at measuring
the efforts and progress made in transforming data into a vital economic resource for
unlocking innovation, driving economic growth and transformation, and improving national
competitiveness in an organized and accelerated manner.
NDI encompasses the domains prescribed by the National Data Management Office (NDMO)
Data Management and Personal Data Protection Framework and is composed of three
essential components: Compliance, Maturity, and Operational Excellence (OE). Each
government entity will be measured on a regular basis across these three components.
This document collates the commonly asked questions pertaining to OE in a single place. It
provides clear and concise answers to these questions to help stakeholders obtain further
clarity about the OE metrics.
What is the relationship of OE with the National Data Index (NDI)?
NDI consists of three essential components: Compliance, Maturity, and OE. OE measures the
entities' operational performance on the National Data Platforms, which include the following:
• Government Service Bus (GSB): A central platform that enables government
agencies in the Kingdom to integrate and interconnect in order to share standardized
transactional data on demand, facilitating government business processes.
• National Data Lake (NDL): A reliable central repository for preserving, processing, and
cleansing national data and then sharing it securely with the beneficiaries to enable
them to build decision-support platforms.
• Collaborative Data Labs (CDL): Secure labs that enable employees of government
agencies to access the data hosted in the National Data Lake through an
environment equipped with the latest data analytics and AI technologies to discover,
explore, and analyze the data and generate the required insights and reports.
• Data Marketplace (DMP): A platform that aims at automating the data-sharing
processes across the agencies in the Kingdom. It enables them to browse the
data-sharing services and subscribe to what suits them in an automated manner in
accordance with the national data governance policies.
• National Data Catalog (NDC): A platform that serves as an inventory of the metadata
of the government agencies’ systems, with the definitions of their Key Performance
Indicators (KPIs) and metrics and the list of business-critical data fields linked with
their certified sources, in addition to the data policies and standards.
• Reference Data Platform (RDP): A platform that offers extensive curation
capabilities for standardizing, classifying, and defining ownership of reference data
at the national level across government agencies, ensuring the completeness,
accuracy, and consistency of the available reference data.
• Open Data Platform (ODP): A platform that enables individuals, government, and non-
government agencies to publish their open data and make it available to end users,
such as entrepreneurs, enabling them to develop innovative products that contribute
to building a digital economy in the Kingdom.
How many metrics are targeted for the first year/round of the assessment
and what is the plan for the remaining metrics?
For the first year, six metrics are targeted which are listed below:
DO.OE.03: Operational issues from entities encountered by NDL (Platform: NDL)
also help agencies to focus on the areas of improvement. However, it is important to note
that the weights for the metrics may be changed in subsequent rounds; they will remain
unchanged for the current round.
Is there any requirement for submitting proof or evidence from the agency
pertaining to OE metrics?
No, agencies are not expected to submit any materials. The National Data Platforms shall
be used to collect the evidence and data required for progress measurement.
System     Integration Implemented?   Following Integration Method?   Data Updated Regularly?
System 1   Yes                        Yes                             Yes
System 2   Yes                        No                              Yes
System 3   Yes                        Yes                             No
System 4   Yes                        Yes                             Yes
System 5   No                         No                              No
Based on the integration status above, only System 1 and System 4 are considered
integrated and hence the metric score is calculated as follows:
Score = (Fully Integrated Systems / Total Number of Systems) * 100 = 2/5 * 100 = 40%
Scale Interval = Unacceptable
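The calculation above can be sketched as follows (a minimal illustration using the worked example's data; a system counts as fully integrated only when all three conditions are "Yes"):

```python
# Integration-score sketch based on the worked example above.
systems = [
    # (name, integration implemented, following method, data updated regularly)
    ("System 1", True,  True,  True),
    ("System 2", True,  False, True),
    ("System 3", True,  True,  False),
    ("System 4", True,  True,  True),
    ("System 5", False, False, False),
]

# A system is fully integrated only when all three conditions hold.
fully_integrated = sum(
    1 for _, impl, method, updated in systems if impl and method and updated
)
score = fully_integrated / len(systems) * 100
print(f"Score = {score:.0f}%")  # 2 of 5 systems -> 40%
```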
API     Number of Calls on the API   Number of Failed Calls on the API
API 1   100                          3
API 2   80                           5
API 3   150                          6
API 4   200                          7
API 5   30                           7
Score = (Total Number of Failed Calls on All APIs / Total Number of Calls on All APIs) * 100
= (3+5+6+7+7) / (100+80+150+200+30) * 100 = (28/560) * 100 = 5%
Scale Interval = Unacceptable
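The same failure-rate calculation can be reproduced with a short sketch (illustrative only, using the call counts from the table above):

```python
# API failure-rate sketch based on the worked example above.
api_calls = {
    # name: (total calls, failed calls)
    "API 1": (100, 3),
    "API 2": (80, 5),
    "API 3": (150, 6),
    "API 4": (200, 7),
    "API 5": (30, 7),
}

total_calls = sum(calls for calls, _ in api_calls.values())
total_failed = sum(failed for _, failed in api_calls.values())
score = total_failed / total_calls * 100
print(f"Score = {score:.0f}%")  # 28 / 560 -> 5%
```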
If an execution cycle is triggered and the job fails, the NDL operations team retries it multiple
times to complete the data acquisition. If two or more errors occur due to an issue on the
agency side in that cycle, it is considered an operational issue. It is important to note that
multiple failures in an execution cycle are counted as only one issue. For example, suppose
that the data acquisition job is scheduled for a daily run, and it fails to connect with the
agency's staging database designated for NDL due to database unavailability. The NDL
operations team will re-run it, and if the job fails to complete successfully two or more
times, it will be classified as an operational issue and counted as one failure for that
execution cycle. Note that a maximum of one operational issue will be counted per table
per day. This metric considers only the issues arising from the agency side (e.g.,
database unavailability, altering the data types or column names, missing table).
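The counting rule described above can be sketched as a small helper (a hedged illustration; the function name and signature are hypothetical, not part of any NDL tooling):

```python
# Sketch of the issue-counting rule: within one execution cycle (one table,
# one day), two or more agency-side failed tries count as exactly one
# operational issue; fewer than two count as none. At most one issue is
# therefore recorded per table per day.

def operational_issues(failed_tries_in_cycle: int) -> int:
    """Return the operational issues recorded for a single table/day cycle."""
    return 1 if failed_tries_in_cycle >= 2 else 0

# A job that failed twice (even if it eventually succeeded) yields one
# issue, while a single transient failure yields none.
print(operational_issues(3))  # 1
print(operational_issues(1))  # 0
```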
Table     Day              Total Execution Tries   Failed Execution Tries   Final Status   Operational Issues Count
Table 1   Day 1            3                       3                        Failed         1
Table 1   Day 2            2                       1                        Successful     0
Table 1   Day 3            3                       2                        Successful     1
Table 1   Day 4 – Day 29   26                      0                        Successful     0
Table 1   Day 30           3                       3                        Failed         1
Table 2   Day 1            2                       1                        Successful     0
Table 2   Day 2            2                       1                        Successful     0
Table 2   Day 3 – Day 30   28                      0                        Successful     0
Failure Percentage = Total Failed Pipeline Executions / Total Pipeline Executions * 100
= 3 / 60 * 100 = 5%
Scale Interval = Fair
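Recomputing the failure percentage from the worked example: each table runs one cycle per day for 30 days (60 cycles in total), and three of those cycles produced an operational issue. A minimal sketch:

```python
# Failure-percentage sketch based on the execution table above.
cycles = [
    # (table, daily cycles in the month, cycles counted as operational issues)
    ("Table 1", 30, 3),  # issues on Day 1, Day 3, and Day 30
    ("Table 2", 30, 0),
]

total_cycles = sum(days for _, days, _ in cycles)
issue_cycles = sum(issues for _, _, issues in cycles)
failure_percentage = issue_cycles / total_cycles * 100
print(f"Failure Percentage = {failure_percentage:.0f}%")  # 3 / 60 -> 5%
```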
Score = Number of Datasets Published in ODP / (Total Number of Datasets Required to be
Published / 2) * 100 = 30 / (80/2) * 100 = 75%
Scale Interval = Low
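The formula above divides by half of the required total, which suggests a first-year target of publishing 50% of the required datasets (an inference from the formula, not stated explicitly here). A minimal recomputation:

```python
# ODP publication-score sketch based on the worked example above.
published = 30        # datasets published in ODP
required_total = 80   # total datasets required to be published

# The denominator is halved, i.e. the first-year target is 50% of the total.
score = published / (required_total / 2) * 100
print(f"Score = {score:.0f}%")  # 30 / 40 -> 75%
```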
Can you show an example of how the metric MCM.OE.01 is calculated?
Suppose the agency has six critical systems and one of them (i.e., System 5) cannot be
cataloged due to technical difficulties in extracting its technical metadata from its
databases. Now, suppose the agency has provided the technical metadata for only four of
the remaining five systems (i.e., System 1, System 2, System 4, and System 6).
System     Technical Metadata Provided?
System 1   Yes
System 2   Yes
System 3   No
System 4   Yes
System 5   N/A
System 6   Yes
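A sketch of how this score would presumably be computed, assuming the N/A system is excluded from the denominator, following the pattern of the other metrics in this document (the resulting 80% figure is an inference, not stated in the source):

```python
# MCM.OE.01 sketch: cataloged systems over catalogable (non-N/A) systems.
statuses = {
    "System 1": "Yes",
    "System 2": "Yes",
    "System 3": "No",
    "System 4": "Yes",
    "System 5": "N/A",  # excluded: technical metadata cannot be extracted
    "System 6": "Yes",
}

eligible = [s for s in statuses.values() if s != "N/A"]
score = sum(s == "Yes" for s in eligible) / len(eligible) * 100
print(f"Score = {score:.0f}%")  # 4 of 5 eligible systems -> 80%
```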
Physical Column Name   Attribute Business Name   Attribute Definition                             Business Constraint
Dt_of_birth            Date of Birth             The date on which the person was born            • It must be less than the current date
                                                                                                  • It cannot have a future date
Blood_grp_cd           Blood Group Code          The numerical code assigned to the blood group   • Codes must be from the defined list of blood groups
System     Number of Technical Columns   Number of Required Business Attributes   Number of Defined Business Attributes
System 1   270                           27                                       21
System 2   1080                          108                                      92
System 3   360                           36                                       29
The completion percentage of System 1 is calculated as follows = Number of Defined
Business Attributes / Number of Required Business Attributes * 100 = 21 / 27 * 100 = 77%
Similarly, the completion percentages of System 2 and System 3 are 85% and 80%,
respectively. Hence, the overall score of the metric is calculated as the average of the
completion percentages of the three systems as follows:
Score = (77% + 85% + 80%) / 3 ≈ 81%
Note that agencies are not required to link the business attributes with technical columns
for the first year.
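The per-system and overall calculations above can be reproduced with a short sketch (illustrative only, using the counts from the table):

```python
# Business-attribute completion sketch based on the table above.
systems = {
    # name: (required business attributes, defined business attributes)
    "System 1": (27, 21),
    "System 2": (108, 92),
    "System 3": (36, 29),
}

# Each system's completion is defined/required; the metric averages them.
completions = [
    defined / required * 100 for required, defined in systems.values()
]
overall = sum(completions) / len(completions)
print(f"Overall score = {overall:.0f}%")  # -> 81%
```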