A BEGINNER’S GUIDE TO DATA LITERACY
Sai Ganesh O
Solutions Architect
Rackspace Technology
Abstract
Today’s organizations are challenged to find new and innovative ways to improve their business
processes, customer experience, and competitive advantage. Currently, that typically means
leveraging transformative technologies to achieve those goals. However, choosing the “right”
technologies has left many companies in the dust as they try to maintain their relevance in an ever-
changing digital environment. When it comes to data modernization, most companies come face-to-
face with four main challenges, which are examined in the sections that follow.
Data literacy and data modernization are becoming increasingly vital for organizations of all sizes
and industries. As data and analytics strategies become central to every aspect of digital business, it
is essential for organizations to have a workforce that can understand, communicate, and work with
data effectively. Data literacy refers to the ability to understand and have meaningful conversations
about data; it enables organizations to adopt existing and emerging technologies seamlessly. It is
the key to leveraging business intelligence and making informed decisions.
Gartner defines data literacy as the ability to read, write and communicate data in context, including
an understanding of data sources and constructs, analytical methods and techniques applied, and
the ability to describe the use case, application and resulting value. With data literacy, organizations
can effectively extract valuable insights from data, and make better decisions that can drive results.
On the other hand, data modernization is the process of moving siloed data from legacy databases
to modern cloud-based databases or data lakes. This process is essential for effectively utilizing
massive quantities of data, which may otherwise be unusable, and turning it into actionable insights
for driving results. By moving data from outdated databases to modern ones, organizations can
become more flexible and streamlined, eliminating inefficiencies, delays, and complexities
associated with older systems.
As we know, Artificial Intelligence (AI) has made significant strides, but it still needs a data-savvy
workforce that understands how to use data to gain meaningful insights for effective decision
making. This has led to a whole new set of data engineering practices known as DataOps, whose
sole purpose is to help businesses derive value from their data with improved productivity and
agility. Data Intelligence and Artificial Intelligence are closely related: Data Intelligence deals with
the extraction, transformation, and storing of data and is the first step in the process, while
Artificial Intelligence works on the processed data and can therefore only function after the raw
data has been engineered.
Organizations now make use of analytics to drive decision-making and to understand their
businesses, markets, and customers in the best feasible way. Data Intelligence gives you the insight
you need to understand how your processes work; as we know, it is not easy to derive favorable
results from siloed and legacy systems that require extensive maintenance and management. Aside
from its increased accessibility and utility, analytics in the cloud also offloads many IT demands, such
as hosting and maintaining servers, to cloud service providers, enabling businesses to spend less time
and money on managing their infrastructure and instead focus on bolstering their staff and product.
In this article, we will discuss the importance of data literacy for businesses, the role of an intelligent
data cloud in daily operations, and the significance of data center modernization in unlocking more
value. Additionally, we will examine how utilizing Artificial Intelligence as a service (AIaaS) can
benefit your business. Businesses that invest in data literacy, data center modernization and modern
data infrastructure can improve their ability to make informed decisions, drive results, and gain a
competitive edge.
Introduction
A modern data architecture is essential for harnessing the intelligence that drives insights and action
in today's enterprise. Data modernization is a key component of this, allowing businesses to improve
their processes by making them faster, smarter, and more efficient using newer technologies.
Data modernization includes updating an organization's data infrastructure by moving siloed data
from traditional databases to more advanced and efficient cloud-based databases or data lakes. This
is crucial for effectively utilizing enormous amounts of data, which may otherwise be unusable, and
turning it into actionable insights for driving results. As a result of data modernization, organizations
become more flexible and streamlined, eliminating inefficiencies, delays, and complexities
associated with older systems. In simpler terms, data modernization is the process of migrating data
from outdated databases to modern ones.
Before delving into the topic, it is important to understand the significance of data literacy for
organizations. Data literacy, as defined by Gartner, refers to the ability to read, write, and
communicate data in context, including knowledge of data sources and structures, analytical
methods, and the ability to describe the use and value of data. With the increasing amount of data
being generated and collected daily, it is becoming increasingly important for individuals and
organizations to be able to make sense of this information and use it to make informed decisions.
One of the main reasons data literacy is important is that it allows individuals and organizations to
make better decisions. By being able to understand and analyze data, they can identify patterns and
trends, make predictions, and gain a deeper understanding of the issues they are facing. This can
lead to more efficient and effective decision making, which can have a significant impact on a
business or organization's bottom line. It also allows for better communication of data and its value
to non-technical stakeholders and enables organizations to understand how to use data to drive
growth and innovation. Data literacy, therefore, is not just about understanding numbers and
statistics; it is also about understanding the context and the meaning behind the data. This includes
understanding the sources of data, the assumptions and biases that might be present, and the
limitations of the data.
In practical terms, this means that businesses need to have a comprehensive understanding of the
data that is available to them, including how it can be used, what its limitations may be, and how to
effectively combine and enrich it with information from other sources. They must be familiar with
the concepts of geospatial context, and how it can be applied to generate deeper insights and drive
better decisions. Business users should also understand the importance of proactively managing
data quality to ensure accuracy and reliability.
However, there are several barriers that can impede an organization’s ability to achieve data
literacy. One major barrier is the lack of a skilled workforce that understands the basic concepts
of data, including the types of data available, the types of analysis that can be applied, how to
maintain data hygiene, and the various tools, techniques, and frameworks that can be leveraged. To
overcome this barrier, organizations can create a data literacy program that provides essential tools
and training to employees organization-wide, ensuring that they have the necessary skills to reach
the desired level of data literacy for their job.
Another barrier to achieving data literacy is poor data quality. Data quality tends to erode over time
due to factors such as customer relocation, name changes, and data entry errors. To overcome this
barrier, businesses must proactively manage data quality using enterprise-grade tools.
Data silos, or a lack of integration between different data systems, are another barrier to achieving
data literacy. Reliable, scalable, and flexible data integration tools can help organizations overcome this
barrier by allowing them to adapt as new systems and data sources are brought online.
Finally, missing context can also be a barrier to achieving data literacy. Data without context has
little value, so it is important for organizations to ensure that data is enriched with context to gain a
complete understanding of their customers, target audience, and other key stakeholders.
1. The Data Team
Firstly, it is essential to have a clear understanding of the key members of the data team and the
roles they play within an organization. The data team can be structured in various ways, depending
on the organization’s size and the level of significance placed on the use of data in day-to-day
activities. Ideally, most data teams include:
• Data scientists – who leverage advanced mathematics, programming, and tools to conduct
and manage large-scale analyses
• Data engineers – who are responsible for building and maintaining datasets that are
leveraged in data projects
• Data analysts – who conduct most of the analyses an organization requires
2. Data Discovery
Data exploration and discovery is a crucial step in developing data literacy. It empowers
professionals such as data scientists and business analysts to blend and examine various types of
data and analytics to identify ways to improve business operations and uncover new business
opportunities. It enables them to explore different types of data and analytics before implementing
any solutions in a business intelligence and decision-making environment. Data discovery provides a
new and improved method for designing and constructing analytic solutions. However, various
factors must be considered when assessing the use of data for a particular data discovery project
and choosing the best data discovery platform and tools.
Data Analysis
Data analysis is the process of examining and interpreting data to gain insights and understanding.
This can be accomplished using a variety of techniques, including statistical models, algorithms, and
other complex tools and frameworks, or it can be as simple as reviewing the data and drawing
conclusions. There are many different types of data analysis, and four of the most common are:
• Descriptive analysis, which seeks to describe and summarize what has happened
• Diagnostic analysis, which seeks to understand and explain why something has happened
• Predictive analysis, which uses data to make forecasts about what may happen in the future
• Prescriptive analysis, which recommends a course of action to achieve a specific outcome.
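To make these four types concrete, here is a minimal Python sketch, written for this guide, that contrasts descriptive analysis (summarizing what has happened) with a naive predictive analysis (forecasting what may happen). The monthly revenue figures and column names are illustrative assumptions only.

import numpy as np
import pandas as pd

# Hypothetical monthly revenue figures (illustrative values only)
sales = pd.DataFrame({
    "month": pd.date_range("2022-01-01", periods=12, freq="MS"),
    "revenue": [110, 115, 120, 118, 125, 130, 128, 135, 140, 138, 145, 150],
})

# Descriptive analysis: summarize what has happened
print("Average monthly revenue:", sales["revenue"].mean())
print("Best month:", sales.loc[sales["revenue"].idxmax(), "month"].strftime("%Y-%m"))

# Predictive analysis: a naive forecast of next month's revenue from a linear trend
trend = np.polyfit(range(len(sales)), sales["revenue"], deg=1)
print("Forecast for next month:", round(float(np.polyval(trend, len(sales))), 1))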
3. Data Wrangling
Data wrangling is an essential step in the data preparation process that involves transforming raw,
unstructured data into a format that is ready for analysis and visualization. It is also commonly
known as data cleaning, as it involves a series of tasks that aim to remove errors, inconsistencies,
and outliers, and fill in missing values to make the data more accurate and reliable. Data wrangling
can take many forms and can be done through various tools and techniques, depending on the data
source, format, and quality.
The most common examples include data validation, data transformation, data integration, data
normalization, and data aggregation. It is a critical step in reducing errors and inaccuracies in the
analysis that typically follows it. In many organizations, data wrangling is done automatically through
various algorithms and other tools, but every employee responsible for generating, capturing, or
uploading data also plays a role in ensuring it meets the organization’s requirements. This process
ensures the integrity and consistency of the data, making it more valuable and usable for decision
making and strategic planning.
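As a hedged illustration of what these wrangling tasks can look like in practice, the short pandas sketch below deduplicates, validates, and fills a small customer table. The column names, rules, and values are assumptions made for this example rather than data from any particular organization.

import pandas as pd

# Hypothetical raw customer records containing common quality problems
raw = pd.DataFrame({
    "customer_id": [1, 2, 2, 3, 4],
    "email": ["a@example.com", "b@example.com", "b@example.com", None, "not-an-email"],
    "age": [34, 29, 29, 41, 400],  # 400 is an obvious data entry error
})

# Remove duplicate records and mask ages outside a plausible range
clean = raw.drop_duplicates(subset="customer_id").copy()
clean["age"] = clean["age"].where(clean["age"].between(0, 120))

# Fill missing values and flag invalid email addresses instead of dropping rows
clean["age"] = clean["age"].fillna(clean["age"].median())
clean["email_valid"] = clean["email"].str.contains("@", na=False)

print(clean)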
4. Data Science
Data science is a field that uses various techniques, including statistics, computing, and algorithms,
to extract knowledge and insights from data. It combines math, programming, analytics, AI, and
machine learning with domain expertise to uncover hidden insights in data. Practitioners apply
machine learning to various types of data to create AI systems that can perform tasks that usually
require human intelligence. These systems provide insights that can be converted into business
value by analysts and business users. The data science life cycle consists of six phases, described
below:
• Business Understanding: The first phase is to define the business problem and understand
the domain and the desired solution.
• Data Collection: The next step is to gather relevant data that can be used to solve the
business problem.
• Data Preparation: This phase involves cleaning and shaping the data for further analysis and
modelling.
• Exploratory Data Analysis: In this phase, data is analyzed using summary statistics and
visualizations to understand patterns and relationships among variables.
• Model Building: After understanding the data, models are built to provide insights and make
predictions.
• Model Deployment and Maintenance: The final phase is to deploy the model to the real
world and monitor and maintain its performance over time.
The data science life cycle is iterative; these steps are repeated until a model that delivers good
results for the business problem is achieved.
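The sketch below maps the later phases of this life cycle onto a minimal scikit-learn workflow. The synthetic dataset and the choice of logistic regression are assumptions made purely for illustration; a real project would substitute its own data, model, and monitoring.

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Data Collection / Preparation: a synthetic stand-in for a cleaned business dataset
X, y = make_classification(n_samples=500, n_features=8, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Exploratory Data Analysis would normally happen here (summary statistics, plots)

# Model Building
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Model Deployment and Maintenance: track a simple performance metric over time
print("Hold-out accuracy:", accuracy_score(y_test, model.predict(X_test)))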
5. DataOps
DataOps is a methodology that aims to improve the efficiency and effectiveness of data-driven
software development by leveraging machine learning techniques. It focuses on creating and
maintaining a central hub for data collection, storage, and distribution. This central hub serves as the
foundation for data management, allowing for agile, automated, and secure data processes.
DataOps is heavily influenced by the principles of Agile, DevOps, and Lean Manufacturing. Agile
methodology can be used to work with big data and make quick business decisions. DevOps
practices can be used to break down silos between development and operations teams, making
software development and deployment faster and more collaborative. Lean manufacturing
strategies can be used to minimize waste and increase efficiency in data pipelines.
By implementing DataOps, data teams can reduce the time spent on finding the right data or
bringing data science models into production. DataOps allows data teams to deploy models
themselves and perform analysis quickly, eliminating dependencies on other teams. Additionally,
DataOps can significantly decrease the time data engineers spend building and maintaining data
pipelines. Overall, DataOps enables data teams to work more efficiently and effectively, resulting in
faster and more accurate decision making.
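One concrete DataOps practice is wiring automated quality checks into every pipeline run so that bad data fails fast instead of reaching analysts. The snippet below is a minimal, framework-free sketch of such a check; the table, columns, and rules are hypothetical.

import pandas as pd

def validate_batch(df: pd.DataFrame) -> list:
    """Return a list of data quality violations found in one pipeline batch."""
    problems = []
    if df["order_id"].duplicated().any():
        problems.append("duplicate order_id values")
    if (df["amount"] < 0).any():
        problems.append("negative order amounts")
    if df["customer_id"].isna().any():
        problems.append("missing customer_id values")
    return problems

# Hypothetical incoming batch with deliberate defects
batch = pd.DataFrame({
    "order_id": [101, 102, 102],
    "customer_id": [1, None, 3],
    "amount": [25.0, -5.0, 40.0],
})

issues = validate_batch(batch)
# In a real pipeline a non-empty list would fail the run and alert the data team
print("Batch rejected:" if issues else "Batch accepted", "; ".join(issues))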
6. Data Visualization
Data visualization is an essential tool for effectively communicating insights and making data more
accessible to others. It is the process of creating graphical or visual representations of data using
various techniques such as charts, tables, maps, infographics, and even videos or GIFs. By using
these techniques, data can be presented in a way that is both meaningful and easy to understand,
even for those who may not be data literate.
There are different types of data that can be visualized, each with its own characteristics and
best practices. Qualitative data, for example, is a measure of ‘types’ and may be represented by a
name, symbol, or number code; it is often used interchangeably with categorical data, which is
expressed in natural language descriptions such as favorite color = orange. Quantitative data, on the
other hand, is a measure of values or counts expressed as numbers on a scale, so these metrics can
be counted, ordered, and aggregated.
For each type of data, there are specific visualization techniques that are best suited to specific
analytic needs. For example, line charts are typically used to track changes or trends over time and
show the relationship among variables, while bar charts are used to compare quantities of different
categories. Additionally, there are other types of visualization like heatmap, scatter plots, and pie
charts which are useful for different types of data and analysis. By understanding the best practices
for visualizing different types of data and the insights that can be gleaned from each type of
visualization, data analysts and professionals can effectively communicate their findings to others.
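As a small illustration of matching chart types to data types, the matplotlib sketch below uses a line chart for a quantitative trend over time and a bar chart for a qualitative (categorical) comparison. The values are invented for the example.

import matplotlib.pyplot as plt

# Quantitative data over time: best shown as a line chart
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
revenue = [120, 132, 128, 140, 150, 158]

# Qualitative (categorical) data: best shown as a bar chart
favorite_color = {"orange": 18, "blue": 25, "green": 12}

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(months, revenue, marker="o")
ax1.set_title("Revenue trend (line chart)")
ax2.bar(list(favorite_color.keys()), list(favorite_color.values()))
ax2.set_title("Favorite color (bar chart)")
plt.tight_layout()
plt.show()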
7. Data Governance
Data governance covers the policies and processes an organization uses to manage its data
responsibly. It typically spans four areas:
• Quality is the process of ensuring that the data used by the organization is accurate,
trustworthy, and complete. This includes implementing procedures for data validation, data
quality checks, and regular data audits.
• Security is the process of protecting the organization’s data from unauthorized access, theft,
and other forms of malicious activity. This includes implementing security protocols such as
encryption and access controls, as well as performing regular security audits and
vulnerability assessments.
• Privacy is the process of protecting sensitive information that the organization may collect
and store. This includes customer financial information, employee records, and other
sensitive data. The organization must ensure that it adheres to relevant privacy laws and
regulations, and that appropriate measures are in place to protect this data from
unauthorized access or misuse.
• Stewardship is the process of ensuring that the organization’s data processes are followed
appropriately. This includes implementing procedures for data management, data archiving,
and data retention. It also includes ensuring that data is used ethically and responsibly, and
that the organization’s data policies and procedures are followed by all employees.
Overall, data governance is a critical aspect of any organization, designed to ensure that data
is used effectively, efficiently, and ethically. It is a continuous process that requires regular review
and updating to keep pace with the organization’s needs and the changing landscape of data
management.
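To make the privacy and stewardship pillars slightly more tangible, here is a minimal sketch of pseudonymizing sensitive fields before a dataset is shared for analysis. The field names, masking rules, and audit message are assumptions made for illustration, not a prescribed governance implementation.

import hashlib
import pandas as pd

# Hypothetical table containing sensitive customer fields
customers = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "email": ["a@example.com", "b@example.com", "c@example.com"],
    "card_number": ["4111111111111111", "5500000000000004", "340000000000009"],
})

def pseudonymize(value: str) -> str:
    """Replace a sensitive value with a stable, non-reversible token."""
    return hashlib.sha256(value.encode()).hexdigest()[:12]

masked = customers.copy()
masked["email"] = masked["email"].map(pseudonymize)               # privacy: hide PII
masked["card_number"] = "****" + masked["card_number"].str[-4:]   # privacy: keep last 4 digits

print(masked)
print("Audit: masked columns ['email', 'card_number'] before sharing")  # stewardship: record what was done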
8. The Data Ecosystem
Finally, the data ecosystem is a comprehensive term that encompasses the various components and
systems that an organization employs to collect, store, process, and analyze data. This includes both
the physical infrastructure, such as servers, storage devices, and network equipment, as well as the
non-physical components, such as data sources, programming languages, code packages, algorithms,
and software.
The physical infrastructure plays a crucial role in the data ecosystem as it provides the necessary
resources for storing and processing large amounts of data. It includes different types of storage
solutions such as traditional on-premises storage, cloud storage, and hybrid storage solutions. The
servers and other hardware components are also an important part of the physical infrastructure, as
they provide the computational power needed to process and analyze data.
The non-physical components of the data ecosystem include data sources, programming languages,
code packages, algorithms, and software that are used to collect, analyze, and make sense of the
data. These components are essential for extracting insights and knowledge from the data. Data
sources can be internal or external and can come in various forms such as structured or unstructured
data. The programming languages, code packages, and algorithms used in the data ecosystem are
important for data cleaning, transformation, and analysis. Software such as data visualization tools
and business intelligence platforms are also used to present the insights and knowledge derived
from the data in a meaningful way.
Each organization's data ecosystem is unique, tailored to their specific needs and requirements, but
there can be some similarities if they use the same data sources or tools. Understanding an
organization's data ecosystem can help identify potential areas for optimization and how all the
components fit together. This knowledge can be used to make strategic decisions and improve the
overall efficiency of the data management process. The topic of ideal infrastructure for data
processing will be discussed in more detail in the next section.
As discussed previously, just as AI is an umbrella term for machine intelligence, Data Science is an
umbrella term for the methods and practices used to gather insights from data. Data scientists use
various statistical techniques to process and transform data sets, and to achieve their goals they
deploy tools such as Tableau, the Python programming language, MATLAB, TensorFlow, statistics,
Natural Language Processing (NLP), and many more.
AI can be divided into three main categories:
• Artificial Narrow Intelligence (ANI): This type of AI is designed to perform specific tasks
efficiently. ANI systems are highly specialized, and their capabilities are limited to specific
areas. They are designed to perform one single task, and they can excel in that task but in a
very controlled environment and with limited parameters. Examples of ANI include Siri,
Alexa, and self-driving cars.
• Artificial General Intelligence (AGI): This type of AI is a theoretical concept that aims to
create machines that have a human-like intelligence across a variety of tasks and abilities.
AGI would be able to understand natural language, recognize objects and images, make
decisions, and solve problems like humans. AGI is considered a more ambitious goal than
ANI, and it is still in the research stage.
• Artificial Super Intelligence (ASI): This is the most advanced theoretical form of Artificial
Intelligence. ASI is the idea that AI will surpass human thinking capability by constantly
adapting and performing multiple tasks at once. This is a more futuristic concept, and
the computational capability required for this type of AI has not yet been reached.
Data Science and Artificial Intelligence are closely interconnected, with data as a common
denominator. Data Engineering, which involves extracting, transforming, and storing data, is the first
step in this process. Artificial Intelligence in Data Science applications uses processed data, so it can
only be used after the raw data has been engineered. The image below illustrates the products that
result from the combination of these three concepts. Machine Learning is a product of Data Science
and Artificial Intelligence, in which a small amount of data is used to predict outcomes.
Image Source - https://www.freshworks.com/freshservice/itsm/preparing-for-ai-in-itsm/
Data science and AI are driving some of the most significant technological advancements of today.
For example, data science is used to enhance the precision of predictive models in healthcare, while
AI is employed to create more realistic virtual assistants and chatbots. Additionally, data science and
AI are utilized to develop new products and services such as personalized recommendations and
autonomous vehicles.
The main obstacle in data science and AI is the vast quantity of data that is being generated
currently. With the proliferation of the internet, social media, and other digital technologies, we are
collecting more data than ever before. Therefore, data scientists and AI researchers must develop
new methods to manage and analyze this data.
Overall, data science and AI are transforming the way we live and work and will continue to shape
our future in exciting ways. As these fields continue to evolve, it will be important to stay informed
and engaged in the discussions around their development and use.
With the advent of cloud-based technologies, organizations are now turning towards modern data
infrastructure solutions. Cloud-based data infrastructure allows organizations to store and process
data in the cloud, which offers several advantages over traditional on-premises systems. The main
advantage is scalability, as organizations can easily scale their data infrastructure to accommodate
the increasing volume of data being generated. Additionally, cloud-based infrastructure is more cost-
effective, as organizations pay only for the resources they use, and it provides better security and
compliance as organizations can leverage the security features provided by the cloud providers.
One of the most significant trends in data infrastructure is the shift towards cloud-native
technologies. Cloud-native technologies are specifically designed for the cloud and include cloud-
native databases, data lakes, and data warehousing solutions. These technologies are built to take
full advantage of the scalability and cost-effectiveness of the cloud and can be more easily
integrated with other cloud-based services such as analytics and machine learning tools. This
enables organizations to store, process, and analyze their data in new ways, and to better meet the
demands of today's data-driven business environment.
Another trend in data infrastructure is the increasing use of containers and Kubernetes. Containers
are a lightweight form of virtualization that allow organizations to package their applications and
dependencies together and run them on any infrastructure. Kubernetes is an open-source container
orchestration system that helps organizations manage and scale their containerized applications. By
using containers and Kubernetes, organizations can easily deploy and scale their data infrastructure
in the cloud while also improving security and compliance. This enables organizations to run their
data infrastructure on-demand, reducing costs and increasing agility.
Edge computing is also becoming a popular trend in data infrastructure. Edge computing involves
processing data closer to where it is being generated, such as at the edge of a network or in a
remote location. This reduces latency and improves the performance of data-intensive applications
such as real-time analytics and IoT. By processing data at the edge, organizations can also improve
the security of their data infrastructure as sensitive data can be processed and stored locally rather
than being sent to a central location.
Moreover, Artificial Intelligence (AI), Machine Learning (ML) and Analytics are becoming an integral
part of data infrastructure. With the help of AI and ML, organizations can extract insights from the
data, which can be used to make better-informed decisions. Analytics provides organizations with
the ability to visualize and analyze data, providing a deeper understanding of the data and enabling
organizations to identify patterns and trends. Cloud-based data infrastructure provides the
necessary computing power and storage capacity to run AI and ML models, making it an ideal
platform for organizations that want to leverage these technologies.
One of the most significant benefits of AIaaS and machine learning for data literacy is the ability to
analyze large amounts of data. With the explosion of data being generated, organizations are faced
with the challenge of making sense of it all. AIaaS can help by providing advanced analytics tools that
can process and analyze large data sets, identify patterns, and make predictions. This can help
organizations gain valuable insights that can inform business decisions and drive revenue growth.
For example, retail companies can use AIaaS to predict customer buying patterns and optimize
inventory accordingly, resulting in cost savings from reduced overstocking and markdown expenses.
Another benefit of AIaaS for data literacy is the ability to automate repetitive tasks. Many businesses
still rely on manual processes to complete tasks such as data entry, data analysis, and report
generation. AIaaS can automate these tasks, freeing up time for employees to focus on more
important tasks. This can help increase efficiency, improve the accuracy of data, and reduce the risk
of human error. By automating these tasks, organizations can save time and money, and increase
productivity. For example, AIaaS can automatically categorize and classify emails, freeing up time for
employees to focus on more important tasks and resulting in cost savings from reduced labor
expenses.
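The example below is a deliberately small sketch of how such email categorization might look using an off-the-shelf scikit-learn classifier rather than any specific AIaaS product; the training emails and categories are invented for illustration.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Tiny hypothetical training set of labeled emails
emails = [
    "Invoice attached for last month's services",
    "Please reset my account password",
    "Your order has shipped and is on the way",
    "Payment overdue, please remit the invoice",
    "I cannot log in to my account",
]
labels = ["billing", "support", "shipping", "billing", "support"]

# Fit a simple text classifier and use it to route a new email automatically
classifier = make_pipeline(TfidfVectorizer(), MultinomialNB())
classifier.fit(emails, labels)
print(classifier.predict(["Has my order shipped yet?"]))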
AIaaS can also help organizations improve customer engagement by providing personalized
experiences. With the help of AI, organizations can analyze customer data to gain insights into their
preferences, behavior, and needs. This can help businesses create personalized content, offers, and
services that are tailored to the specific needs of their customers. By providing personalized
experiences, businesses can increase customer satisfaction, loyalty, and retention, which can result
in cost savings from reduced customer acquisition expenses.
AIaaS can also improve security. With the increasing amount of data being stored and shared, data
security has become a major concern for businesses. AIaaS can help by providing advanced security
features such as anomaly detection, facial recognition, and biometric authentication. This can help
prevent data breaches, and financial losses from cyberattacks, resulting in cost savings.
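As one hedged illustration of anomaly detection, the sketch below uses scikit-learn's IsolationForest to flag unusual login activity; the features, values, and contamination setting are assumptions made for this example.

import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical login records: [hour_of_day, megabytes_downloaded]
logins = np.array([
    [9, 12], [10, 8], [11, 15], [14, 10], [16, 9],
    [3, 900],  # a 3 a.m. login downloading 900 MB stands out
])

detector = IsolationForest(contamination=0.2, random_state=0).fit(logins)
flags = detector.predict(logins)  # -1 marks likely anomalies

for record, flag in zip(logins, flags):
    print(record, "ANOMALY" if flag == -1 else "ok")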
AIaaS is a game-changer for data literacy. It provides businesses with a wide range of benefits that
can help them make better decisions, improve efficiency, drive revenue, automate repetitive tasks,
improve customer engagement, and improve security. By leveraging the capabilities of AIaaS,
organizations can gain valuable insights from their data and make more informed decisions.
Additionally, AIaaS can help businesses save costs by reducing expenses related to inventory, labor,
customer acquisition and security. The ability to access AI technology and capabilities through the
cloud has made it more accessible to small and medium-sized businesses, giving them the ability to
compete with larger organizations while saving costs.
Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform (GCP) are three of the
most prominent and well-established companies in the field of Artificial Intelligence as a Service
(AIaaS). These companies have been instrumental in providing a comprehensive range of AI-based
services to businesses worldwide, thereby playing a significant role in the growth and development
of the AIaaS market. These companies offer a diverse set of tools and technologies such as bots,
Application Programming Interfaces (APIs), and machine learning frameworks, which are designed to
enhance the operations and improve the overall performance of businesses. Furthermore, they
provide fully managed machine learning options, thereby enabling companies to seamlessly
integrate AI into their systems without having to worry about the technical aspects of the
implementation process.
With the increasing demand for AIaaS, other notable technology firms such as Salesforce, Oracle,
and SAP have also entered the market and are offering their own AIaaS solutions. Additionally, there
are a plethora of start-ups that are focusing on specific aspects of AIaaS, and it is a common practice
for larger companies to acquire these smaller companies to add their developed services to their
portfolios and expand their offerings.
Conclusion
The article provides a detailed overview of the relationship between data science and artificial
intelligence (AI), and their importance in the current data-driven world. It explains that AI plays a
crucial role in data science by providing advanced tools for predictive analysis and parameters for
data engineering, enabling data scientists to extract valuable insights from data and make better
decisions. The article also emphasizes the growing importance of data literacy for individuals and
organizations, as it allows them to understand and work with data to make informed decisions,
advance their careers, and navigate the world around them. Investing in data literacy can have a
significant impact on an individual's or organization's bottom line and future.
The article also highlights the trend towards data center modernization, which is the process of
upgrading and updating data center infrastructure and technologies to improve efficiency,
scalability, and security. This can include updating hardware and software, implementing cloud-
native technologies, adopting virtualization, and implementing automation and orchestration tools.
The modernization of data centers is essential for organizations to keep pace with the growing
volume of data and the increasing demand for data-driven insights.
Additionally, the article mentions the trend towards cloud-native technologies, containers,
Kubernetes, and edge computing, which are replacing traditional data infrastructure and enabling
organizations to store, process, and analyze their data in new ways, and to better meet the demands
of today's data-driven business environment. It also explains how AI as a Service (AIaaS) is a
game-changer for data literacy, providing businesses with a wide range of benefits such as the ability
to make better decisions, improve efficiency, drive revenue, automate repetitive tasks, improve
customer engagement, and improve security. By leveraging the capabilities of AIaaS, organizations
can gain valuable insights from their data and make more informed decisions. Furthermore, AIaaS
can help businesses save costs by reducing expenses related to inventory, labor, customer
acquisition, and security. The ability to access AI technology and capabilities through the cloud has
made it more accessible to small and medium-sized businesses, giving them the ability to compete
with larger organizations while saving costs. In conclusion, the article emphasizes the importance of
data literacy, data center modernization, data modernization, and the use of AI and cloud-based
technologies for organizations of all sizes and industries.