
Operationalize AI at Scale with MLOps

Closing the Divide Between Data Scientists and IT

Sponsored by Domino Data Lab, NVIDIA and AMD


Organizations are Expanding AI Deployments

Organizations are proving that artificial intelligence (AI) is valuable. Our research shows that the most common benefits reported by participants include gaining competitive advantage and improving customer experiences. Participants also report being able to respond faster to opportunities and to threats in the market, and an improvement in the bottom line, with increased sales and lower costs.

[Chart: Benefits of Artificial Intelligence. Competitive advantage and customer experience top the list: gained competitive advantage, improved customer experience, increased sales, faster response to opportunities and threats, reduced errors and mistakes, lowered costs (shares range from 5% to 31%).]

It should come as no surprise that nearly three-quarters (72%) of participants plan to increase their use of AI, and none reported that they were planning to decrease usage. Our research shows AI being used in an array of business functions, but it is most prevalent in sales organizations, followed closely by R&D, IT, marketing, finance and product management.

Takeaway: Look for the valuable AI use cases that abound in every industry and functional area in order to reap more of AI's benefits.

But AI Can Be Challenging

While AI provides significant benefits, it is not without its challenges.


The development of accurate AI models requires significant amounts
of data, scalable infrastructure and highly skilled resources. Yet only
slightly more than one-half (55%) of participants in our research indicate
that they are knowledgeable or expert in applying AI as part of their
analytic processes.

Then, once an initial model is developed, it must be deployed into an


operational application to capture its benefits. It must also be maintained
to ensure that it continues to be accurate as data and market conditions
change. One-quarter of our research participants (26%) report one of
these two tasks as their most significant challenge.

Takeaway: AI requires continuous resource commitment and investment to ensure applications consistently deliver expected benefits.

Data Scientists Require Flexibility

Data scientists developing AI models work with a variety


of open-source tools, including R, Python and others, or they
may work with proprietary tools. Some prefer code-first
environments, while others prefer code-free environments.
Either way, they have particular technologies they prefer, leading to a variety of development environments that makes
collaboration, tracking and repeatability challenging.

It’s hard enough to find skilled data science resources;


don’t stifle them by forcing them
to change tools. Enable them to use the tools they prefer,
but do so in a way that provides governance, flexibility
and collaboration throughout the various tasks in the
data science lifecycle. Platforms that provide tracking and
akeaway: Choose platforms that
repeatability not only provide the governance and security
provide flexibility, collaboration and governance
required, but also increase productivity.
to maximize adoption and productivity.

Workloads are Intensive and Variable

Whether evaluating life science research, streamlining customer service or detecting fraud, data science workloads can vary significantly, from feature engineering to model development to model scoring. Peak utilization can be orders of magnitude greater than average utilization. If provisioning these resources is too complex, they may be configured for peak utilization and left idle or underutilized, preventing others from using these valuable resources.

[Chart: workload utilization over the course of a day, ranging from roughly 100K to 600K between 7am and 6pm.]

In particular, model development is a resource-intensive process. Computationally intensive algorithms such as deep learning increase requirements even further. Infrastructure must be optimized to support the iteration necessary to maximize model accuracy. Therefore, organizations should provide scalable, elastic resources that can meet peak utilization but can be reallocated at other times. Anything less could limit the value of the AI models developed.

Takeaway: Provision for peak workloads to accelerate time-to-value.
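The cost of provisioning statically for the peak can be made concrete with a toy calculation. The hourly job counts below are hypothetical, loosely echoing the shape of the utilization chart above; they are not figures from our research.

```python
# Toy illustration of peak vs. average utilization for a bursty data science
# workload. Hourly job counts (in thousands) are hypothetical example values.
hourly_jobs = [20, 40, 80, 600, 550, 120, 60, 500, 580, 90, 40, 20]

peak = max(hourly_jobs)                        # capacity a static cluster must hold
average = sum(hourly_jobs) / len(hourly_jobs)  # typical hourly demand

# Fraction of peak-sized capacity that sits idle on average when the
# cluster is provisioned statically for the peak:
idle_fraction = 1 - average / peak
print(f"peak={peak}K avg={average:.0f}K idle on average={idle_fraction:.1%}")
```

With these example numbers, well over half of a peak-sized cluster sits idle in an average hour, which is the gap that elastic, reallocatable resources are meant to close.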

Purpose-built AI Improves Infrastructure

The intensity of data science workloads and their divergence from other enterprise applications calls for specialized
computing infrastructure. GPUs provide highly parallel processing that accelerates model development and deployment.
Given the volume of data transfers in model training, high-bandwidth memory and fast networking also help speed
training iterations and time to insight. AI development and deployment also both require significant scalability and
modularity to match the variable workloads.

This specialized infrastructure commands a price premium over standard computing infrastructure. The productivity of data science resources is critical: wasting time on low-value activities such as securing or provisioning infrastructure slows innovation to a crawl. Platforms that can share infrastructure among multiple teams and projects can help organizations increase their return on investment.

Takeaway: Recognize the special requirements of AI workloads and enable them with purpose-built infrastructure that supports the consolidated demands of multiple teams across the organization.

MLOps Brings Agility to AI Innovation

Data science projects must integrate with an organization’s IT and applications infrastructure, and organizations must
plan from the outset for the deployment of the models developed by the data science teams across the enterprise.
The discipline required in the deployment requires close interaction between data scientists and an organization’s
IT DevOps (development operations) team to manage frequent updates to applications.

MLOps (machine learning operations) is the data science complement to DevOps. MLOps
platforms orchestrate the collection of artifacts, compute infrastructure, and processes
necessary to deploy and maintain AI-based models. These include the data pipelines
that feed the models, as well as the models themselves. MLOps platforms also
incorporate evaluation of the ongoing accuracy of models, retraining and
redeploying models as necessary.

Takeaway: Use an MLOps platform to plan for close interaction between data science and IT DevOps to increase productivity and deploy a greater number of models into production faster.
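The monitor-evaluate-retrain cycle that MLOps platforms automate can be sketched in a few lines. This is a minimal illustration, not any particular MLOps product's API; the `Model`, `evaluate`, `retrain` and `maintain` names and the 10% accuracy-degradation threshold are all assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class Model:
    version: int
    accuracy: float  # accuracy measured at training time

def evaluate(model: Model, live_accuracy: float) -> bool:
    """Return True if the deployed model still meets its accuracy bar.

    Hypothetical policy: tolerate up to 10% degradation from the
    accuracy the model achieved when it was trained.
    """
    return live_accuracy >= 0.9 * model.accuracy

def retrain(model: Model, retrained_accuracy: float) -> Model:
    """Stand-in for a full training-pipeline run on fresh data."""
    return Model(version=model.version + 1, accuracy=retrained_accuracy)

def maintain(model: Model, live_accuracy: float, retrained_accuracy: float) -> Model:
    """One cycle of the maintenance loop: keep the model or retrain and redeploy."""
    if evaluate(model, live_accuracy):
        return model  # still accurate enough: leave it deployed
    return retrain(model, retrained_accuracy)  # drifted: retrain and redeploy
```

For example, `maintain(Model(version=1, accuracy=0.95), live_accuracy=0.70, retrained_accuracy=0.93)` triggers a retrain and returns a version-2 model, while a live accuracy of 0.94 leaves version 1 deployed. A real platform would run this loop continuously and also version the data pipelines feeding the model, as described above.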

Consider Data Gravity

Given the demanding computational requirements for data science


workloads, organizations should consider data gravity, or the proximity
of the data to purpose-built AI infrastructure. Many organizations
continue to leverage the cloud, but our research indicates that nearly
two-thirds (64%) of data workloads are on-premises.

Location of the data has a big influence on performance and needs to


be tailored to the use case. Cloud is a viable option, so long as the data
is there. Otherwise, a lot of time and money is expended working against
the force of data gravity.

Furthermore, the location of training data may be different from where


models are deployed. In such cases, model development can be separated
from model deployment, for instance, in cloud-based applications.

Takeaway: Minimize data movement to accelerate model development.

Governance and Control Are Critical

AI-based models help perform sophisticated analyses,


processes that can have significant organizational ramifications.
As a result, these processes require governance and control
to ensure compliance with both internal policies and
government regulations. For example, organizations may
need to document credit-granting decisions or ensure
data privacy in model development.

Governance should be built into data science activities


at the start, rather than added after the fact. There are many
artifacts involved in the processes, requiring the work of
many people. An MLOps platform provides enterprises with
a good governance backbone that not only helps ensure
compliance, but also helps improve the efficiency of the
processes, making them more repeatable. In the end, this means both less risk and faster time-to-value.

Takeaway: Plan for and engineer good governance into your AI efforts from the start.

Three Takeaways to Help Drive AI Success

Keys to AI success: purpose-built AI infrastructure, an MLOps platform, and a plan for scale.

AI provides significant value to organizations and is becoming a competitive necessity. Every organization should be developing its AI strategy, and there are many lessons that can be learned from the experiences of others.

First, purpose-built AI infrastructure optimized with an MLOps platform can help increase success, drive utilization of resources and improve time-to-value for data science projects.

Second, an MLOps platform is critical for quickly governing, developing, deploying and monitoring AI models in production applications.

Third, purpose-built AI infrastructure and MLOps platforms need to work together, enabling the technology, processes and agility necessary to support AI at scale.

Sponsored by Domino Data Lab and NVIDIA. Support also provided by AMD’s EPYC Processor Business Division.

Ventana Research Dynamic Insights and Benchmark Research reports can be found at www.ventanaresearch.com.

© 2021 Ventana Research
