MLflow

MLflow provides components for tracking experiments, packaging projects, and deploying models in a standardized manner. It captures key metrics, parameters, and model artifacts for monitoring, while allowing for the organization of ML code and dependencies. Additionally, MLflow supports model deployment across various platforms, ensuring versatility and reproducibility in machine learning workflows.


• MLflow Tracking - Logs key metrics, parameters, models, and other artifacts when running ML code to monitor experiments
• MLflow Projects - Configurable standard format for organizing ML code to ensure consistency and reproducibility
• MLflow Models - Package ML model files with their dependencies so they can be deployed on diverse platforms
The MLflow Tracking component captures:
Parameters, metrics, model files, and output logs.
This includes the input parameters to your model, performance metrics (such as accuracy or loss), model artifacts (such as the trained model files), and any output logs generated during training.
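
As a minimal sketch of what a tracking run might look like (the experiment name, model, and metric below are illustrative, not from the slides):

import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

mlflow.set_experiment("ridge-demo")  # assumed experiment name

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run():
    alpha = 0.5
    model = Ridge(alpha=alpha).fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))

    # Log the input parameter, the performance metric, and the model artifact
    mlflow.log_param("alpha", alpha)
    mlflow.log_metric("mse", mse)
    mlflow.sklearn.log_model(model, "model")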

MLflow Models allow the packaging of:
Models with their libraries, files, and other dependencies.
This packaging enables easy deployment of models in various environments, ensuring that all necessary components are included for the model to run successfully.
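
A hedged sketch of packaging a model this way with mlflow.sklearn.save_model (the model, output path, and dependency list are assumptions):

import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=200).fit(X, y)

# Write an MLflow Model directory containing the MLmodel metadata file, the
# serialized model, and environment files (conda.yaml / requirements.txt)
# describing the dependencies needed to run it elsewhere.
mlflow.sklearn.save_model(
    model,
    path="iris_model",                            # assumed output directory
    pip_requirements=["scikit-learn", "mlflow"],  # illustrative dependency list
)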

MLflow Models can be deployed to multiple platforms and support models from frameworks such as PyTorch, ONNX, and scikit-learn, making MLflow a versatile tool for model deployment across different frameworks.
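
Whatever the underlying flavor, a packaged model can be loaded through the framework-agnostic pyfunc interface for scoring; a small sketch reusing the assumed "iris_model" directory from above:

import mlflow.pyfunc
import pandas as pd

# Load the packaged model without caring which framework produced it
model = mlflow.pyfunc.load_model("iris_model")

# Score a small batch of inputs (feature values are illustrative)
sample = pd.DataFrame([[5.1, 3.5, 1.4, 0.2]])
print(model.predict(sample))

The same packaged model could alternatively be served over HTTP with the mlflow models serve command, or deployed to a managed serving platform.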

The MLflow Tracking UI allows users to track, visualize, and compare different experiment runs, making it easier to analyze the performance of various models and configurations.
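
Beyond the browser UI (launched locally with the mlflow ui command), runs can also be compared programmatically; a rough sketch, assuming the "ridge-demo" experiment name from the earlier example and a recent MLflow version:

import mlflow

# Fetch all runs of an experiment as a pandas DataFrame and rank them by metric
runs = mlflow.search_runs(experiment_names=["ridge-demo"])
best = runs.sort_values("metrics.mse").head(5)
print(best[["run_id", "params.alpha", "metrics.mse"]])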
Summary: MLflow provides components for tracking experiments, packaging reproducible projects, and deploying models in a standardized way.
Key Points
• Tracking – Logs parameters, metrics, and models to monitor runs
• Projects – Standardize ML code organization and dependencies
• Models – Package models and dependencies for deployment
Reflection Questions
1. How could you use the tracking UI to compare model experiments?
2. Why is it useful for MLflow projects to specify software environments?
3. What model deployment platforms does MLflow support?
Challenge Exercises
1. Use MLflow tracking with local model training experiments
2. Containerize an MLflow Project to ensure software consistency
3. Deploy a registered MLflow model and request real-time inferences

• Conda Environment - Specifies the dependencies and software required to recreate the runtime environment
• Entry Points - Define the scripts that can be executed within the project workflow
• Git Repository - A remote repository containing an MLflow project, enabling portability
• mlflow run - Executes an MLflow project locally or from a Git repo with given parameters (see the sketch below)
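
To make these pieces concrete, here is a rough illustration (project name, layout, parameter, and script are assumptions, not part of the slides): an MLproject file declaring a Conda environment and an entry point, followed by a programmatic equivalent of the mlflow run command.

# Contents of an MLproject file at the project root (YAML), shown as a comment:
#
#   name: ridge-demo
#   conda_env: conda.yaml            # recreates the runtime environment
#   entry_points:
#     main:
#       parameters:
#         alpha: {type: float, default: 0.5}
#       command: "python train.py --alpha {alpha}"

import mlflow

# Programmatic equivalent of `mlflow run . -P alpha=0.4`;
# the URI could also point at a Git repository for portability.
submitted = mlflow.projects.run(
    uri=".",                  # local project directory or a Git repo URL
    entry_point="main",
    parameters={"alpha": 0.4},
)
print(submitted.run_id)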
