
Datateam Reading Book Club 2025

The Reading Book Club 2025 focuses on upskilling in AI and related technologies, emphasizing the importance of learning for career growth and innovation. Participants will explore various AI engineering concepts through selected books, including practical techniques for developing AI applications and leveraging large language models. The first meeting will be held to choose a book and outline the club's implementation strategy.


You are invited to the

Reading Book Club 2025


Compulse & 360ia Edition
Upskilling!!!
Why?

● Learning comes before Innovation
● Career Growth
● It’s challenging

Focus!!!
What?
● AI
● New stuff (let's lead, not chase tails)
● Hands-On

The books

Option 1

Recent breakthroughs in AI have not only increased demand for AI products, they've also lowered the barriers to entry for those who want to build AI products. The model-as-a-service approach has transformed AI from an esoteric discipline into a powerful development tool that anyone can use. Everyone, including those with minimal or no prior AI experience, can now leverage AI models to build applications. In this book, author Chip Huyen discusses AI engineering: the process of building applications with readily available foundation models.

The book starts with an overview of AI engineering, explaining how it differs from traditional ML engineering
and discussing the new AI stack. The more AI is used, the more opportunities there are for catastrophic
failures, and therefore, the more important evaluation becomes. This book discusses different approaches
to evaluating open-ended models, including the rapidly growing AI-as-a-judge approach.

AI application developers will discover how to navigate the AI landscape, including models, datasets,
evaluation benchmarks, and the seemingly infinite number of use cases and application patterns. You'll
learn a framework for developing an AI application, starting with simple techniques and progressing toward
more sophisticated methods, and discover how to efficiently deploy these applications.

● Understand what AI engineering is and how it differs from traditional machine learning engineering
● Learn the process for developing an AI application, the challenges at each step, and approaches to address them
● Explore various model adaptation techniques, including prompt engineering, RAG, fine-tuning, agents, and dataset engineering, and understand how and why they work
● Examine the bottlenecks for latency and cost when serving foundation models and learn how to overcome them
● Choose the right model, dataset, evaluation benchmarks, and metrics for your needs

Chip Huyen works to accelerate data analytics on GPUs at Voltron Data. Previously, she was with Snorkel
AI and NVIDIA, founded an AI infrastructure startup, and taught Machine Learning Systems Design at
Stanford. She's the author of the book Designing Machine Learning Systems, an Amazon bestseller in AI.

AI Engineering builds upon and is complementary to Designing Machine Learning Systems (O'Reilly).
Option 2

Large language models (LLMs) have proven themselves to be powerful tools for solving a wide range of tasks, and enterprises have taken note. But transitioning from demos and prototypes to full-fledged applications can be difficult. This book helps close that gap, providing the tools, techniques, and playbooks that practitioners need to build useful products that incorporate the power of language models.

Experienced ML researcher Suhas Pai offers practical advice on harnessing LLMs for your use cases and dealing with commonly observed failure modes. You’ll take a comprehensive deep dive into the ingredients that make up a language model, explore various techniques for customizing them such as fine-tuning, learn about application paradigms like RAG (retrieval-augmented generation) and agents, and more.

● Understand how to prepare datasets for training and fine-tuning
● Develop an intuition about the Transformer architecture and its variants
● Adapt pretrained language models to your own domain and use cases
● Learn effective techniques for fine-tuning, domain adaptation, and inference optimization
● Interface language models with external tools and data and integrate them into an existing software ecosystem

Option 3

Book Description

Throughout this book, you will learn data engineering, supervised fine-tuning, and deployment. The hands-on approach to building the LLM Twin use case will help you implement MLOps components in your own projects. You will also explore cutting-edge advancements in the field, including inference optimization, preference alignment, and real-time data processing, making this a vital resource for those looking to apply LLMs in their projects.

What you will learn

● Implement robust data pipelines and manage LLM training cycles
● Create your own LLM and refine it with the help of hands-on examples
● Get started with LLMOps by diving into core MLOps principles such as orchestrators and prompt monitoring
● Perform supervised fine-tuning and LLM evaluation
● Deploy end-to-end LLM solutions using AWS and other tools
● Design scalable and modular LLM systems
● Learn about RAG applications by building a feature and inference pipeline

Who this book is for

This book is for AI engineers, NLP professionals, and LLM engineers looking to deepen their understanding of LLMs. Basic knowledge of LLMs, the Gen AI landscape, Python, and AWS is recommended. Whether you are new to AI or looking to enhance your skills, this book provides comprehensive guidance on implementing LLMs in real-world scenarios.
What is next?

We'll meet on Friday:

- To decide which book to read.
- To define how we will implement this reading book club.

Link to books
https://drive.google.com/drive/folders/1Dg9Gm93K_WEHK0916jPEJvaPpTS6DOgv
