This repository contains official sample applications and code examples for LiteRT (formerly known as TensorFlow Lite), Google's high-performance on-device machine learning framework.
The samples are organized into two top-level directories, `interpreter_api/` and `compiled_model_api/`, which demonstrate the two API paradigms.
Note: For Generative AI and Large Language Models (LLMs), please refer to the LiteRT-LM repository.
The `compiled_model_api/` folder contains samples that use the LiteRT CompiledModel API, a newer API designed for advanced GPU/NPU acceleration that delivers higher ML and GenAI performance. A minimal Kotlin sketch follows the list below.
- Key Features:
  - Hardware Acceleration: Specialized for GPU/NPU execution.
  - Async Execution: Improved performance for complex pipelines.
  - Buffer Management: Efficient input/output handling.
- Available Samples:
  - NPU AOT: Ahead-of-Time compilation examples.
  - NPU JIT: Just-in-Time compilation examples.
- Platforms: Primarily Android (Kotlin/C++).
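For orientation, the sketch below shows roughly what CompiledModel-based inference looks like in Kotlin, loosely following the LiteRT Android documentation. The package names, asset name, accelerator choice, and tensor sizes are illustrative assumptions; the code in each sample sub-folder is the authoritative reference.

```kotlin
import android.content.Context
import com.google.ai.edge.litert.Accelerator
import com.google.ai.edge.litert.CompiledModel

// Compile the model for the chosen accelerator (JIT), then reuse the
// pre-allocated input/output buffers for each inference call.
fun runCompiledModel(context: Context, inputData: FloatArray): FloatArray {
    val model = CompiledModel.create(
        context.assets,
        "model.tflite",                         // illustrative asset name
        CompiledModel.Options(Accelerator.GPU), // or an NPU accelerator on supported devices
    )

    val inputBuffers = model.createInputBuffers()
    val outputBuffers = model.createOutputBuffers()

    inputBuffers[0].writeFloat(inputData)       // must match the model's input tensor size
    model.run(inputBuffers, outputBuffers)
    return outputBuffers[0].readFloat()         // flattened output tensor
}
```

In practice an app would create the compiled model once and reuse the buffers across inference calls rather than per call as shown here.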
The `interpreter_api/` folder contains the CPU-based samples that use the classic Interpreter API. A minimal Kotlin sketch follows the list below.
- Key Features:
  - Standard `.tflite` model execution.
  - Broad compatibility across all Android/iOS versions.
  - Legacy Task Library usage.
- Available Samples:
  - Image Classification: Recognize objects in images/video.
  - Object Detection: Locate and label multiple objects.
  - Image Segmentation: Separate objects from the background.
  - Audio Classification: Identify audio events.
  - Digit Classification: Handwritten digit recognition (MNIST).
- Platforms: Android (Kotlin/Java), iOS (Swift/Objective-C), Python (Raspberry Pi/Linux).
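For contrast with the CompiledModel sketch above, a bare-bones Interpreter call in Kotlin looks roughly like this. The model file name and tensor shapes are placeholders for a typical image classifier, and the `org.tensorflow.lite` package is the name the classic Interpreter API has historically used; the actual samples wrap this in full camera, audio, and preprocessing pipelines.

```kotlin
import java.io.File
import org.tensorflow.lite.Interpreter

// Placeholder image classifier: input [1, 224, 224, 3], output [1, 1000].
fun classify(modelFile: File): FloatArray {
    val input = Array(1) { Array(224) { Array(224) { FloatArray(3) } } }
    val output = Array(1) { FloatArray(1000) }

    // The Interpreter maps nested Kotlin/Java arrays onto tensor shapes
    // and runs inference synchronously on the CPU.
    Interpreter(modelFile).use { interpreter ->
        interpreter.run(input, output)
    }
    return output[0] // per-class scores
}
```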
- Android: Android Studio (latest stable version).
- iOS: Xcode (latest version).
- Python: Python 3.9+ and `pip install ai-edge-litert`.
- Navigate to the `compiled_model_api/` directory.
- Ensure you have a device with a supported NPU (e.g., a modern Pixel or Samsung device, or one with a MediaTek/Qualcomm chipset).
- Follow the specific setup instructions in the sub-folder to enable the specialized hardware delegates.
- Navigate to the `interpreter_api/` directory.
- Open the project in Android Studio or Xcode.
- Build and run on your device.
- LiteRT Overview: ai.google.dev/edge/litert
- CompiledModel API Guide: LiteRT for Android
- Model Conversion: Convert models to LiteRT
Contributions are welcome!
- Read CONTRIBUTING.md.
- Fork the repo and create a branch.
- Submit a Pull Request.
Apache License 2.0. See LICENSE for details.
Disclaimer: This is a sample repository maintained by Google. It is provided "as is" without warranty of any kind.