Stars
Development repository for the Triton language and compiler
Puzzles for learning Triton; solve them with minimal environment configuration!
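For orientation, here is a minimal vector-add kernel in the style of the Triton tutorials; the block size and function names are illustrative, not taken from either repository above.

```python
import torch
import triton
import triton.language as tl

@triton.jit
def add_kernel(x_ptr, y_ptr, out_ptr, n_elements, BLOCK_SIZE: tl.constexpr):
    # Each program instance handles one block of BLOCK_SIZE elements.
    pid = tl.program_id(axis=0)
    offsets = pid * BLOCK_SIZE + tl.arange(0, BLOCK_SIZE)
    mask = offsets < n_elements  # guard the partial tail block
    x = tl.load(x_ptr + offsets, mask=mask)
    y = tl.load(y_ptr + offsets, mask=mask)
    tl.store(out_ptr + offsets, x + y, mask=mask)

def add(x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
    out = torch.empty_like(x)
    n = out.numel()
    grid = (triton.cdiv(n, 1024),)
    add_kernel[grid](x, y, out, n, BLOCK_SIZE=1024)
    return out
```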
Clean, minimal, accessible reproduction of DeepSeek R1-Zero
Visualizer for neural network, deep learning and machine learning models
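A quick way to open a model in Netron from Python, assuming the netron package's start() helper; the file path is illustrative.

```python
import netron

# Serves the model graph and opens it in a local browser tab.
netron.start("model.onnx")
```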
A generative world for general-purpose robotics & embodied AI learning.
NanoDiffVision explores Differential Attention as a natural evolution of Vision Transformers (ViT), re-implementing and comparing attention mechanisms in compact, resource-efficient models.
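A simplified sketch of the differential-attention idea from the DIFF Transformer paper that NanoDiffVision builds on: compute two softmax attention maps and subtract the second with a learnable weight. This is illustrative rather than NanoDiffVision's actual code, and it uses a single head and a scalar lambda for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DiffAttention(nn.Module):
    def __init__(self, dim: int, head_dim: int = 64, lambda_init: float = 0.8):
        super().__init__()
        # Two query/key pairs share a single value projection.
        self.q = nn.Linear(dim, 2 * head_dim, bias=False)
        self.k = nn.Linear(dim, 2 * head_dim, bias=False)
        self.v = nn.Linear(dim, head_dim, bias=False)
        self.out = nn.Linear(head_dim, dim, bias=False)
        self.lam = nn.Parameter(torch.tensor(lambda_init))
        self.head_dim = head_dim

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (B, N, dim)
        d = self.head_dim
        q1, q2 = self.q(x).split(d, dim=-1)
        k1, k2 = self.k(x).split(d, dim=-1)
        v = self.v(x)
        a1 = F.softmax(q1 @ k1.transpose(-2, -1) / d**0.5, dim=-1)
        a2 = F.softmax(q2 @ k2.transpose(-2, -1) / d**0.5, dim=-1)
        # Differential attention: the subtracted map cancels common-mode noise.
        return self.out((a1 - self.lam * a2) @ v)
```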
Implementation of “DreamDiffusion: Generating High-Quality Images from Brain EEG Signals”
Official Implementations for Paper - MagicQuill: An Intelligent Interactive Image Editing System
Code to train and evaluate Neural Attention Memory Models to obtain universally-applicable memory systems for transformers.
AnchorAttention: Improved attention for long-context training of LLMs
A curated list of the most impressive AI papers
Muon optimizer: roughly +30% sample efficiency with <3% wall-clock overhead
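A rough sketch of the idea Muon is built around: orthogonalize each matrix-shaped momentum update before applying it. This sketch uses the classic cubic Newton-Schulz iteration for clarity; the actual repo uses a tuned quintic variant, and the function names and hyperparameters here are illustrative.

```python
import torch

def orthogonalize(g: torch.Tensor, steps: int = 5) -> torch.Tensor:
    # Scale so all singular values are <= 1, which keeps the iteration convergent.
    x = g / (g.norm() + 1e-7)
    eye = torch.eye(x.shape[-1], device=x.device, dtype=x.dtype)
    for _ in range(steps):
        # Cubic Newton-Schulz step toward the orthogonal polar factor of g.
        x = 0.5 * x @ (3 * eye - x.transpose(-2, -1) @ x)
    return x

def muon_like_step(w: torch.Tensor, grad: torch.Tensor, buf: torch.Tensor,
                   lr: float = 0.02, momentum: float = 0.95) -> None:
    buf.mul_(momentum).add_(grad)      # momentum accumulation
    w.sub_(lr * orthogonalize(buf))    # apply the orthogonalized update
```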
Jobs scraper library for LinkedIn, Indeed, Glassdoor, Google & ZipRecruiter
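A hedged usage example, assuming the python-jobspy package's scrape_jobs entry point that returns a pandas DataFrame; the parameter and column names are from memory and may differ from the current API.

```python
from jobspy import scrape_jobs

jobs = scrape_jobs(
    site_name=["indeed", "linkedin", "glassdoor"],
    search_term="machine learning engineer",
    location="Berlin, Germany",
    results_wanted=20,
)
# Inspect a few of the scraped postings.
print(jobs[["site", "title", "company", "location"]].head())
```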
An extremely fast Python package and project manager, written in Rust.