- Montpellier & Lille, France
- youtube.com/@alexandretl
- @AlexandreTL2
Starred repositories
Ongoing research training transformer models at scale
🌾 OAT: A research-friendly framework for LLM online alignment, including preference learning, reinforcement learning, etc.
Clean, minimal, accessible reproduction of DeepSeek R1-Zero
PyPSA-GB: An open source model of the GB transmission network for simulating future energy scenarios
Training Large Language Models to Reason in a Continuous Latent Space
The official implementation of Tensor ProducT ATTenTion Transformer (T6)
Drawings and calculations of our open source biogas design
Unofficial implementation of Titans, SOTA memory for transformers, in PyTorch
Free library to model physical systems with bond graphs.
A generative world for general-purpose robotics & embodied AI learning.
[ICLR 2025] Official PyTorch Implementation of Gated Delta Networks: Improving Mamba2 with Delta Rule
Code to train and evaluate Neural Attention Memory Models to obtain universally applicable memory systems for transformers.
NanoDiffVision explores Differential Attention as a natural evolution of Vision Transformers (ViT), re-implementing and comparing attention mechanisms in compact, resource-efficient models.
Puzzles for learning Triton; play them with minimal environment configuration!
Development repository for the Triton language and compiler
AnchorAttention: improved attention for long-context LLM training
The official implementation of MARS: Unleashing the Power of Variance Reduction for Training Large Models
A framework for few-shot evaluation of language models.