RetentionLabs
AI Model Architecture / Memory Research & Project Group
Repositories
- theRiverLethe (Public): AI models with adaptive memory management and strategic forgetting, inspired by Greek mythology and neuroscience.
- flash-linear-attention (Public, forked from fla-org/flash-linear-attention): 🚀 Efficient implementations of state-of-the-art linear attention models.
- nested_learning (Public, forked from kmccleary3301/nested_learning): A reproduction of GDM's Nested Learning paper.
- transformers-ttt (Public, forked from huggingface/transformers): 🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal models, for both inference and training.
- mixture_of_recursions (Public, forked from raymin0223/mixture_of_recursions): Mixture-of-Recursions: Learning Dynamic Recursive Depths for Adaptive Token-Level Computation (NeurIPS 2025).
- RetentionEngine (Public): A simple adapter implementation to transform pretrained Transformer-family models into the Titans architecture.