flash-linear-attention
🚀 Efficient implementations of state-of-the-art linear attention models
Author: fla-org
Stars: 3,834
Language: Python
Updated: November 14, 2025
Overview
🚀 Efficient implementations of state-of-the-art linear attention models.
Topics
- large-language-models
- machine-learning-systems
- natural-language-processing
Statistics
- ⭐ Stars: 3,834
- 🍴 Forks: 301
- 📝 Language: Python
- 📜 License: MIT
Getting Started
Visit the GitHub repository (fla-org/flash-linear-attention) for installation instructions and full documentation.
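The library's actual API is documented in the repository itself. Purely as an illustration of the core idea behind the models it implements (and not of fla's real interface), causal linear attention can be sketched in plain NumPy: by applying a positive feature map to queries and keys, attention becomes associative and can be computed in O(N) with a small running state instead of an N×N score matrix. The ELU+1 feature map below is one common choice; all names here are illustrative.

```python
import numpy as np

def phi(x):
    # ELU+1 feature map: strictly positive, a common choice for linear attention
    return np.where(x > 0, x + 1.0, np.exp(x))

def causal_linear_attention(q, k, v, eps=1e-9):
    """O(N) causal linear attention.

    Maintains a running (d_k x d_v) state S = sum_j phi(k_j) v_j^T and a
    running normalizer z = sum_j phi(k_j), so each step costs O(d_k * d_v)
    instead of attending over all previous positions explicitly.
    """
    N, d_k = q.shape
    d_v = v.shape[1]
    qf, kf = phi(q), phi(k)
    S = np.zeros((d_k, d_v))
    z = np.zeros(d_k)
    out = np.empty((N, d_v))
    for i in range(N):
        S += np.outer(kf[i], v[i])          # accumulate key-value state
        z += kf[i]                          # accumulate normalizer
        out[i] = (qf[i] @ S) / (qf[i] @ z + eps)
    return out
```

This recurrence produces the same result as materializing the full causally masked score matrix `phi(q) @ phi(k).T` and normalizing its rows, which is the quadratic-time formulation it replaces.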
This project information was automatically generated from GitHub. Last updated: November 14, 2025