Featured · intermediate · active

flash-linear-attention

🚀 Efficient implementations of state-of-the-art linear attention models

Author: fla-org
Stars: 3,834
Language: Python
Updated: November 14, 2025

Overview

🚀 Efficient implementations of state-of-the-art linear attention models

Topics

  • large-language-models
  • machine-learning-systems
  • natural-language-processing

Statistics

  • ⭐ Stars: 3,834
  • 🍴 Forks: 301
  • 📝 Language: Python
  • 📜 License: MIT

Getting Started

Visit the GitHub repository for installation instructions and documentation.
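The repository provides optimized kernels for the linear attention family; as a rough illustration only (not the library's implementation), the core idea is that dropping the softmax lets causal attention be computed as a recurrence over a fixed-size state, giving O(N) time instead of the O(N²) score matrix. The function names below are hypothetical.

```python
import numpy as np

def linear_attention_recurrent(q, k, v):
    """Causal linear attention via the O(N) state recurrence:
    S_t = S_{t-1} + k_t v_t^T,  o_t = q_t S_t."""
    n, d = q.shape
    state = np.zeros((d, v.shape[1]))  # fixed-size state, independent of n
    out = np.empty_like(v)
    for t in range(n):
        state += np.outer(k[t], v[t])
        out[t] = q[t] @ state
    return out

def linear_attention_quadratic(q, k, v):
    """Equivalent O(N^2) form: causally masked (Q K^T) V, no softmax."""
    n = q.shape[0]
    scores = q @ k.T
    mask = np.tril(np.ones((n, n)))  # causal mask
    return (scores * mask) @ v
```

Because the two forms are algebraically identical, the recurrent version can serve both as a parallel-trainable attention replacement and as a constant-memory decoder at inference time; the real kernels in the repository chunk and fuse this computation on GPU.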


This project information was automatically generated from GitHub. Last updated: 11/14/2025

Related Projects

dots.llm1
⭐ 467 · beginner · active
The official repository of the dots.llm1 base and instruct models proposed by rednote-hilab.
By rednote-hilab
MobiAgent
⭐ 412 · intermediate · active
The Intelligent GUI Agent for Mobile Phones
By IPADS-SAI · Python · Apache-2.0
reader3
⭐ 1,428 · Featured · intermediate · active
Quick illustration of how one can easily read books together with LLMs. It's great and I highly recommend it.
By karpathy · Python