Featured · beginner · active

llama.cpp

LLM inference in C/C++

Author: ggml-org
Stars: 89,565
Language: C++
Updated: November 11, 2025

Overview

License: MIT

Key Features

  • Built on the ggml tensor library for efficient LLM inference on CPU and GPU

Statistics

  • ⭐ Stars: 89,565
  • 🍴 Forks: 13,636
  • 📝 Language: C++
  • 📜 License: MIT

Getting Started

Visit the GitHub repository for installation instructions and documentation.
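As a minimal sketch of that workflow (assuming a Unix-like system with git and CMake installed; exact flags and binary locations can vary between releases, so consult the repository's build documentation), cloning, building, and running inference typically looks like:

```shell
# Clone the repository
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp

# Configure and build with CMake (Release mode for performance)
cmake -B build
cmake --build build --config Release

# Run inference with a model in GGUF format
# (the model path below is a placeholder; download a GGUF model first)
./build/bin/llama-cli -m path/to/model.gguf -p "Hello, world" -n 64
```

Hardware-specific backends (such as CUDA or Metal) are enabled at configure time via CMake options documented in the repository.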


This project information was automatically generated from GitHub. Last updated: November 11, 2025

Related Projects

intermediate · active · ⭐ 412

MobiAgent

The Intelligent GUI Agent for Mobile Phones

By IPADS-SAI
Python · Apache-2.0
Featured · intermediate · active · ⭐ 1,428

reader3

Quick illustration of how one can easily read books together with LLMs. It's great and I highly recommend it.

By karpathy
Python
intermediate · active · ⭐ 162

PairTranslate

A browser extension for side-by-side translation of web pages

By Cookee24
TypeScript · GPL-3.0