The most atomic way to train and inference a GPT in pure, dependency-free C
GPT in a QR Code; the actual most atomic way to train and inference a GPT in pure, dependency-free JS/Python.
Interactive visualization of a minimal GPT implementation with autograd engine.
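For readers who have not seen one, the autograd engine such a visualization animates can be remarkably small. Below is a minimal sketch in the spirit of micrograd; the `Value` class and its method names are assumptions for illustration, not the API of any repo listed here.

```python
# Minimal scalar autograd value (illustrative sketch only; names like
# Value/backward are assumptions, not taken from any repo on this page).
import math

class Value:
    def __init__(self, data, children=()):
        self.data = data            # scalar payload
        self.grad = 0.0             # d(loss)/d(self), filled in by backward()
        self._children = children   # nodes this value was computed from
        self._backward = lambda: None

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad    # d(a+b)/da = 1
            other.grad += out.grad   # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1 - t * t) * out.grad  # d(tanh x)/dx = 1 - tanh^2 x
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, seen = [], set()
        def build(v):
            if v not in seen:
                seen.add(v)
                for c in v._children:
                    build(c)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# Usage: y = tanh(a*b + a); gradients flow back to the leaf values.
a, b = Value(2.0), Value(-3.0)
y = (a * b + a).tanh()
y.backward()
print(a.grad, b.grad)
```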
A complete GPT built from scratch in TypeScript with zero dependencies
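Stripped of embeddings and MLPs, the core of any from-scratch GPT is causal self-attention. Here is a hedged sketch of one attention head on plain Python lists, with toy random q/k/v; this is not code from the TypeScript repo above, just the technique it implements.

```python
# One causal self-attention head on plain Python lists:
# out[t] = softmax(q[t]·k[0..t] / sqrt(d)) · v[0..t]. Illustrative only.
import math, random

random.seed(0)
T, d = 4, 8  # sequence length, head dimension
q = [[random.gauss(0, 1) for _ in range(d)] for _ in range(T)]
k = [[random.gauss(0, 1) for _ in range(d)] for _ in range(T)]
v = [[random.gauss(0, 1) for _ in range(d)] for _ in range(T)]

out = []
for t in range(T):
    # Scores of position t against positions 0..t (the causal mask).
    scores = [sum(q[t][i] * k[s][i] for i in range(d)) / math.sqrt(d)
              for s in range(t + 1)]
    # Numerically stable softmax over the visible positions.
    m = max(scores)
    exps = [math.exp(x - m) for x in scores]
    z = sum(exps)
    weights = [e / z for e in exps]
    # Weighted sum of the value vectors.
    out.append([sum(weights[s] * v[s][i] for s in range(t + 1))
                for i in range(d)])

print(len(out), len(out[0]))  # 4 8
```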
The essence of text diffusion in ~150 lines of pure Python. Inspired by Karpathy's MicroGPT.
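The essence referred to is masked denoising: start from an all-mask sequence and finalize a growing fraction of positions at each step. A toy sketch of that loop follows, with a random stub standing in for the trained denoiser; every name in it is illustrative, not from the repo.

```python
# Skeleton of masked text diffusion: begin fully masked, repeatedly fill
# every masked slot, and keep only a growing fraction of the guesses.
# The "denoiser" is a random stub standing in for a trained model.
import random

random.seed(0)
vocab = list("abcdefgh")
MASK = "_"
T, steps = 16, 8

def denoise(seq):
    # Stub: a real model would return (token, confidence) per position.
    return [(random.choice(vocab), random.random()) if c == MASK else (c, 1.0)
            for c in seq]

seq = [MASK] * T
for step in range(1, steps + 1):
    preds = denoise(seq)
    keep = T * step // steps  # number of positions finalized so far
    # Unmask the most confident predictions; re-mask everything else.
    order = sorted(range(T), key=lambda i: -preds[i][1])
    seq = [MASK] * T
    for i in order[:keep]:
        seq[i] = preds[i][0]
    print("".join(seq))
```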
"Everything else is just for efficiency." — Karpathy's microgpt benchmarked across scalar autograd, NumPy, and PyTorch (RTX 5080)
Composable tiny intelligence from microgpt. Pure Python. Zero dependencies.
microGPT: generating colour palettes randomly
A Swift port of Karpathy’s microgpt. Runs natively on Apple Silicon GPUs via Metal Performance Shaders Graph.
Minimal char-level GPT inspired by @karpathy's microGPT: multi-dataset runner for names, Pokémon, Shakespeare, Paul Graham essays and more. Educational, hackable, no fluff.
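Whatever the dataset, a char-level runner needs the same plumbing: derive the vocabulary from the corpus itself, then map text to integer ids and back. A minimal sketch, where the inline names data is a stand-in for a real dataset file:

```python
# Char-level tokenization: vocabulary from the corpus, text <-> integer ids.
# The inline string is a hypothetical stand-in for e.g. a names dataset.
text = "emma\nolivia\nava\n"

chars = sorted(set(text))            # one vocabulary entry per distinct char
stoi = {c: i for i, c in enumerate(chars)}
itos = {i: c for c, i in stoi.items()}

def encode(s):
    return [stoi[c] for c in s]

def decode(ids):
    return "".join(itos[i] for i in ids)

ids = encode("ava\n")
assert decode(ids) == "ava\n"
print(len(chars), ids)
```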
A conversion of Karpathy's MicroGPT to TypeScript
Minimal GPT in 300 lines of pure Lua — a port of Karpathy's microgpt