We built this library at takara.ai to bring attention mechanisms and transformer layers to Go — in a form that's lightweight, clean, and dependency-free.
We’re proud that every part of this project reflects those goals:
- Pure Go — no external dependencies, built entirely on the Go standard library
- Core support for DotProductAttention and MultiHeadAttention (see the sketch below)
- Full transformer layers with LayerNorm, feed-forward networks, and residual connections
- Designed for edge, embedded, and real-time environments where simplicity and performance matter
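To give a feel for what the attention primitives above involve, here is a minimal sketch of scaled dot-product attention in plain Go using only the standard library. The names and shapes used here (`dotProductAttention`, `[][]float64` matrices) are illustrative assumptions for this example, not the library's actual API.

```go
// Illustrative sketch of scaled dot-product attention in pure Go.
// Names and shapes are assumptions for demonstration, not the library's API.
package main

import (
	"fmt"
	"math"
)

// softmax normalizes a slice in place so its values sum to 1.
func softmax(x []float64) {
	max := x[0]
	for _, v := range x {
		if v > max {
			max = v
		}
	}
	sum := 0.0
	for i, v := range x {
		x[i] = math.Exp(v - max) // subtract max for numerical stability
		sum += x[i]
	}
	for i := range x {
		x[i] /= sum
	}
}

// dotProductAttention computes softmax(Q K^T / sqrt(d_k)) V for
// query, key, and value matrices of shape [seqLen][dim].
func dotProductAttention(q, k, v [][]float64) [][]float64 {
	dk := float64(len(k[0]))
	out := make([][]float64, len(q))
	for i := range q {
		// Scaled similarity between query i and every key.
		scores := make([]float64, len(k))
		for j := range k {
			for d := range q[i] {
				scores[j] += q[i][d] * k[j][d]
			}
			scores[j] /= math.Sqrt(dk)
		}
		softmax(scores)

		// Weighted sum of the value vectors.
		out[i] = make([]float64, len(v[0]))
		for j := range v {
			for d := range v[j] {
				out[i][d] += scores[j] * v[j][d]
			}
		}
	}
	return out
}

func main() {
	q := [][]float64{{1, 0}, {0, 1}}
	k := [][]float64{{1, 0}, {0, 1}}
	v := [][]float64{{1, 2}, {3, 4}}
	fmt.Println(dotProductAttention(q, k, v))
}
```

Multi-head attention and the full transformer layer build on this same core, splitting the projections into several heads and adding LayerNorm, a feed-forward network, and residual connections around it.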
Thank you to everyone who has supported this so far — the stars, forks, and feedback mean a lot.