Article: Back to The Future: Evaluating AI Agents on Predicting Future Events — by vinid and 6 others, Jul 17
Paper: CANINE: Pre-training an Efficient Tokenization-Free Encoder for Language Representation (2103.06874) — published Mar 11, 2021
Paper: Pixels, Patterns, but No Poetry: To See The World like Humans (2507.16863) — published Jul 21
Paper: ZeroSearch: Incentivize the Search Capability of LLMs without Searching (2505.04588) — published May 7
Collection: Gemma 3 QAT — Quantization-Aware Trained (QAT) Gemma 3 checkpoints. The models preserve quality similar to half precision while using 3x less memory. 15 items, updated Jul 10