Falcon-H1 Collection: Falcon-H1 family of hybrid-head language models, including 0.5B, 1.5B, 1.5B-Deep, 3B, 7B, and 34B variants (pretrained and instruction-tuned) • 37 items
Article: Building an Open Ecosystem for Time Series Forecasting: Introducing TimesFM in Hugging Face • By Nutanix and 1 other
Article: The N Implementation Details of RLHF with PPO • By vwxyzjn and 2 others • Oct 24, 2023
Falcon Edge series Collection: A series of powerful, universal, and fine-tunable small language models • 7 items
Paper: Learning Dynamics in Continual Pre-Training for Large Language Models • arXiv:2505.07796
Paper: Generating Physically Stable and Buildable LEGO Designs from Text • arXiv:2505.05469
Paper: LLMs for Engineering: Teaching Models to Design High Powered Rockets • arXiv:2504.19394 • Published Apr 27
Paper: Self-Generated In-Context Examples Improve LLM Agents for Sequential Decision-Making Tasks • arXiv:2505.00234 • Published May 1
Paper: A Survey on Inference Engines for Large Language Models: Perspectives on Optimization and Efficiency • arXiv:2505.01658 • Published May 3