Blending Is All You Need: Cheaper, Better Alternative to Trillion-Parameters LLM
Paper • 2401.02994 • Published • 51
Sleeper Agents: Training Deceptive LLMs that Persist Through Safety Training
Paper • 2401.05566 • Published • 30
TrustLLM: Trustworthiness in Large Language Models
Paper • 2401.05561 • Published • 70
Zero Bubble Pipeline Parallelism
Paper • 2401.10241 • Published • 25
Josh Fourie
JoshFourie
AI & ML interests: None yet
Recent Activity
upvoted a paper 2 days ago: Taming the Titans: A Survey of Efficient LLM Inference Serving
upvoted a paper 2 days ago: Softpick: No Attention Sink, No Massive Activations with Rectified Softmax
Organizations: None yet
Collections: 5
Models: 0 (none public yet)
Datasets: 0 (none public yet)