- In-Context Pretraining: Language Modeling Beyond Document Boundaries
  Paper • 2310.10638 • Published • 29
- Magicoder: Source Code Is All You Need
  Paper • 2312.02120 • Published • 80
- Parameter Efficient Tuning Allows Scalable Personalization of LLMs for Text Entry: A Case Study on Abbreviation Expansion
  Paper • 2312.14327 • Published • 7
- WaveCoder: Widespread And Versatile Enhanced Instruction Tuning with Refined Data Generation
  Paper • 2312.14187 • Published • 50