Improve model card: Add `transformers` library, link paper, include abstract
#1 opened by nielsr (HF Staff)
This PR significantly enhances the model card for Ahma-7B by:
- Adding `library_name: transformers` to the metadata: this ensures the Hugging Face Hub correctly recognizes the model's compatible library, enabling the "how to use" button and providing relevant code snippets for users.
- Linking to the associated research paper: the model card now explicitly references "Scaling Data-Constrained Language Models", which describes the training strategy and research behind the Ahma model. The link is added to the introductory section and updated in the "2-stage pretraining" section for clarity.
- Including the paper abstract: A dedicated "Paper Abstract" section has been added to provide users with immediate context about the research, its motivations, and key findings directly within the model card.
- Removing `inference: false` from the metadata: this tag was contradictory, as the model card provides clear inference usage examples. Removing it clarifies that the model is indeed ready for direct inference.
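The two metadata changes above both touch the model card's YAML front matter. A minimal sketch of what the resulting header could look like after this PR (the `language` and `pipeline_tag` values are illustrative assumptions, not taken from the actual card):

```yaml
---
# Added by this PR: tells the Hub which library the model targets,
# enabling the "how to use" button and transformers code snippets.
library_name: transformers

# Illustrative fields; the real card may list different values.
language:
  - fi
pipeline_tag: text-generation

# Note: `inference: false` was removed by this PR, so that key is
# simply absent, and the Hub's default inference behavior applies.
---
```

Because the front matter is plain YAML at the top of the card's README, changes like these are small, reviewable diffs, which is what makes metadata-only PRs like this one easy to merge.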
RASMUS changed pull request status to merged