You need to agree to share your contact information to access this model
This repository is publicly accessible, but you have to accept the conditions to access its files and content.
This model and associated code are released under the CC-BY-NC-ND 4.0 license and may only be used for non-commercial, academic research purposes with proper attribution. Any commercial use, sale, or other monetization of the FEATHER model and its derivatives, which include models trained on outputs from the FEATHER model or datasets created from the FEATHER model, is prohibited and requires prior approval. Please note that the primary email used to sign up for your Hugging Face account must match your institutional email to receive approval. By downloading the model, you attest that all information (affiliation, research use) is correct and up-to-date. Downloading the model requires prior registration on Hugging Face and agreeing to the terms of use. By downloading this model, you agree not to distribute, publish or reproduce a copy of the model. If another user within your organization wishes to use the FEATHER model, they must register as an individual user and agree to comply with the terms of use. Users may not attempt to re-identify the deidentified data used to develop the underlying model. If you are a commercial entity, please contact the corresponding author.
Model Card
[Paper] | [Github Repo] | [Cite]
What is FEATHER?
FEATHER is a collection of lightweight supervised foundation models that can easily be fine-tuned on consumer-grade GPUs, using orders of magnitude fewer parameters than slide foundation models while achieving competitive performance. It is pretrained on a challenging pan-cancer morphological classification task (PC-108, 108-way classification) on an internal Mass General Brigham (MGB) dataset.
This version provides a pretrained attention-based MIL (ABMIL) model trained on 24K slides from MGB, using the UNI patch feature encoder.
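For readers unfamiliar with ABMIL, the sketch below illustrates the general pattern: per-patch embeddings from a frozen encoder such as UNI are pooled into a slide-level representation via learned (gated) attention, which is then classified. This is only a generic illustration under assumed dimensions (1024-dim patch embeddings, hidden sizes, the gated attention variant); the exact FEATHER architecture and weights are defined in the GitHub repo.
```python
import torch
import torch.nn as nn

class GatedABMIL(nn.Module):
    """Generic gated attention-based MIL head (Ilse et al., 2018 style).
    Illustrative only; dimensions and layer choices are assumptions, not the FEATHER release."""
    def __init__(self, in_dim=1024, hidden_dim=512, attn_dim=256, n_classes=108):
        super().__init__()
        self.proj = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
        self.attn_V = nn.Sequential(nn.Linear(hidden_dim, attn_dim), nn.Tanh())
        self.attn_U = nn.Sequential(nn.Linear(hidden_dim, attn_dim), nn.Sigmoid())
        self.attn_w = nn.Linear(attn_dim, 1)
        self.classifier = nn.Linear(hidden_dim, n_classes)

    def forward(self, patch_feats):
        # patch_feats: (n_patches, in_dim) patch embeddings for one slide
        h = self.proj(patch_feats)                          # (n, hidden_dim)
        a = self.attn_w(self.attn_V(h) * self.attn_U(h))    # (n, 1) gated attention scores
        a = torch.softmax(a, dim=0)                         # normalize attention over patches
        slide_feat = (a * h).sum(dim=0)                     # (hidden_dim,) slide embedding
        return self.classifier(slide_feat), a               # logits and per-patch attention

# Example: one slide with 1,000 patches and 1024-dim embeddings (embedding size assumed)
logits, attn = GatedABMIL()(torch.randn(1000, 1024))
```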
Requesting Access
As mentioned in the gated prompt, you must agree to the outlined terms of use, and the primary email for your Hugging Face account must match your institutional email. If your primary email is a personal email (@gmail/@hotmail/@qq), your request will be denied. To fix this, you can: (1) add your official institutional email to your HF account and confirm the address to verify it, and (2) set your institutional email as the primary email on your HF account. Other reasons an access request may be denied include mistakes in the submitted form, for example: the full name includes abbreviations, the affiliation is not spelled out, the described research use is insufficient, or the email domain is not recognized.
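Once access is granted, the files can be fetched programmatically with the authenticated Hugging Face Hub client, as in the short sketch below. The repository id and checkpoint filename shown are placeholders, not the actual paths in this repository; substitute the values listed in the "Files and versions" tab.
```python
# Minimal download sketch, assuming access has already been approved.
from huggingface_hub import login, hf_hub_download

login()  # authenticate with the HF account that was granted access

ckpt_path = hf_hub_download(
    repo_id="<org>/<feather-repo>",   # placeholder repo id
    filename="<checkpoint-file>",     # placeholder filename
)
print(ckpt_path)
```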
License and Terms of Use
© Mahmood Lab. This repository is released under the CC-BY-NC-ND 4.0 license and may only be used for non-commercial, academic research purposes with proper attribution. Any commercial use, sale, or other monetization of this repository is prohibited and requires prior approval. By downloading any pretrained encoder, you agree to follow the model's respective license.
Contact
For any additional questions or comments, contact Faisal Mahmood ([email protected]), Daniel Shao ([email protected]), or Andrew H. Song ([email protected]).
Cite
If you find our work useful in your research, please cite our paper:
@inproceedings{shao2025do,
title={Do Multiple Instance Learning Models Transfer?},
author={Shao, Daniel and Chen, Richard J and Song, Andrew H and Runevic, Joel and Lu, Ming Y. and Ding, Tong and Mahmood, Faisal},
booktitle={International Conference on Machine Learning},
year={2025},
}