This repository is publicly accessible, but you have to accept the conditions to access its files and content.

This model and associated code are released under the CC-BY-NC-ND 4.0 license and may only be used for non-commercial, academic research purposes with proper attribution. Any commercial use, sale, or other monetization of the FEATHER model and its derivatives, which include models trained on outputs from the FEATHER model or datasets created from the FEATHER model, is prohibited and requires prior approval. Please note that the primary email used to sign up for your Hugging Face account must match your institutional email to receive approval. By downloading the model, you attest that all information (affiliation, research use) is correct and up-to-date. Downloading the model requires prior registration on Hugging Face and agreeing to the terms of use. By downloading this model, you agree not to distribute, publish or reproduce a copy of the model. If another user within your organization wishes to use the FEATHER model, they must register as an individual user and agree to comply with the terms of use. Users may not attempt to re-identify the deidentified data used to develop the underlying model. If you are a commercial entity, please contact the corresponding author.

Model Card

[Paper] | [GitHub Repo] | [Cite]

What is FEATHER?

FEATHER is a collection of lightweight supervised foundation models that can easily be finetuned on consumer-grade GPUs, using orders of magnitude fewer parameters than slide foundation models while achieving competitive performance. It is pretrained on a challenging pan-cancer morphological classification task (PC-108, 108-way classification) using an internal Mass General Brigham (MGB) dataset.

This version provides a pretrained attention-based MIL (ABMIL) model trained on 24K slides from MGB, using the CONCH v1.5 patch feature encoder.
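
For intuition, ABMIL aggregates a bag of patch features into a single slide-level embedding via a learned attention weighting. Below is a minimal sketch of gated attention pooling in the style of Ilse et al. (2018); it is illustrative only, not the exact FEATHER implementation:

import torch
import torch.nn as nn

class GatedAttentionPool(nn.Module):
    # Minimal gated attention-based MIL pooling (illustrative; not the FEATHER source).
    def __init__(self, in_dim=768, hidden_dim=256):
        super().__init__()
        self.V = nn.Linear(in_dim, hidden_dim)   # tanh branch
        self.U = nn.Linear(in_dim, hidden_dim)   # sigmoid gate
        self.w = nn.Linear(hidden_dim, 1)        # scalar attention score per patch

    def forward(self, h):
        # h: (num_patches, in_dim) -- one bag of patch features for one slide
        scores = self.w(torch.tanh(self.V(h)) * torch.sigmoid(self.U(h)))  # (num_patches, 1)
        attn = torch.softmax(scores, dim=0)
        return (attn * h).sum(dim=0)             # slide-level embedding: (in_dim,)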

Requesting Access

As mentioned in the gated prompt, you must agree to the outlined terms of use, and the primary email on your Hugging Face account must match your institutional email. If your primary email is a personal address (@gmail/@hotmail/@qq), your request will be denied. To fix this, you can: (1) add your official institutional email to your HF account and confirm it via the verification email, and (2) set your institutional email as the primary email on your HF account. Access requests may also be denied for other mistakes in the submitted form, for example: the full name includes abbreviations, the affiliation is not spelled out, the described research use is insufficient, or the email domain is not recognized.

Model Usage

FEATHER uses CONCH v1.5 patch features extracted from 512x512-pixel patches at 20x magnification.
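
Concretely, the model consumes one bag of patch embeddings per slide. A minimal sketch of the expected input (the 768-dim feature size is an assumption based on CONCH v1.5; verify against your extraction pipeline):

import torch

# One slide = one bag of patch features (512x512 px patches at 20x magnification).
# CONCH v1.5 embeddings are assumed to be 768-dimensional; num_patches varies per slide.
bag = torch.randn(4096, 768)  # (num_patches, feature_dim)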

Option 1. Load through MIL-Lab

After installing MIL-Lab, pretrained models can be initialized either with a state_dict or with Hugging Face's AutoModel:

# create_model is provided by the MIL-Lab package; see its README for the exact import path

# construct the model from source and load the state dict manually
model = create_model('abmil.base.conch_v15.pc108-24k', num_classes=5)

# or pull pretrained weights from Hugging Face with from_pretrained (AutoModel under the hood)
model = create_model('abmil.base.conch_v15.pc108-24k', from_pretrained=True, num_classes=5)

FEATHER models do not include a classification head; specify num_classes to obtain the output dimension appropriate for your task.
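
A hypothetical end-to-end sketch (the forward-pass signature is an assumption; consult the MIL-Lab documentation for the exact API):

import torch

# Load pretrained FEATHER weights and attach a 2-class head (hypothetical usage).
model = create_model('abmil.base.conch_v15.pc108-24k', from_pretrained=True, num_classes=2)
model.eval()

bag = torch.randn(4096, 768)  # one bag of CONCH v1.5 patch features (assumed 768-dim)
with torch.no_grad():
    logits = model(bag.unsqueeze(0))  # assumes (batch, num_patches, feature_dim) input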

Option 2. Load through Trident

FEATHER-24K with the CONCH v1.5 feature encoder is integrated into Trident.
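
A sketch of how slide processing might be invoked through Trident (the --slide_encoder value for FEATHER is an assumption; see the Trident documentation for the exact encoder name):

# Segment, patch, and extract features in one pass; flag values below are assumptions.
python run_batch_of_slides.py --task all \
    --wsi_dir ./wsis --job_dir ./trident_processed \
    --patch_encoder conch_v15 --patch_size 512 --mag 20 \
    --slide_encoder feather  # encoder name is an assumption; check the Trident docs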

License and Terms of Use

© Mahmood Lab. This repository is released under the CC-BY-NC-ND 4.0 license and may only be used for non-commercial, academic research purposes with proper attribution. Any commercial use, sale, or other monetization of this repository is prohibited and requires prior approval. By downloading any pretrained encoder, you agree to follow the model's respective license.

Contact

For any additional questions or comments, contact Faisal Mahmood ([email protected]), Daniel Shao ([email protected]), or Andrew H. Song ([email protected]).

Cite

If you find our work useful in your research, please cite our paper:

@inproceedings{shao2025do,
    title={Do Multiple Instance Learning Models Transfer?},
    author={Shao, Daniel and Chen, Richard J and Song, Andrew H and Runevic, Joel and Lu, Ming Y. and Ding, Tong and Mahmood, Faisal},
    booktitle={International Conference on Machine Learning},
    year={2025},
}