EXAONE Path LUAD-EGFR Predictor

EGFR mutation classification of LUAD tumors using EXAONE Path 2.0, a pathology foundation model with end-to-end supervision.

Overview

This model serves as a reference for predicting EGFR mutation status from LUAD (lung adenocarcinoma) tumor images. An H&E-stained whole slide image is first processed by the EXAONE Path 2.0 foundation model, which converts it into a set of features. An aggregator then integrates these features into a single slide-level representation, and a linear classifier predicts the EGFR mutation status (wild-type or mutated).
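
Conceptually, inference proceeds in the three stages described above. The following is a minimal sketch of that flow; the names patch_encoder, aggregator, and classifier are hypothetical placeholders, not the actual modules inside EXAONEPathLuadEgfrPredictor.

import torch

# Illustrative only: hypothetical module names standing in for the real components.
def predict_egfr_probability(patches, patch_encoder, aggregator, classifier):
    patch_features = patch_encoder(patches)      # EXAONE Path 2.0 features, one per patch
    slide_feature = aggregator(patch_features)   # integrate into a slide-level representation
    logit = classifier(slide_feature)            # linear head for EGFR status
    return torch.sigmoid(logit)                  # probability of EGFR mutation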

The model achieves an average AUROC of 0.85 on in-house data.

This open-source release aims to demonstrate the effectiveness of EXAONE Path 2.0 on biomarker prediction tasks. We hope this source code serves as a useful reference not only for LUAD EGFR prediction but also for image-based approaches to a range of disease-related problems, including molecular subtyping, tumor subtyping, and mutation prediction.

Setup

pip install -r requirements.txt

Load model and run inference

from predictor import EXAONEPathLuadEgfrPredictor

# Authenticate with your Hugging Face access token and load the pretrained predictor
hf_token = "YOUR_HUGGING_FACE_ACCESS_TOKEN"
model = EXAONEPathLuadEgfrPredictor.from_pretrained("LG-AI-EXAONE/EXAONE-Path-LUAD-EGFR-Predictor", use_auth_token=hf_token)

# Run inference on sample whole slide images; the model returns the EGFR mutation probability
svs_path = "samples/EGFR-mutated.svs"
pos_prob = model(svs_path)
svs_path = "samples/EGFR-wild.svs"
neg_prob = model(svs_path)
print(f"EGFR mutation prob of positive sample: {pos_prob:.2f}")
print(f"EGFR mutation prob of negative sample: {neg_prob:.2f}")

Model Performance Comparison

All scores are AUROC.

| Benchmark | TITAN | PRISM | CHIEF | Prov-GigaPath | UNI2-h | EXAONE Path 1.0 | EXAONE Path EGFR |
|---|---|---|---|---|---|---|---|
| LUAD-EGFR-USA1 | 0.754 | 0.815 | 0.784 | 0.709 | 0.827 | 0.784 | 0.853 |
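
An AUROC like the ones above can be computed on your own labeled slides by scoring each slide with the predictor. This is a minimal sketch, assuming scikit-learn is installed, that model is the predictor loaded earlier, and that labels.csv (columns: path, label, with 1 = mutated) is a file you provide.

import csv
from sklearn.metrics import roc_auc_score

labels, probs = [], []
with open("labels.csv") as f:                    # hypothetical file: slide path, ground-truth label
    for row in csv.DictReader(f):
        labels.append(int(row["label"]))         # 1 = EGFR-mutated, 0 = wild-type
        probs.append(model(row["path"]))         # predicted mutation probability

print(f"AUROC: {roc_auc_score(labels, probs):.3f}")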

Contact

LG AI Research Technical Support: [email protected]

License

Copyright (c) LG AI Research

The model is licensed under EXAONEPath AI Model License Agreement 1.0 - NC.
