license: mit
language:
  - en
base_model:
  - jhu-clsp/ettin-encoder-17m
pipeline_tag: token-classification
tags:
  - token classification
  - hallucination detection
  - retrieval-augmented generation
  - transformers
  - ettin
  - lightweight
datasets:
  - ragtruth
  - KRLabsOrg/rag-bioasq-lettucedetect
library_name: transformers

TinyLettuce (Ettin-17M): Efficient Hallucination Detection


Model Name: tinylettuce-ettin-17m-en

Organization: KRLabsOrg

Github: https://github.com/KRLabsOrg/LettuceDetect

Ettin encoders: https://arxiv.org/pdf/2507.11412

Overview

TinyLettuce is a lightweight token‑classification model that flags unsupported spans in an answer given its context (span aggregation is performed downstream). Built on the 17M Ettin encoder with a token‑classification head, it targets real‑time CPU inference and low‑cost domain fine‑tuning with synthetic data. This variant is trained on synthetic data and the RAGTruth dataset for hallucination detection, and is designed for CPU‑friendly inference and simple deployment.
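The downstream span aggregation mentioned above can be sketched as follows. This is an illustrative reconstruction, not the library's actual implementation: `aggregate_spans` is a hypothetical helper that merges consecutive tokens labelled 1 (hallucinated) into character-level spans using token offsets.

```python
def aggregate_spans(labels, offsets, text):
    """Merge runs of tokens labelled 1 into character-level spans.

    labels:  per-token predictions (0 = supported, 1 = hallucinated)
    offsets: per-token (start, end) character offsets into `text`
    """
    spans, start, end = [], None, None
    for label, (tok_start, tok_end) in zip(labels, offsets):
        if label == 1:
            # Open a new span or extend the current one.
            start = tok_start if start is None else start
            end = tok_end
        elif start is not None:
            spans.append({"start": start, "end": end, "text": text[start:end]})
            start = None
    if start is not None:  # flush a span that runs to the last token
        spans.append({"start": start, "end": end, "text": text[start:end]})
    return spans


answer = "Dose is 3200mg daily."
labels = [0, 0, 1, 1, 0]
offsets = [(0, 4), (5, 7), (8, 14), (15, 20), (20, 21)]
print(aggregate_spans(labels, offsets, answer))
# [{'start': 8, 'end': 20, 'text': '3200mg daily'}]
```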

Model Details

  • Architecture: Ettin encoder (17M) + token‑classification head
  • Task: token classification (0 = supported, 1 = hallucinated)
  • Input format: [CLS] context [SEP] question [SEP] answer [SEP], up to 4096 tokens
  • Language: English; License: MIT
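The input layout above can be sketched as a single string; in practice the tokenizer inserts the special tokens itself, and `build_input` here is a hypothetical illustration of the segment order only.

```python
def build_input(context: str, question: str, answer: str) -> str:
    # Illustrates the segment order described above; real preprocessing
    # lets the tokenizer add [CLS]/[SEP] and truncate to the max length.
    return f"[CLS] {context} [SEP] {question} [SEP] {answer} [SEP]"


text = build_input(
    "Paris is the capital of France.",
    "What is the capital of France?",
    "The capital of France is Lyon.",
)
print(text)
```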

Training Data

  • RAGTruth plus our synthetic data generated with LettuceDetect, with span‑level labels
  • ~20k training samples

Training Procedure

  • Tokenizer: AutoTokenizer; DataCollatorForTokenClassification; label pad −100
  • Max length: 8k; batch size: 16; epochs: 5
  • Optimizer: AdamW (lr 1e‑5, weight_decay 0.01)
  • Hardware: Single A100 80GB
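The −100 label padding above can be sketched like this. It is a simplified illustration (assuming one ignored special-token position at each end of the sequence): PyTorch's cross-entropy loss skips the −100 index by default, so special tokens and padding do not contribute to the loss.

```python
IGNORE = -100  # ignore index used by PyTorch's CrossEntropyLoss

def pad_labels(labels: list[int], max_length: int) -> list[int]:
    # Mask the special-token positions at both ends, then pad out to
    # max_length so ignored positions never contribute to the loss.
    padded = [IGNORE] + labels + [IGNORE]
    padded += [IGNORE] * (max_length - len(padded))
    return padded[:max_length]


print(pad_labels([0, 0, 1, 1, 0], 10))
# [-100, 0, 0, 1, 1, 0, -100, -100, -100, -100]
```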

Results (RAGTruth)

This model is designed primarily for fine-tuning on smaller, domain-specific datasets rather than for general use, though it still performs respectably on RAGTruth.

| Model | Parameters | F1 (%) |
|---|---|---|
| TinyLettuce-17M | 17M | 68.52 |
| LettuceDetect-base (ModernBERT) | 150M | 76.07 |
| LettuceDetect-large (ModernBERT) | 395M | 79.22 |
| Llama-2-13B (RAGTruth FT) | 13B | 78.70 |

Usage

You can use the model with the lettucedetect library.

First install lettucedetect:

```bash
pip install lettucedetect
```

Then use it:

```python
from lettucedetect.models.inference import HallucinationDetector

# Load the tiny but capable model
detector = HallucinationDetector(
    method="transformer",
    model_path="KRLabsOrg/tinylettuce-ettin-17m-en",
)

# Detect hallucinations in a medical context
spans = detector.predict(
    context=[
        "Ibuprofen is an NSAID that reduces inflammation and pain. The typical adult dose is 400-600mg every 6-8 hours, not exceeding 2400mg daily."
    ],
    question="What is the maximum daily dose of ibuprofen?",
    answer="The maximum daily dose of ibuprofen for adults is 3200mg.",
    output_format="spans",
)
print(spans)
# Example output: [{"start": 50, "end": 56, "text": "3200mg"}]
```
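Assuming the returned offsets are relative to the answer string (as the example output suggests), a detected span can be sliced back out for display or highlighting; a small sketch:

```python
# Locate the flagged text in the answer and slice it out by offset.
answer = "The maximum daily dose of ibuprofen for adults is 3200mg."
start = answer.find("3200mg")   # character offset of the flagged text
end = start + len("3200mg")
print(answer[start:end])
# 3200mg
```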

Citing

If you use the model or the tool, please cite the following paper:

```bibtex
@misc{Kovacs:2025,
      title={LettuceDetect: A Hallucination Detection Framework for RAG Applications},
      author={Ádám Kovács and Gábor Recski},
      year={2025},
      eprint={2502.17125},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2502.17125},
}
```