🧠 DistilBERT Prompt Classifier

This is a fine-tuned DistilBERT model that classifies a prompt as either a "user prompt" or a "system prompt". It is useful for distinguishing between roles in conversation-based datasets, such as those used for chatbots, assistants, or instruction tuning.

✨ Model Details

  • Model Name: distilbert-prompt-classifier
  • Developed by: Mayuresh Mane
  • Base Model: distilbert-base-uncased
  • Task: Text Classification (Binary)
  • Labels: 0 = system prompt, 1 = user prompt
  • Language: English
  • License: Apache 2.0
  • Framework: 🤗 Transformers
  • Model size: 67M parameters (F32, Safetensors)

🔗 Model Sources

💡 Uses

✅ Direct Use

You can use this model to classify any single prompt as either a system prompt or a user prompt.
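
For a quick check without writing the inference loop yourself, the Transformers pipeline API should also work. This is a minimal sketch rather than an official quick-start: the label names it prints depend on whether id2label is set in the hosted config, so you may see LABEL_0 / LABEL_1 instead of the names listed above.

from transformers import pipeline

# Minimal sketch: one-line classification via the pipeline API.
# Label names depend on the model's id2label config
# (0 = system prompt, 1 = user prompt per the mapping above).
classifier = pipeline(
    "text-classification",
    model="rushi-shaharao/distilbert-prompt-classifier",
)

print(classifier("Summarize this article in three bullet points."))
# Example output shape: [{'label': 'LABEL_1', 'score': 0.97}]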

🚫 Out-of-Scope Use

  • Not intended for multilingual prompt classification; the model was trained on English text only.
  • May not generalize well to noisy or adversarial text that does not resemble typical prompt formatting.

🧪 How to Use

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch
import torch.nn.functional as F

# Load the fine-tuned classifier and its tokenizer from the Hub
model_name = "rushi-shaharao/distilbert-prompt-classifier"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Tokenize a single prompt
prompt = "You are a helpful assistant."
inputs = tokenizer(prompt, return_tensors="pt", truncation=True, padding=True)

# Run inference without tracking gradients
with torch.no_grad():
    outputs = model(**inputs)
    probs = F.softmax(outputs.logits, dim=1)              # class probabilities
    predicted_class = torch.argmax(probs, dim=1).item()   # 0 or 1

# Map the predicted class id back to a human-readable label
label_map = {0: "system prompt", 1: "user prompt"}
print(f"Predicted: {label_map[predicted_class]} ({probs[0][predicted_class]:.2f} confidence)")