---
language: en
license: mit
tags:
  - cross-encoder
  - text-similarity
  - text-classification
---

# CE-7_512_MSELoss

## Usage

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Load tokenizer
tokenizer = AutoTokenizer.from_pretrained("TrishanuDas/CE-7_512_MSELoss")

# Load model
model = AutoModelForSequenceClassification.from_pretrained("TrishanuDas/CE-7_512_MSELoss")

# Prepare input
inputs = tokenizer("Query", "Document", return_tensors="pt", padding=True, truncation=True)

# Get prediction
with torch.no_grad():
    # Get logits
    outputs = model(**inputs)
    logits = outputs.logits

    # Apply sigmoid to get probabilities
    scores = torch.sigmoid(logits)
```
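
Since the head has a single output (`num_class=1`), `scores` has shape `(batch_size, 1)`. A minimal way to read out the value for this one pair:

```python
# scores has shape (1, 1) here; squeeze the label dimension and extract the Python float
score = scores.squeeze(-1).item()
print(f"Relevance score: {score:.4f}")
```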

## Important Note

The classification head of this model outputs raw logits without a sigmoid activation, so you need to apply the sigmoid function to the logits manually to obtain scores, as shown in the example above.
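
The same pattern extends to reranking several documents against one query by scoring all pairs in a single batch. The sketch below uses placeholder query and document strings, and assumes the `512` in the model name refers to the maximum sequence length:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tokenizer = AutoTokenizer.from_pretrained("TrishanuDas/CE-7_512_MSELoss")
model = AutoModelForSequenceClassification.from_pretrained("TrishanuDas/CE-7_512_MSELoss")
model.eval()

# Placeholder inputs for illustration
query = "How do cross-encoders work?"
documents = [
    "A cross-encoder scores a query and a document jointly in one forward pass.",
    "Bananas are a good source of potassium.",
]

# Tokenize each (query, document) pair as one sequence
inputs = tokenizer(
    [query] * len(documents),
    documents,
    return_tensors="pt",
    padding=True,
    truncation=True,
    max_length=512,  # assumed maximum sequence length
)

with torch.no_grad():
    logits = model(**inputs).logits               # shape: (num_documents, 1)
    scores = torch.sigmoid(logits).squeeze(-1)    # apply sigmoid manually, as noted above

# Rank documents by score, highest first
ranking = sorted(zip(documents, scores.tolist()), key=lambda x: x[1], reverse=True)
for doc, score in ranking:
    print(f"{score:.4f}  {doc}")
```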