# Model Card for lkonle/EMO_Love_gbert

## Model Details

### Model Description

## How to use the model

```python
import pandas as pd
import numpy as np
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load the fine-tuned classification model
model = AutoModelForSequenceClassification.from_pretrained("lkonle/EMO_Love_gbert")
model.eval()

# Load the matching tokenizer and make sure a padding token is defined
# (a no-op if the tokenizer already has one)
tokenizer = AutoTokenizer.from_pretrained("lkonle/EMO_Love_gbert")
tokenizer.add_special_tokens({'pad_token': '[PAD]'})

# Define the input texts (German examples:
# "Paul was very, very happy about his puppy.",
# "Paul was very sad about his breakfast.",
# "Paul was very bored.")
myinput = ["Paul war sehr sehr glücklich über seinen Welpen.",
           "Paul war sehr traurig über sein Frühstück.",
           "Paul hatte große Langeweile."]

# Tokenize, pad the batch to its longest sequence and return PyTorch tensors
inputs = tokenizer(myinput, truncation=True, padding=True, return_tensors="pt")

# Predict (no gradients needed for inference)
with torch.no_grad():
    logits = model(**inputs).logits

# Get the predicted label index for each input
prediction = np.argmax(logits.numpy(), axis=1)

# Store the results in a pandas DataFrame
output = pd.DataFrame()
output["inputs"] = myinput
output["prediction"] = prediction
print(output)
```
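
The predictions above are raw class indices. If label names and confidence scores are needed, they can usually be read from the model config. The sketch below is a minimal continuation of the snippet above (it reuses `logits`, `prediction`, and `output`) and assumes that `id2label` in the lkonle/EMO_Love_gbert config is populated; it may only contain generic `LABEL_0`/`LABEL_1` names if the repository does not define custom labels.

```python
import torch

# Turn logits into class probabilities with a softmax over the label dimension
probs = torch.softmax(logits, dim=1)

# Map predicted indices to label names via the model config
# (note: id2label may only hold generic LABEL_0/LABEL_1 names
# if no custom labels were saved with the model)
output["label"] = [model.config.id2label[int(i)] for i in prediction]
output["confidence"] = probs.max(dim=1).values.numpy()
print(output)
```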