
hyy33 at WASSA 2024 Track 2

This repository presents our scripts and models for Track 2 of the WASSA 2024 workshop at ACL 2024. Pre-trained BERT and DeBERTa models are fine-tuned with the CombinedLoss and FGM adversarial training for Emotion and Empathy Prediction from Conversation Turns.
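The exact CombinedLoss and FGM implementations are in the scripts linked below; as a rough orientation, here is a minimal FGM (Fast Gradient Method) sketch in PyTorch. The epsilon value, the embedding-parameter name, and the training-loop ordering are illustrative assumptions, not the submitted configuration.

```python
import torch

class FGM:
    """Fast Gradient Method: perturb the word-embedding weights along the
    normalized gradient direction, then restore them after the extra backward pass."""

    def __init__(self, model, epsilon=1.0, emb_name="word_embeddings"):
        self.model = model
        self.epsilon = epsilon      # perturbation size (illustrative value)
        self.emb_name = emb_name    # substring identifying the embedding parameters
        self.backup = {}

    def attack(self):
        for name, param in self.model.named_parameters():
            if param.requires_grad and self.emb_name in name and param.grad is not None:
                self.backup[name] = param.data.clone()
                norm = torch.norm(param.grad)
                if norm != 0 and not torch.isnan(norm):
                    param.data.add_(self.epsilon * param.grad / norm)

    def restore(self):
        for name, param in self.model.named_parameters():
            if name in self.backup:
                param.data = self.backup[name]
        self.backup = {}

# Typical use inside a training step:
#   loss = model(**batch).loss
#   loss.backward()                  # gradients on the clean input
#   fgm.attack()                     # perturb the embeddings
#   model(**batch).loss.backward()   # accumulate gradients on the perturbed input
#   fgm.restore()                    # undo the perturbation
#   optimizer.step(); optimizer.zero_grad()
```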

Publication

ACL 2024 Workshop WASSA Shared Task: hyy33 at WASSA 2024 Empathy and Personality Shared Task: Using the CombinedLoss and FGM for Enhancing BERT-based Models in Emotion and Empathy Prediction from Conversation Turns

Scripts

Scripts for fine-tuning BERT and DeBERTa on the downstream classification and regression tasks are available at https://github.com/hyy-33/hyy33-WASSA-2024-Track-2/tree/main.
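For readers unfamiliar with the setup, the sketch below shows a plain regression fine-tune with the transformers Trainer. The base checkpoint, toy data, and hyperparameters are placeholders, and it omits the CombinedLoss and FGM steps that the repository scripts add on top.

```python
from datasets import Dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)

# Base checkpoint and column names are assumptions for illustration only.
base = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(
    base, num_labels=1, problem_type="regression")

# Toy data standing in for the conversation turns and their empathy scores.
data = Dataset.from_dict({
    "text": ["That sounds really difficult.", "I am glad it worked out."],
    "label": [4.5, 2.0],
})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=128)

data = data.map(tokenize, batched=True)

args = TrainingArguments(output_dir="out", num_train_epochs=1,
                         per_device_train_batch_size=2, report_to="none")
Trainer(model=model, args=args, train_dataset=data).train()
```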

Models

These are the fine-tuned models behind our final Track 2 submission. On the test set they achieved Pearson correlations of 0.581 for Emotion, 0.644 for Emotional Polarity, and 0.544 for Empathy, for an average of 0.590, ranking 4th among all teams.
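A minimal sketch of scoring a single conversation turn with one of the fine-tuned checkpoints, assuming a single-output regression head; the checkpoint path is a hypothetical placeholder.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Hypothetical local path; substitute the actual fine-tuned checkpoint.
checkpoint = "path/to/fine-tuned-checkpoint"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)  # regression head (num_labels=1)
model.eval()

turn = "I can't imagine how hard that must have been for them."
inputs = tokenizer(turn, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze(-1).item()
print(f"Predicted score: {score:.3f}")
```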
