---
language:
  - ko
tags:
  - generated_from_trainer
metrics:
  - accuracy
widget:
  - text: 회사는 러시아의 톰스크 지역에 있는 베니어 공장에 기계를 납품하기로 되어 있었다.
    example_title: example01
  - text: >-
      새로운 생산공장으로 인해 회사는 예상되는 수요 증가를 충족시킬 수 있는 능력을 증가시키고 원자재 사용을 개선하여 생산 수익성을 높일
      것이다.
    example_title: example02
  - text: >-
      국제 전자산업 회사인 엘코텍은 탈린 공장에서 수십 명의 직원을 해고했으며, 이전의 해고와는 달리 회사는 사무직 직원 수를 줄였다고
      일간 포스티메스가 보도했다.
    example_title: example03
base_model: cardiffnlp/twitter-xlm-roberta-base-sentiment
model-index:
  - name: ko-finance_news_classifier
    results: []
---

ko-finance_news_classifier

This model is a fine-tuned version of cardiffnlp/twitter-xlm-roberta-base-sentiment on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.4474
  • Accuracy: 0.8423
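
Below is a minimal inference sketch using the standard transformers text-classification pipeline. The Hub repository id is inferred from the model name on this card, and the label names depend on the fine-tuning configuration, so both should be treated as assumptions.

```python
# Minimal sketch: running the fine-tuned classifier with the transformers
# text-classification pipeline. The repository id is assumed from the model
# name on this card; adjust it if the actual Hub path differs.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="Hyeonseo/ko-finance_news_classifier",  # assumed Hub repo id
)

# One of the widget examples from this card.
text = "회사는 러시아의 톰스크 지역에 있는 베니어 공장에 기계를 납품하기로 되어 있었다."
print(classifier(text))
# e.g. [{'label': '...', 'score': 0.97}] -- label names depend on the fine-tuning config
```

The pipeline tokenizes the input with the XLM-RoBERTa tokenizer inherited from the base model and returns the highest-scoring label by default.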

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 20
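
As a sketch, the hyperparameters above correspond roughly to the following TrainingArguments. The original training script and dataset are not published, so this is an approximation rather than the exact recipe; the Adam betas and epsilon listed above are the transformers defaults, and the dataset objects are hypothetical placeholders.

```python
# Approximate reconstruction of the training setup from the hyperparameters
# listed above. Dataset loading and tokenization are omitted.
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    TrainingArguments,
    Trainer,
)

base_model = "cardiffnlp/twitter-xlm-roberta-base-sentiment"
tokenizer = AutoTokenizer.from_pretrained(base_model)
# num_labels would need to match the finance-news label set, which this card
# does not document; the base model's head is used here as a placeholder.
model = AutoModelForSequenceClassification.from_pretrained(base_model)

training_args = TrainingArguments(
    output_dir="ko-finance_news_classifier",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=20,
    seed=42,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",  # assumption: the card reports one eval per epoch
)

# train_dataset and eval_dataset are hypothetical placeholders for the
# unpublished data, so the Trainer call is left commented out:
# trainer = Trainer(model=model, args=training_args, tokenizer=tokenizer,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```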

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 243  | 1.0782          | 0.8010   |
| No log        | 2.0   | 486  | 1.0328          | 0.8381   |
| 0.0766        | 3.0   | 729  | 1.2348          | 0.8330   |
| 0.0766        | 4.0   | 972  | 1.3915          | 0.8052   |
| 0.046         | 5.0   | 1215 | 1.2995          | 0.8474   |
| 0.046         | 6.0   | 1458 | 1.2926          | 0.8361   |
| 0.0512        | 7.0   | 1701 | 1.2889          | 0.8330   |
| 0.0512        | 8.0   | 1944 | 1.3107          | 0.8392   |
| 0.0415        | 9.0   | 2187 | 1.4514          | 0.8309   |
| 0.0415        | 10.0  | 2430 | 1.2869          | 0.8381   |
| 0.0279        | 11.0  | 2673 | 1.2874          | 0.8526   |
| 0.0279        | 12.0  | 2916 | 1.4731          | 0.8423   |
| 0.0126        | 13.0  | 3159 | 1.3956          | 0.8443   |
| 0.0126        | 14.0  | 3402 | 1.4211          | 0.8454   |
| 0.0101        | 15.0  | 3645 | 1.3686          | 0.8474   |
| 0.0101        | 16.0  | 3888 | 1.4412          | 0.8423   |
| 0.0114        | 17.0  | 4131 | 1.4376          | 0.8423   |
| 0.0114        | 18.0  | 4374 | 1.4566          | 0.8423   |
| 0.0055        | 19.0  | 4617 | 1.4439          | 0.8443   |
| 0.0055        | 20.0  | 4860 | 1.4474          | 0.8423   |
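
The accuracy column above reflects the "accuracy" metric declared in the card metadata. A sketch of the compute_metrics hook such a Trainer setup would typically use is shown below; the actual function used for this model is not published, and the choice of the evaluate library (rather than the older datasets.load_metric) is an assumption.

```python
# Sketch of a compute_metrics hook producing the accuracy values reported above.
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair the Trainer passes at evaluation time.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```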

Framework versions

  • Transformers 4.28.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.12.0
  • Tokenizers 0.13.3