---
language: en
license: apache-2.0
tags:
- bart
- seq2seq
- summarization
datasets:
- samsum
widget:
- text: "Jeff: Can I train a \U0001F917 Transformers model on Amazon SageMaker? \n\
Philipp: Sure you can use the new Hugging Face Deep Learning Container. \nJeff:\
\ ok.\nJeff: and how can I get started? \nJeff: where can I find documentation?\
\ \nPhilipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face\n"
model-index:
- name: bart-base-samsum
results:
- task:
type: abstractive-text-summarization
name: Abstractive Text Summarization
dataset:
name: 'SAMSum Corpus: A Human-annotated Dialogue Dataset for Abstractive Summarization'
type: samsum
metrics:
- type: rouge-1
value: 46.6619
name: Validation ROUGE-1
- type: rouge-2
value: 23.3285
name: Validation ROUGE-2
- type: rouge-l
value: 39.4811
name: Validation ROUGE-L
- type: rouge-1
value: 44.9932
name: Test ROUGE-1
- type: rouge-2
value: 21.7286
name: Test ROUGE-2
- type: rouge-l
value: 38.1921
name: Test ROUGE-L
- task:
type: summarization
name: Summarization
dataset:
name: samsum
type: samsum
config: samsum
split: test
metrics:
- type: rouge
value: 45.0148
name: ROUGE-1
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZWNlYWIyNzI4MDg5YTcxNzE2NDg3MTBkZGMzMGFmNjVhNDhiMjdiM2YxODdiMDRhZWYyYTdlY2ZkOTZlMThkNyIsInZlcnNpb24iOjF9.hUpQMm2qHUkBPstp7nldJFNy-9B75Z6zunEQCstfGSxIUYXdIlI9u-o0Y9DHIBr4ZLx_CvBtvR2e0shcFFbUBg
- type: rouge
value: 21.6861
name: ROUGE-2
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiOTAwNjdmM2MwMTcxYjNjMTA4ODk4ZDRhODQ1M2UwN2U2ZjM0MDAyZTJhMTRmMTg0ZThiYThiYTJiN2FiYTk1ZiIsInZlcnNpb24iOjF9._QzKtHvIc_oi1VO-Maxofu-LKINnu9NuAwHmLKka_KwEwrTUZkL74zLa-r4ojKNWpRLRicu02L8W_AQafYoZCw
- type: rouge
value: 38.1728
name: ROUGE-L
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNGU0OTEzZTFhMGExOTkzYTI3NzljYjg2YzAxNDM4YzBhM2NjNjI4NWMxYjUwYmFjYzc5YTcxMGVmMTI3YThmMiIsInZlcnNpb24iOjF9.2JgzUAzdOOxUlt8HOWYa8mQuqyRBdyn-LqPiZI-h72zT8mrEO3sIEmmBOvmW40Gf5rvlErYtq87BgxzNwwYUAA
- type: rouge
value: 41.2794
name: ROUGE-LSUM
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiNjI3ODg4YWQ5MjgwZmZkYTMzMGRjMGI2OWU2MDQ0ZDI3MThkZmYzN2U0OGMwMWJlMjhlMTc5YzgwMDBiM2JiZSIsInZlcnNpb24iOjF9.EnYKG7MuM-lNLkKOrlsb6mB94HqOg9sDBG1mCOni8hi7kM0rveSgSDVLk5Z6Adp-cfdRlho8zK-15TJTHJRxAw
- type: loss
value: 1.597476601600647
name: loss
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiZTBmYjJmZDhiYmJiMTcxODM5M2ZmMTBkZTcwYzM2NDFiMDJjNjJhOGMyNGQ3MGI1Y2UxZTBhNTBiMjFjZGZiNyIsInZlcnNpb24iOjF9.UdOhxHcBJGRM-kz46st_vVQR_-KWr9EtsaQnLvj7YjCzE6JqHA2LPXnDogpUQX96PISJj32XoK7jlj-2z-CGBQ
- type: gen_len
value: 17.6606
name: gen_len
verified: true
verifyToken: eyJhbGciOiJFZERTQSIsInR5cCI6IkpXVCJ9.eyJoYXNoIjoiMWNlM2IyY2EzZGNiOWE0ZGMxZmJmZjhmMDI2YzE1YTQ3NmM3OGQ1NjY2ODllYjI5MDllODNhMjNmMWMyMDAyMiIsInZlcnNpb24iOjF9.sewPQx2WKY8IOBgr0XZkmzOzgwsvJko2iK0noBHpgbyWp41akxWHiaxmvipTOLcx7rbIroXQEr_UgE_LMv46Dw
---
## `bart-base-samsum`
This model was obtained by fine-tuning `facebook/bart-base` on the SAMSum dataset.
## Usage
```python
from transformers import pipeline
# Load the fine-tuned checkpoint as a summarization pipeline
summarizer = pipeline("summarization", model="lidiya/bart-base-samsum")
conversation = '''Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker?
Philipp: Sure you can use the new Hugging Face Deep Learning Container.
Jeff: ok.
Jeff: and how can I get started?
Jeff: where can I find documentation?
Philipp: ok, ok you can find everything here. https://huggingface.co/blog/the-partnership-amazon-sagemaker-and-hugging-face
'''
summarizer(conversation)
```
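The pipeline returns a list of dictionaries, one per input, each containing a `summary_text` field with the generated summary.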
## Training procedure
- Colab notebook: https://colab.research.google.com/drive/1RInRjLLso9E2HG_xjA6j8JO3zXzSCBRF?usp=sharing
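The notebook above contains the full training configuration. As a rough orientation only, a fine-tune of this kind can be set up with `Seq2SeqTrainer` along the following lines; the hyperparameters shown are illustrative assumptions, not the exact values used to produce this checkpoint.

```python
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainingArguments,
    Seq2SeqTrainer,
)

checkpoint = "facebook/bart-base"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

dataset = load_dataset("samsum")

def preprocess(batch):
    # Tokenize dialogues as inputs and summaries as labels
    model_inputs = tokenizer(batch["dialogue"], max_length=512, truncation=True)
    labels = tokenizer(text_target=batch["summary"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset.map(
    preprocess, batched=True, remove_columns=dataset["train"].column_names
)

training_args = Seq2SeqTrainingArguments(
    output_dir="bart-base-samsum",
    num_train_epochs=3,              # assumption, not the checkpoint's value
    per_device_train_batch_size=8,   # assumption
    learning_rate=5e-5,              # assumption
    predict_with_generate=True,
    fp16=True,                       # mixed precision; requires a GPU
)

trainer = Seq2SeqTrainer(
    model=model,
    args=training_args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    tokenizer=tokenizer,
)
trainer.train()
```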
## Results
| Metric | Value |
| --- | ----- |
| eval_rouge1 | 46.6619 |
| eval_rouge2 | 23.3285 |
| eval_rougeL | 39.4811 |
| eval_rougeLsum | 43.0482 |
| test_rouge1 | 44.9932 |
| test_rouge2 | 21.7286 |
| test_rougeL | 38.1921 |
| test_rougeLsum | 41.2672 |
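
The test-set ROUGE scores above can be recomputed with the `evaluate` library. The sketch below is an illustrative example, not the exact evaluation script used for these numbers; the batch size and truncation settings are assumptions.

```python
import evaluate
from datasets import load_dataset
from transformers import pipeline

summarizer = pipeline("summarization", model="lidiya/bart-base-samsum")
rouge = evaluate.load("rouge")  # requires the rouge_score package

test = load_dataset("samsum", split="test")
predictions = [
    out["summary_text"]
    for out in summarizer(test["dialogue"], batch_size=16, truncation=True)
]
scores = rouge.compute(predictions=predictions, references=test["summary"])
# Scores are returned in the 0-1 range; multiply by 100 to compare with the table above.
print(scores)
```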