tomaarsen (HF staff) committed
Commit f8169f2
1 parent: 9852076

Upload model

Files changed (2)
  1. README.md +32 -40
  2. pytorch_model.bin +1 -1
README.md CHANGED
@@ -16,40 +16,32 @@ metrics:
  - recall
  - f1
  widget:
- - text: 'The entourage was the largest ever to accompany an ROC president abroad,
- and included: Chuang Ming - yao -LRB- secretary - general, National Security Council
- -RRB-, Tien Hung - mao -LRB- minister of foreign affairs -RRB-, Lin Hsin - yi
- -LRB- minister of economic affairs -RRB-, Chen Po - chih -LRB- chairman, Council
- of Economic Planning and Development -RRB-, Chen Hsi - huang -LRB- chairman, Council
- of Agriculture -RRB-, Chung Chin -LRB- head of the Government Information Office
- -RRB-, Jeffrey Koo -LRB- chairman of the National Association of Industry and
- Commerce -RRB-, Wang Yu - tseng -LRB- chairman of the General Chamber of Commerce
- of the ROC -RRB-, and Lin Kun - chung -LRB- chairman of the Chinese National Federation
- of Industries -RRB-.'
- - text: During the period, IPC monopolized oil exploration inside the Red Line; excluding
- Saudi Arabia and Bahrain, where ARAMCO (formed in 1944 by renaming of the Saudi
- subsidiary of Standard Oil of California (Socal)) and Bahrain Petroleum Company
- (BAPCO) respectively held controlling position.
- - text: In the early decades of the 20th century, Benoytosh Bhattacharya – an expert
- on Tantra and the then director of the Oriental Institute of Baroda – studied
- various texts such as the Buddhist "Sadhanamala "(1156CE), the Hindu "Chhinnamastakalpa
- "(uncertain date), and the "Tantrasara "by Krishnananda Agamavagisha (late 16th
- century).
- - text: A united opposition of fourteen political parties organized into the National
- Opposition Union (Unión Nacional Oppositora, UNO) with the support of the United
- States National Endowment for Democracy.
- - text: Lockheed said the U.S. Navy may also buy an additional 340 trainer aircraft
- to replace its T34C trainers made by the Beech Aircraft Corp. unit of Raytheon
- Corp.
+ - text: In 2005, Shankel signed with Warner Chappell Music and while pursuing his
+ own projects created another joint venture, Shankel Songs and signed Ben Glover,
+ "Billboard "'s Christian writer of the Year, 2010, Joy Williams of The Civil Wars,
+ and, whom he also produced.
+ - text: In 2002, Rodríguez moved to Mississippi and to the NASA Stennis Space Center
+ as the Director of Center Operations and as a member of the Senior Executive Service
+ where he managed facility construction, security and other programs for 4,500
+ Stennis personnel.
+ - text: American Motors included Chinese officials as part of the negotiations establishing
+ Beijing Jeep (now Beijing Benz).
+ - text: La Señora () is a popular Spanish television period drama series set in the
+ 1920s, produced by Diagonal TV for Televisión Española that was broadcast on La
+ 1 of Televisión Española from 2008 to 2010.
+ - text: 'Not only did the Hungarian Ministry of Foreign Affairs approve Radio Free
+ Europe''s new location, but the Ministry of Telecommunications did something even
+ more amazing: "They found us four phone lines in central Budapest," says Geza
+ Szocs, a Radio Free Europe correspondent who helped organize the Budapest location.'
  pipeline_tag: token-classification
  co2_eq_emissions:
- emissions: 67.50149039261815
+ emissions: 67.93561835707102
  source: codecarbon
  training_type: fine-tuning
  on_cloud: false
  cpu_model: 13th Gen Intel(R) Core(TM) i7-13700K
  ram_total_size: 31.777088165283203
- hours_used: 0.629
+ hours_used: 0.52
  hardware_used: 1 x NVIDIA GeForce RTX 3090
  base_model: prajjwal1/bert-small
  model-index:
@@ -65,13 +57,13 @@ model-index:
  split: test
  metrics:
  - type: f1
- value: 0.7438057260629957
+ value: 0.7547025470254703
  name: F1
  - type: precision
- value: 0.7474561008554705
+ value: 0.7617641715116279
  name: Precision
  - type: recall
- value: 0.7401908328874621
+ value: 0.7477706438380596
  name: Recall
  ---
 
@@ -105,8 +97,8 @@ This is a [SpanMarker](https://github.com/tomaarsen/SpanMarkerNER) model trained
  ### Metrics
  | Label | Precision | Recall | F1 |
  |:--------|:----------|:-------|:-------|
- | **all** | 0.7475 | 0.7402 | 0.7438 |
- | ORG | 0.7475 | 0.7402 | 0.7438 |
+ | **all** | 0.7618 | 0.7478 | 0.7547 |
+ | ORG | 0.7618 | 0.7478 | 0.7547 |
 
  ## Uses
 
@@ -118,7 +110,7 @@ from span_marker import SpanMarkerModel
  # Download from the 🤗 Hub
  model = SpanMarkerModel.from_pretrained("tomaarsen/span-marker-bert-small-orgs")
  # Run inference
- entities = model.predict("Lockheed said the U.S. Navy may also buy an additional 340 trainer aircraft to replace its T34C trainers made by the Beech Aircraft Corp. unit of Raytheon Corp.")
+ entities = model.predict("American Motors included Chinese officials as part of the negotiations establishing Beijing Jeep (now Beijing Benz).")
  ```
 
  ### Downstream Use
@@ -173,7 +165,7 @@ trainer.save_model("tomaarsen/span-marker-bert-small-orgs-finetuned")
  | Entities per sentence | 0 | 0.7865 | 39 |
 
  ### Training Hyperparameters
- - learning_rate: 5e-05
+ - learning_rate: 0.0001
  - train_batch_size: 128
  - eval_batch_size: 128
  - seed: 42
@@ -185,16 +177,16 @@ trainer.save_model("tomaarsen/span-marker-bert-small-orgs-finetuned")
  ### Training Results
  | Epoch | Step | Validation Loss | Validation Precision | Validation Recall | Validation F1 | Validation Accuracy |
  |:------:|:----:|:---------------:|:--------------------:|:-----------------:|:-------------:|:-------------------:|
- | 0.5720 | 600 | 0.0085 | 0.7230 | 0.6552 | 0.6874 | 0.9641 |
- | 1.1439 | 1200 | 0.0078 | 0.7324 | 0.7021 | 0.7169 | 0.9663 |
- | 1.7159 | 1800 | 0.0074 | 0.7499 | 0.7213 | 0.7353 | 0.9679 |
- | 2.2879 | 2400 | 0.0074 | 0.7611 | 0.7318 | 0.7462 | 0.9701 |
- | 2.8599 | 3000 | 0.0072 | 0.772 | 0.7268 | 0.7487 | 0.9700 |
+ | 0.5720 | 600 | 0.0076 | 0.7642 | 0.6630 | 0.7100 | 0.9656 |
+ | 1.1439 | 1200 | 0.0070 | 0.7705 | 0.7139 | 0.7411 | 0.9699 |
+ | 1.7159 | 1800 | 0.0067 | 0.7837 | 0.7231 | 0.7522 | 0.9709 |
+ | 2.2879 | 2400 | 0.0070 | 0.7768 | 0.7517 | 0.7640 | 0.9725 |
+ | 2.8599 | 3000 | 0.0068 | 0.7877 | 0.7374 | 0.7617 | 0.9718 |
 
  ### Environmental Impact
  Carbon emissions were measured using [CodeCarbon](https://github.com/mlco2/codecarbon).
  - **Carbon Emitted**: 0.068 kg of CO2
- - **Hours Used**: 0.629 hours
+ - **Hours Used**: 0.52 hours
 
  ### Training Hardware
  - **On Cloud**: No
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:0b2746c1382c0ffeeeefb0de829f38c75ba3ddf9e5de8ad14277e0c443470c2e
+ oid sha256:b4daa10ec601a3d8f804bad331c8c9b0b90846ea0e8bc44779c1b3405b163306
  size 115096015
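As a quick sanity check on the updated metadata (a sketch, not part of the commit): the new headline F1 should be the harmonic mean of the new precision and recall, and the `emissions` field — which the Hub metadata records in grams — should be consistent with the 0.068 kg figure in the card body.

```python
# Consistency checks on the values in the README diff above.
# The harmonic-mean identity F1 = 2PR / (P + R) is the standard
# definition of F1, not something stated in the commit itself.

precision = 0.7617641715116279  # new test-set precision from the diff
recall = 0.7477706438380596     # new test-set recall from the diff

f1 = 2 * precision * recall / (precision + recall)
assert abs(f1 - 0.7547025470254703) < 1e-6  # matches the new f1 value

# co2_eq_emissions.emissions appears to be in grams (Hub convention),
# so 67.9356 g should round to the 0.068 kg quoted in the card body.
emissions_g = 67.93561835707102
assert round(emissions_g / 1000, 3) == 0.068

print(f"F1 = {f1:.4f}")  # 0.7547, as reported in the metrics table
```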