devmanpreet committed on
Commit eea6b6f · verified · 1 Parent(s): 9e43320

Update README.md

Files changed (1)
  1. README.md +36 -3
README.md CHANGED
@@ -1,3 +1,36 @@
- ---
- license: apache-2.0
- ---
+ ---
+ license: apache-2.0
+ datasets:
+ - pietrolesci/pubmed-200k-rct
+ metrics:
+ - accuracy
+ base_model:
+ - openai-community/gpt2
+ tags:
+ - medical
+ - biology
+ - research
+ - pubmed
+ ---
+
+ # MedGPT — GPT-2 Fine-Tuned on PubMed RCT
+
+ MedGPT is a GPT-2 model fine-tuned on the `pubmed-200k-rct` dataset. It classifies individual sentences from biomedical abstracts into one of five standard sections:
+
+ - Background
+ - Objective
+ - Methods
+ - Results
+ - Conclusion
+
+ This model is useful for tasks requiring structured understanding or summarization of scientific literature.
+
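+ ## Usage
+
+ The snippet below is a minimal inference sketch, not an official example: it assumes the checkpoint was exported with a sequence-classification head (loadable via `AutoModelForSequenceClassification`) and that `MODEL_ID` is replaced with this repository's actual Hub ID.
+
+ ```python
+ import torch
+ from transformers import AutoTokenizer, AutoModelForSequenceClassification
+
+ MODEL_ID = "<this-repo-id>"  # placeholder: substitute the model's Hub repo ID
+
+ tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
+ model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
+ model.eval()
+
+ # Classify one abstract sentence into one of the five section labels.
+ sentence = "Patients were randomly assigned to receive either the drug or a placebo."
+ inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
+
+ with torch.no_grad():
+     logits = model(**inputs).logits
+
+ pred = int(logits.argmax(dim=-1))
+ print(model.config.id2label[pred])  # e.g. "Methods", if the label map was saved with the config
+ ```
+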
+ ## Training Details
+
+ - Base Model: `gpt2` (124M parameters)
+ - Dataset: `pietrolesci/pubmed-200k-rct`
+ - Task: Sentence classification
+ - Labels: Background, Objective, Methods, Results, Conclusion
+ - Epochs: 1 (partial training)
+ - Loss Function: CrossEntropy
+ - Optimizer: AdamW