Fine-tuned T5 and LongT5 models with LoRA on PubMed articles from the scientific_papers dataset (a minimal training sketch follows below).
Daniel Solomon
dsolomon
AI & ML interests: none yet
Organizations: none yet
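The adapters listed under Models were produced along the lines described above. The following is a minimal sketch of such a setup, not the exact recipe used here: it assumes PEFT's LoraConfig with rank 4 (read from the "r4" in the repo names), q/v attention projections as target modules (an assumption), the "pubmed" configuration of scientific_papers with articles as inputs and abstracts as targets, and 512-token inputs / 128-token outputs (read from the "i512-o128" suffix). Hyperparameters are illustrative.

```python
# Minimal LoRA fine-tuning sketch for T5 on scientific_papers/pubmed.
# Assumptions (not confirmed by the repos): rank-4 LoRA, q/v target modules,
# 512-token articles in, 128-token abstracts out, illustrative hyperparameters.
from datasets import load_dataset
from peft import LoraConfig, TaskType, get_peft_model
from transformers import (AutoModelForSeq2SeqLM, AutoTokenizer,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)

base_id = "t5-small"                                  # one of the bases listed below
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForSeq2SeqLM.from_pretrained(base_id)

lora = LoraConfig(task_type=TaskType.SEQ_2_SEQ_LM, r=4, lora_alpha=16,
                  lora_dropout=0.05, target_modules=["q", "v"])
model = get_peft_model(model, lora)                   # only the LoRA weights are trainable

ds = load_dataset("scientific_papers", "pubmed")      # columns: article, abstract, section_names

def preprocess(batch):
    # Truncate articles to 512 input tokens and abstracts to 128 label tokens.
    enc = tokenizer(batch["article"], max_length=512, truncation=True)
    labels = tokenizer(text_target=batch["abstract"], max_length=128, truncation=True)
    enc["labels"] = labels["input_ids"]
    return enc

tokenized = ds.map(preprocess, batched=True,
                   remove_columns=ds["train"].column_names)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(output_dir="t5-small-pubmed-LoRA-r4-i512-o128",
                                  per_device_train_batch_size=8,
                                  learning_rate=1e-3,
                                  num_train_epochs=1),
    train_dataset=tokenized["train"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
model.save_pretrained("t5-small-pubmed-LoRA-r4-i512-o128")  # saves the adapter weights only
```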
Models (12)
dsolomon/long-t5-global-pubmed-LoRA-r4-i1024-o128
dsolomon/long-t5-global-pubmed-LoRA-r4-i512-o128
dsolomon/long-t5-local-pubmed-LoRA-r4-i1024-o128
dsolomon/t5-base-pubmed-LoRA-r4-i1024-o128
dsolomon/t5-base-pubmed-LoRA-r4-i512-o128
dsolomon/t5-small-pubmed-LoRA-r4-i512-o128
dsolomon/t5-small-pubmed-LoRA-r4-i1024-o128
dsolomon/long-t5-base-tglobal-LORA-pubmed-r4-1000-100
dsolomon/t5long-tglobal-base-safetensors (Text Generation, 0.2B params)
dsolomon/long-t5-base-global-pubmed-r4
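A minimal usage sketch for one of these adapters. It assumes each repo stores a PEFT LoRA adapter rather than merged full weights, that google/long-t5-tglobal-base is the matching base checkpoint, and that the "i1024-o128" suffix means 1024 input tokens and 128 output tokens; none of this is confirmed by the listing itself.

```python
# Minimal inference sketch: load a LoRA adapter on top of an assumed base
# checkpoint and summarize a PubMed article.
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

base_id = "google/long-t5-tglobal-base"               # assumed base checkpoint
adapter_id = "dsolomon/long-t5-global-pubmed-LoRA-r4-i1024-o128"

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = PeftModel.from_pretrained(AutoModelForSeq2SeqLM.from_pretrained(base_id),
                                  adapter_id)

article = "..."  # body of a PubMed article from scientific_papers
inputs = tokenizer(article, max_length=1024, truncation=True, return_tensors="pt")
summary_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```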
Datasets (0): none public yet