---
license: apache-2.0
language:
- en
---
## Model weights for the Parallel RoBERTa-Large model
To use this model, you need the custom [paf_modeling_roberta.py](https://github.com/luffycodes/Parallel-Transformers-Pytorch/blob/main/paf_modeling_roberta.py) file from the Parallel-Transformers-Pytorch repository, which implements the parallel attention and feed-forward design.
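A minimal loading sketch is shown below. Since the card does not include one, two assumptions are made: that `paf_modeling_roberta.py` defines a drop-in `RobertaForMaskedLM` class with the standard `from_pretrained` interface, and that the repository id in the placeholder below points at this model; check the linked file for the class names it actually defines.
```
# Minimal usage sketch (assumptions noted below; not shown in the card itself).
from transformers import RobertaTokenizer

# Assumption: paf_modeling_roberta.py has been downloaded from the repository
# linked above and is on the Python path, and it exposes a drop-in
# RobertaForMaskedLM class.
from paf_modeling_roberta import RobertaForMaskedLM

# Placeholder: substitute the id of this model repository.
model_id = "luffycodes/parallel-roberta-large"

tokenizer = RobertaTokenizer.from_pretrained("roberta-large")
model = RobertaForMaskedLM.from_pretrained(model_id)

inputs = tokenizer("Parallel attention and feed-forward <mask>.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)
```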
If you use this work, please cite [Investigating the Role of Feed-Forward Networks in Transformers Using Parallel Attention and Feed-Forward Net Design](https://arxiv.org/abs/2305.13297):
```
@misc{sonkar2023investigating,
  title={Investigating the Role of Feed-Forward Networks in Transformers Using Parallel Attention and Feed-Forward Net Design},
  author={Shashank Sonkar and Richard G. Baraniuk},
  year={2023},
  eprint={2305.13297},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```