---
license: mit
---
The tokenizer, paper, and detailed README are still in progress. In the meantime, the model can be loaded as follows:
```python
from transformers import AutoModelForCausalLM

# trust_remote_code=True is required because the checkpoint ships custom model code.
model = AutoModelForCausalLM.from_pretrained("aria-project/medium-e75-base-padded", trust_remote_code=True)
```
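Since the tokenizer is not yet released, here is a minimal sketch of running a forward pass with placeholder token IDs. The IDs below are arbitrary illustrations, and the sketch assumes the custom model follows the standard transformers causal-LM interface (accepting `input_ids` and returning `logits`):

```python
import torch

# Placeholder token IDs -- arbitrary values, since the official tokenizer
# is not yet available. Replace with real IDs once the tokenizer is released.
input_ids = torch.tensor([[1, 42, 7, 99]])

with torch.no_grad():
    outputs = model(input_ids=input_ids)

# Logits over the vocabulary for each position in the sequence.
print(outputs.logits.shape)
```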
Can you guess what it is? ;)