Update README.md
README.md (changed):

````diff
@@ -12,6 +12,9 @@ experimental seq2seq with EncoderDecoderModel. You will need to patch `modeling_
 > [!WARNING]
 > WIP + output of this model is gibberish bc cross attn needs training
 
+uses different configuration token ids than the [first one](https://huggingface.co/pszemraj/ModernBERT2Olmo-large_1b-test) + uses [olmo-1-b-0724](https://huggingface.co/allenai/OLMo-1B-0724-hf) for decoder
+
+
 ```py
 from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
 
````
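For context, the snippet the hunk touches begins with the `AutoTokenizer` / `AutoModelForSeq2SeqLM` import shown in the diff. The sketch below is a guess at how such a snippet is typically used, not the rest of the committed README: the repo id is a placeholder (the hunk only names the "first" test checkpoint it is contrasted with), and the specific config fields and the `generate` call are assumptions based on the added note about token ids and the warning about untrained cross-attention. Actually loading the checkpoint may also require the `modeling_…` patch mentioned in the hunk header.

```py
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder repo id: the hunk does not name the checkpoint this commit belongs to,
# only the "first one" (pszemraj/ModernBERT2Olmo-large_1b-test) it is contrasted with.
model_name = "pszemraj/ModernBERT2Olmo-second-test"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# The added README line says this checkpoint uses different configuration token ids
# than the first test model; these generic config fields are the likely candidates.
print(model.config.decoder_start_token_id, model.config.pad_token_id, model.config.eos_token_id)

# Per the warning in the README, output is expected to be gibberish until the
# cross-attention layers are trained.
inputs = tokenizer("The quick brown fox jumps over the lazy dog.", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```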