Prominent use cases of LLMs in text-to-text generation include summarization, text classification, extraction, question answering, and other long-context tasks. All Granite Base models can handle these tasks, as they were trained on a large amount of data from various domains. Moreover, they can serve as a baseline for creating specialized models for specific application scenarios.

**Installation:**

You need to install Transformers from source to use this checkpoint.
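A minimal sketch of a source install, following the Hugging Face installation guide linked below (this assumes `pip` and `git` are available on your system):

```shell
# Install the development version of Transformers directly from GitHub
pip install git+https://github.com/huggingface/transformers.git
```

If you plan to modify the source, cloning the repository and installing it in editable mode (`pip install -e .`) is an alternative.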
<!-- This is a simple example of how to use the Granite-4.0-Tiny-Base-Preview model. -->

<!-- Usage: Install Transformers from source or use Transformers v4.45 to use this checkpoint. -->

<!-- We have a Hugging Face PR which is yet to be merged. -->
Hugging Face PR: https://github.com/huggingface/transformers/pull/37658

Install Transformers from source: https://huggingface.co/docs/transformers/en/installation#install-from-source
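Once Transformers is installed, a minimal usage sketch might look like the following. Note that the model ID `ibm-granite/granite-4.0-tiny-base-preview` and the prompt are illustrative assumptions; check the model card for the exact identifier.

```python
# Minimal text-generation sketch; the model ID below is an assumption,
# so verify it against the model card before running.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-4.0-tiny-base-preview"  # assumed model ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Tokenize a prompt and generate a short continuation
inputs = tokenizer("Summarize: Granite models are trained on data from many domains.",
                   return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```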
<!-- While the native support of this model in Hugging Face Transformers is pending ([PR](https://github.com/huggingface/transformers/pull/37658)), you need to install transformers from the following source to use this model:

```shell
git clone https://github.com/Ssukriti/transformers.git
cd transformers
git checkout granitemoe_hybrid_external_cleanup
pip install -e .
``` -->

<!-- Install the following libraries:

```shell