Update README.md with new model card content
README.md CHANGED
@@ -7,7 +7,7 @@ tags:
 - text-classification
 pipeline_tag: text-classification
 ---
-
+### Model Overview
 DistilBert is a set of language models published by HuggingFace. They are efficient, distilled versions of BERT, and are intended for classification and embedding of text, not for text generation. See the model card below for benchmarks, data sources, and intended use cases.
 
 Weights and Keras model code are released under the [Apache 2 License](https://github.com/keras-team/keras-hub/blob/master/LICENSE).
@@ -148,4 +148,4 @@ classifier = keras_hub.models.DistilBertClassifier.from_preset(
     preprocessor=None,
 )
 classifier.fit(x=features, y=labels, batch_size=2)
-```
+```
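The second hunk touches only the tail of the model card's fine-tuning example, so the diff never shows the inputs that `classifier.fit` consumes. Below is a minimal, self-contained sketch of what the surrounding snippet plausibly looks like; the `distil_bert_base_en_uncased` preset name, the `num_classes` value, and the hand-built `features` dict are illustrative assumptions, and pre-tokenized inputs are passed because `preprocessor=None` disables the bundled tokenization.

```python
import numpy as np
import keras_hub

# Pre-tokenized inputs: with preprocessor=None the classifier expects
# token ids and a padding mask rather than raw strings. Shapes and
# values here are placeholders, not real tokenizer output.
features = {
    "token_ids": np.ones(shape=(2, 12), dtype="int32"),
    "padding_mask": np.array([[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 0, 0]] * 2),
}
labels = [0, 3]  # one integer class label per example

# Load a DistilBERT classifier from a preset (preset name assumed),
# skipping the built-in preprocessor since inputs are already tokenized.
classifier = keras_hub.models.DistilBertClassifier.from_preset(
    "distil_bert_base_en_uncased",
    num_classes=4,
    preprocessor=None,
)
classifier.fit(x=features, y=labels, batch_size=2)
```

If the preprocessor argument is instead left at its default, the same task model accepts plain Python strings and handles tokenization and padding internally.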