Xenova (HF staff) committed 67c4177 (verified · parent: c0afc74)

Update README.md

Files changed (1): README.md (+30 −0)
README.md CHANGED
@@ -5,4 +5,34 @@ base_model: google/gemma-3-1b-it

This repository hosts https://huggingface.co/google/gemma-3-1b-it with ONNX weights, making it compatible with Transformers.js.

### Transformers.js

If you haven't already, you can install the [Transformers.js](https://huggingface.co/docs/transformers.js) JavaScript library from [NPM](https://www.npmjs.com/package/@huggingface/transformers) using:

```bash
npm i @huggingface/transformers
```

You can then use the model like this:

```js
import { pipeline } from "@huggingface/transformers";

// Create a text generation pipeline
const generator = await pipeline(
  "text-generation",
  "onnx-community/gemma-3-1b-it-ONNX",
  { dtype: "q4" },
);

// Define the list of messages
const messages = [
  { role: "system", content: "You are a helpful assistant." },
  { role: "user", content: "Write me a poem about Machine Learning." },
];

// Generate a response
const output = await generator(messages, { max_new_tokens: 512, do_sample: false });
console.log(output[0].generated_text.at(-1).content);
```
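
To stream the response token by token instead of waiting for the full output, you can pass a `TextStreamer` in the generation options. A minimal sketch, reusing the `generator` and `messages` defined above:

```js
import { TextStreamer } from "@huggingface/transformers";

// Print tokens to stdout as they are generated
const streamer = new TextStreamer(generator.tokenizer, {
  skip_prompt: true, // don't echo the input messages
});
await generator(messages, { max_new_tokens: 512, do_sample: false, streamer });
```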
Note: Having a separate repo for ONNX weights is intended to be a temporary solution until WebML gains more traction. If you would like to make your models web-ready, we recommend converting to ONNX using [🤗 Optimum](https://huggingface.co/docs/optimum/index) and structuring your repo like this one (with ONNX weights located in a subfolder named `onnx`).
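
For context, the Optimum conversion mentioned above is typically a single CLI call. A rough sketch (the exact flags depend on your Optimum version and on whether it supports this model's architecture; the `onnx/` output folder name simply mirrors the layout this note recommends):

```bash
# Install the ONNX exporter extras, then export the model
pip install "optimum[exporters]"
optimum-cli export onnx --model google/gemma-3-1b-it onnx/
```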