tuandunghcmut committed commit c76faaf · verified · 1 Parent(s): a4a2693

Update README.md

Files changed (1): README.md +2 -4
README.md CHANGED
@@ -9,8 +9,6 @@ tags:
 - trl
 - sft
 licence: license
-datasets:
-- ura-hcmut/Vietnamese-Customer-Support-QA
 ---
 
 # Model Card for Qwen3-FT-MyDataset
@@ -18,7 +16,7 @@ datasets:
 This model is a fine-tuned version of [Qwen/Qwen3-0.6B](https://huggingface.co/Qwen/Qwen3-0.6B).
 It has been trained using [TRL](https://github.com/huggingface/trl).
 
-## Quick start
+<!-- ## Quick start
 
 ```python
 from transformers import pipeline
@@ -27,7 +25,7 @@ question = "If you had a time machine, but could only go to the past or the futu
 generator = pipeline("text-generation", model="tuandunghcmut/Qwen3-FT-MyDataset", device="cuda")
 output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
 print(output["generated_text"])
-```
+``` -->
 
 ## Training procedure
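The quick-start snippet this commit comments out still runs against the published checkpoint. Below is a minimal self-contained sketch, not the README's exact code: the `pick_device` helper and the CPU fallback are additions so it runs without a GPU, and the question string is truncated in the diff, so a placeholder prompt stands in for it.

```python
def pick_device() -> str:
    # Prefer a GPU when torch is installed and one is visible; otherwise use the CPU.
    try:
        import torch
        return "cuda" if torch.cuda.is_available() else "cpu"
    except ImportError:
        return "cpu"


if __name__ == "__main__":
    from transformers import pipeline

    # Placeholder prompt: the original question in the diff is cut off,
    # so any chat-style question works here.
    question = "Which historical era would you most like to visit, and why?"

    generator = pipeline(
        "text-generation",
        model="tuandunghcmut/Qwen3-FT-MyDataset",
        device=pick_device(),
    )
    output = generator(
        [{"role": "user", "content": question}],
        max_new_tokens=128,
        return_full_text=False,
    )[0]
    print(output["generated_text"])
```

The generation call is kept under the `__main__` guard so the file can be imported without downloading the model.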