OPEA
cicdatopea committed · Commit 26a3afc · verified · 1 Parent(s): 0c4f21a

Update README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED
@@ -7,7 +7,7 @@ base_model:
 
 ## Model Details
 
-This awq model is an int4 model with group_size 32 and symmetric quantization of [PowerInfer/SmallThinker-3B-Preview](https://huggingface.co/PowerInfer/SmallThinker-3B-Preview) generated by [intel/auto-round](https://github.com/intel/auto-round).
+This gguf model is an int4 model with group_size 32 and symmetric quantization of [PowerInfer/SmallThinker-3B-Preview](https://huggingface.co/PowerInfer/SmallThinker-3B-Preview) generated by [intel/auto-round](https://github.com/intel/auto-round).
 
 ## How To Use
 ### Requirements
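The change above only corrects the format name ("awq" → "gguf"); the quantization scheme described, int4 with group_size 32 and symmetric quantization, is unchanged. As a rough illustration of what those parameters mean (a minimal NumPy sketch of group-wise symmetric int4 quantization, not intel/auto-round's actual implementation — the function names here are hypothetical):

```python
import numpy as np

def quantize_symmetric_int4(weights, group_size=32):
    """Illustrative group-wise symmetric int4 quantization.

    Each contiguous group of `group_size` weights shares one scale.
    "Symmetric" means the zero-point is fixed at 0, so real values map
    linearly onto the signed int4 range [-8, 7].
    """
    w = weights.reshape(-1, group_size)
    # One scale per group: the largest magnitude maps to the int4 extreme 7.
    scales = np.abs(w).max(axis=1, keepdims=True) / 7.0
    scales = np.where(scales == 0, 1.0, scales)  # avoid division by zero
    q = np.clip(np.round(w / scales), -8, 7).astype(np.int8)
    return q, scales

def dequantize(q, scales):
    """Recover approximate float weights from int4 codes and scales."""
    return (q.astype(np.float32) * scales).reshape(-1)

np.random.seed(0)
w = np.random.randn(64).astype(np.float32)
q, s = quantize_symmetric_int4(w)       # q: int codes, s: one scale per group of 32
w_hat = dequantize(q, s)                # reconstruction error is at most scale/2 per weight
```

Because the zero-point is fixed at 0, only one scale per 32-weight group needs to be stored alongside the 4-bit codes, which is what keeps the per-weight overhead of this scheme small.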