klei1 committed on
Commit 480d47e · verified · 1 Parent(s): babb460

Update README.md

Files changed (1)
  1. README.md +1 -7
README.md CHANGED
@@ -94,13 +94,7 @@ This dataset was adapted for Albanian language training to ensure the model can
 
 ## Limitations
 
- The current model is an 8-bit quantized version of the 27B parameter model. This quantization offers advantages in terms of size and speed, but comes with some limitations:
-
- - Reduced precision compared to the original 16-bit or 32-bit model
- - May exhibit occasional numerical instabilities in complex reasoning chains
- - While optimized for logical reasoning in Albanian, complex or ambiguous problems may produce inconsistent results
- - As with all language models, it may occasionally hallucinate or provide incorrect information
- - Performance may vary depending on the complexity and clarity of the input prompts
+ The current model is an 8-bit quantized version of the 27B parameter model. It can run on much lower specifications, but at the cost of some performance.
 
 ## Acknowledgments
 - Google for developing the Gemma 3 architecture
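The added line above notes that the 8-bit quantized 27B checkpoint can run on much lower-spec hardware. As a rough illustration, here is a minimal sketch of how such a checkpoint is commonly loaded in 8-bit with the `transformers` + `bitsandbytes` stack; the repository id and the prompt below are placeholders for illustration, not values taken from this repo.

```python
# Minimal sketch: loading a Gemma-style causal LM with 8-bit weights via
# transformers + bitsandbytes. The repo id below is a hypothetical placeholder.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "your-org/gemma-3-27b-albanian"  # placeholder, not the actual repo path

# Request 8-bit weights; this is what lets a 27B model fit on much smaller GPUs,
# at the cost of some precision/performance.
quant_config = BitsAndBytesConfig(load_in_8bit=True)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=quant_config,
    device_map="auto",  # spread layers across available devices
)

prompt = "Sa bën 2 + 2?"  # example Albanian prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```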