---
license: apache-2.0
base_model:
- shisa-ai/shisa-v2-qwen2.5-7b
language:
- ja
- en
---
Quantized w/ [llm-compressor](https://github.com/vllm-project/llm-compressor)
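
The card does not state which quantization recipe was applied. As a rough illustration of how a checkpoint like this can be produced with llm-compressor, here is a minimal sketch assuming an FP8 dynamic scheme and a hypothetical output directory; adjust the `scheme` and paths to match the actual recipe.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
from llmcompressor.modifiers.quantization import QuantizationModifier
from llmcompressor.transformers import oneshot

MODEL_ID = "shisa-ai/shisa-v2-qwen2.5-7b"
SAVE_DIR = "shisa-v2-qwen2.5-7b-quantized"  # hypothetical output path

model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# Quantize all Linear layers, keeping lm_head in full precision
# (assumed recipe -- the card does not say which scheme was used).
recipe = QuantizationModifier(targets="Linear", scheme="FP8_DYNAMIC", ignore=["lm_head"])

# Apply the recipe in one shot (FP8_DYNAMIC needs no calibration data)
# and save a compressed checkpoint that vLLM can load directly.
oneshot(model=model, recipe=recipe)
model.save_pretrained(SAVE_DIR)
tokenizer.save_pretrained(SAVE_DIR)
```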