VRAM USAGE (#6)
opened by AhmadDadash
Can I use this model on an RTX 4090 with 24 GB of VRAM?
You could give it a try. The model size is ~16 GB, so it should fit in 24 GB of VRAM with some headroom left for activations and the KV cache.
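If you want to verify the fit on your own machine, one quick way is to compare free VRAM before and after loading. This is just a minimal sketch, not from the model card; it assumes you load the model with the model card's snippet between the two calls:

```python
import torch

def vram_report(tag: str) -> None:
    # mem_get_info returns (free_bytes, total_bytes) for the current CUDA device
    free, total = torch.cuda.mem_get_info()
    used = (total - free) / 1024**3
    print(f"{tag}: {used:.1f} GiB used of {total / 1024**3:.1f} GiB")

vram_report("before load")
# ... load the model here using the code from the model card ...
vram_report("after load")
```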
I downloaded this model into a local folder, but when I run the code I get this error:
ValueError: Cannot use apply_chat_template because this processor does not have a chat template.
I could not reproduce this issue.
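One way to narrow it down is to check whether your local copy of the processor actually carries a chat template; if the attribute is None, the template likely did not make it into your local folder. A hedged sketch (the local path is a placeholder for wherever you downloaded the model):

```python
from transformers import AutoProcessor

# Placeholder path; replace with your actual local download folder
local_path = "./Mistral-Small-3.1-24B-Instruct-2503-int4-AutoRound-awq-sym"

processor = AutoProcessor.from_pretrained(local_path)
# If this prints None, apply_chat_template cannot work on this processor
print(processor.chat_template)
```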
To run locally, the only required change is calling load_system_prompt with the Hub repo id, as shown below (or, alternatively, changing the load_system_prompt code so it reads the file from your local folder):
SYSTEM_PROMPT = load_system_prompt("OPEA/Mistral-Small-3.1-24B-Instruct-2503-int4-AutoRound-awq-sym", "SYSTEM_PROMPT.txt")
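In case it helps, here is a minimal sketch of what a load_system_prompt along these lines could look like; the actual function in the model card may differ, so treat this as illustrative only:

```python
from huggingface_hub import hf_hub_download

def load_system_prompt(repo_id: str, filename: str) -> str:
    # Resolves the file from the Hub (or the local cache if already downloaded)
    file_path = hf_hub_download(repo_id=repo_id, filename=filename)
    with open(file_path, "r", encoding="utf-8") as f:
        return f.read()

SYSTEM_PROMPT = load_system_prompt(
    "OPEA/Mistral-Small-3.1-24B-Instruct-2503-int4-AutoRound-awq-sym",
    "SYSTEM_PROMPT.txt",
)
```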