Facing an OS error whenever I try to load the model
#2 by sumukhsankarshana - opened
OSError: zamal/Molmo-7B-GPTQ-4bit does not appear to have a file named modeling_molmo.py. Checkout 'https://huggingface.co/zamal/Molmo-7B-GPTQ-4bit/tree/main' for available files.
Any idea what I have to do? The steps for loading the model and how to use it are unclear, and I get similar errors when I substitute this model's name into the original code provided by Molmo:
AttributeError: Model MolmoForCausalLM does not support BitsAndBytes quantization yet.
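For reference, this is roughly the load call I am running (a sketch based on the standard Molmo usage; the repo id is the only thing I substituted). The OSError seems to happen because `trust_remote_code=True` makes transformers look for the custom `modeling_molmo.py` inside the quantized repo, and that file is not there:

```python
# Sketch of the load attempt, assuming the standard Molmo loading pattern.
# With repo_id="zamal/Molmo-7B-GPTQ-4bit" this raises the OSError above,
# because that repo does not ship the custom modeling_molmo.py file.

def load_molmo(repo_id="zamal/Molmo-7B-GPTQ-4bit"):
    """Load a Molmo checkpoint; needs transformers installed and enough GPU memory."""
    from transformers import AutoModelForCausalLM, AutoProcessor

    processor = AutoProcessor.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        trust_remote_code=True,  # fetches the repo's custom modeling code
        device_map="auto",
    )
    # Passing quantization_config=BitsAndBytesConfig(load_in_4bit=True) here
    # instead raises the AttributeError above: MolmoForCausalLM does not
    # support BitsAndBytes quantization.
    return processor, model

# On a machine with the weights and a GPU:
# processor, model = load_molmo()
```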