can't convert to GGUF format

#3 opened by jeffzhou2000

python convert_hf_to_gguf.py ~/LLM/MiMo-7B-RL-0530

INFO:hf-to-gguf:Loading model: MiMo-7B-RL-0530
WARNING:hf-to-gguf:Failed to load model config from ~/LLM/MiMo-7B-RL-0530: Loading ~/LLM/MiMo-7B-RL-0530 requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option trust_remote_code=True to remove this error.
WARNING:hf-to-gguf:Trying to load config.json instead
INFO:hf-to-gguf:Model architecture: MiMoForCausalLM
ERROR:hf-to-gguf:Model MiMoForCausalLM is not supported
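
For context on the first warning: the converter tries to read the model config through the transformers library, and a repo that ships custom configuration code can only be loaded with trust_remote_code=True. A minimal sketch of reading the config that way (assuming a local transformers install; the path is the same local checkout used in the command above):

```python
import os
from transformers import AutoConfig

# Load the MiMo config, allowing the repo's custom configuration code to run.
# This only clears the warning above; convert_hf_to_gguf.py still fails later
# because it has no conversion class registered for "MiMoForCausalLM".
path = os.path.expanduser("~/LLM/MiMo-7B-RL-0530")
config = AutoConfig.from_pretrained(path, trust_remote_code=True)
print(config.architectures)  # expected: ['MiMoForCausalLM'], per the log above
```

Even with the config loaded, the fatal error remains: convert_hf_to_gguf.py has no GGUF mapping for the MiMoForCausalLM architecture string, so conversion will keep failing until the converter gains support for that architecture.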
