llama4 not supported. Which version is required?

#1
by Weiguo

/miniconda3/lib/python3.12/site-packages/mlx_vlm/utils.py", line 68, in get_model_and_args
raise ValueError(msg)
ValueError: Model type llama4 not supported.

The same error also shows up via mlx_lm:

ERROR:root:Model type llama4 not supported.
Traceback (most recent call last):
File "/opt/miniconda3/envs/mlx/lib/python3.12/site-packages/mlx_lm/utils.py", line 71, in _get_classes
arch = importlib.import_module(f"mlx_lm.models.{model_type}")
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
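The traceback shows the loader trying to import `mlx_lm.models.llama4`, so the error usually means the installed mlx-lm / mlx-vlm release simply does not ship a `llama4` model module yet; upgrading both packages to a release that includes Llama 4 support is the usual fix. As a quick check, the sketch below (assuming the PyPI package names `mlx-lm` and `mlx-vlm`, and using only the standard library) prints the installed versions and whether the `llama4` module is present in your environment:

```python
# Minimal diagnostic sketch: report installed versions of mlx-lm / mlx-vlm
# and whether the mlx_lm.models.llama4 module the loader looks for exists.
import importlib.util
from importlib.metadata import PackageNotFoundError, version

for pkg in ("mlx-lm", "mlx-vlm"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")

# The loader in the traceback does importlib.import_module(f"mlx_lm.models.{model_type}"),
# so llama4 can only work if this module spec is found.
try:
    has_llama4 = importlib.util.find_spec("mlx_lm.models.llama4") is not None
except ModuleNotFoundError:
    has_llama4 = False
print("mlx_lm.models.llama4 present:", has_llama4)
```

If the module is missing, upgrading (for example with `pip install -U mlx-lm mlx-vlm`) and re-running the check should show whether the new release adds it.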
