demo.py downloads a model despite the --ckpt path being passed
#4
opened by CHNtentes
This is my command:
v2x@v2x-OMEN-Desktop:~/ltg/Lumina-T2X/lumina_next_t2i$ python -u demo.py --ckpt ../lumina_next_sft/ --ema
I have already downloaded the model from this repo into the ../lumina_next_sft directory using huggingface-cli.
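Roughly like this (repo id written from memory; adjust to whichever checkpoint repo you actually used):
huggingface-cli download Alpha-VLLM/Lumina-Next-SFT --local-dir ../lumina_next_sft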
However, it prints:
/home/v2x/miniconda3/envs/executorch/lib/python3.12/site-packages/diffusers/models/transformers/transformer_2d.py:34: FutureWarning: `Transformer2DModelOutput` is deprecated and will be removed in version 1.0.0. Importing `Transformer2DModelOutput` from `diffusers.models.transformer_2d` is deprecated and this will be removed in a future version. Please use `from diffusers.models.modeling_outputs import Transformer2DModelOutput`, instead.
deprecate("Transformer2DModelOutput", "1.0.0", deprecation_message)
> initializing model parallel with size 1
> initializing ddp with size 1
> initializing pipeline with size 1
Loaded model arguments: {
"model": "NextDiT_2B_GQA_patch2",
"image_size": 1024,
"vae": "sdxl",
"precision": "bf16",
"grad_precision": "fp32",
"grad_clip": 2.0,
"wd": 0.0,
"qk_norm": true,
"model_parallel_size": 1
}
Creating lm: Gemma-2B
/home/v2x/miniconda3/envs/executorch/lib/python3.12/site-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
warnings.warn(
Downloading shards: 0%| | 0/2 [00:00<?, ?it/s]
model-00001-of-00002.safetensors: 0%| | 0.00/4.95G [00:00<?, ?B/s]
What model is being downloaded here?
Hi @CHNtentes, you need to download the Gemma-2B model with huggingface-cli in advance.
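For example, assuming the text encoder is google/gemma-2b on the Hub (a gated repo, so accept its license on the model page and log in with a token first):
huggingface-cli login
huggingface-cli download google/gemma-2b
This populates the local Hugging Face cache, so demo.py should find the weights there instead of downloading the shards itself.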
PommesPeter changed discussion status to closed